On the relationship between online Gaussian process regression and kernel least mean squares algorithms

Steven Van Vaerenbergh, Jesus Fernandez-Bes, Victor Elvira

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We study the relationship between online Gaussian process (GP) regression and kernel least mean squares (KLMS) algorithms. While the latter lack the capacity to store the entire posterior distribution during online learning, we show that their operation corresponds to assuming a fixed posterior covariance that follows a simple parametric model. Interestingly, several well-known KLMS algorithms correspond to specific cases of this model. The probabilistic perspective allows us to understand how each of them handles uncertainty, which could explain some of their performance differences.
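To make the KLMS side of this comparison concrete, the following is a minimal sketch of the standard KLMS update (Liu, Pokharel and Príncipe's algorithm, not the specific parametric models studied in this paper). The Gaussian kernel, the step size `eta`, and the kernel width `sigma` are illustrative assumptions; note that the algorithm propagates only a point prediction, not a posterior covariance, which is the gap the paper's probabilistic perspective addresses.

```python
import numpy as np

def gauss_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel; sigma is an illustrative choice."""
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def klms_fit(X, y, eta=0.5, sigma=1.0):
    """Run KLMS over a stream of (x_t, y_t) pairs.

    Returns the dictionary of stored inputs and their expansion weights.
    Only a point estimate is maintained -- no posterior covariance.
    """
    centers, alphas = [], []
    for x_t, y_t in zip(X, y):
        # Predict with the current kernel expansion over stored centers.
        f_t = sum(a * gauss_kernel(c, x_t, sigma)
                  for c, a in zip(centers, alphas))
        # LMS-style update: store the input, weighted by the scaled error.
        centers.append(x_t)
        alphas.append(eta * (y_t - f_t))
    return centers, alphas

def klms_predict(centers, alphas, x, sigma=1.0):
    """Evaluate the learned kernel expansion at a new input x."""
    return sum(a * gauss_kernel(c, x, sigma)
               for c, a in zip(centers, alphas))
```

For instance, fitting the stream to noisy-free samples of sin(x) and querying a held-out point gives a prediction close to the true function value, while an online GP would additionally return a predictive variance at that point.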

Original language: English
Title of host publication: 2016 IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2016 - Proceedings
Editors: Kostas Diamantaras, Aurelio Uncini, Francesco A. N. Palmieri, Jan Larsen
Publisher: IEEE Computer Society
Volume: 2016-November
ISBN (Electronic): 9781509007462
ISBN (Print): 978-1-5090-0747-9
Publication status: Published - 8 Nov 2016
Event: 26th IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2016 - Vietri sul Mare, Salerno, Italy
Duration: 13 Sep 2016 – 16 Sep 2016

Conference

Conference: 26th IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2016
Country/Territory: Italy
City: Vietri sul Mare, Salerno
Period: 13/09/16 – 16/09/16

Keywords

  • Gaussian processes
  • kernel least-mean squares
  • online learning
  • regression
