Abstract
We study the relationship between online Gaussian process (GP) regression and kernel least mean squares (KLMS) algorithms. While the latter lack the capacity to store the entire posterior distribution during online learning, we show that their operation corresponds to assuming a fixed posterior covariance that follows a simple parametric model. Interestingly, several well-known KLMS algorithms correspond to specific cases of this model. The probabilistic perspective allows us to understand how each of them handles uncertainty, which may explain some of their performance differences.
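To make the online setting concrete, here is a minimal sketch of a plain KLMS learner of the kind the abstract refers to. The Gaussian kernel, the step size `eta`, and all names are illustrative assumptions, not taken from the paper, and the sketch omits the sparsification used in practical variants.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Assumed RBF kernel; sigma is a hypothetical bandwidth choice.
    return np.exp(-np.linalg.norm(np.asarray(x) - np.asarray(y)) ** 2
                  / (2.0 * sigma ** 2))

class KLMS:
    """Minimal kernel least-mean-squares sketch (illustrative only)."""

    def __init__(self, eta=0.5, sigma=1.0):
        self.eta = eta        # step size of the stochastic gradient update
        self.sigma = sigma
        self.centers = []     # stored input samples ("dictionary")
        self.alphas = []      # one expansion coefficient per stored sample

    def predict(self, x):
        # Kernel expansion over all stored samples.
        return sum(a * gaussian_kernel(c, x, self.sigma)
                   for a, c in zip(self.alphas, self.centers))

    def update(self, x, y):
        # Online step: the prediction error, scaled by eta,
        # becomes the coefficient of the newly stored sample.
        e = y - self.predict(x)
        self.centers.append(x)
        self.alphas.append(self.eta * e)
        return e
```

Note that the learner keeps only a point estimate (the kernel expansion), not a posterior covariance; under the paper's reading, this corresponds to an online GP with a fixed, parametrically modeled posterior covariance.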
Original language | English |
---|---|
Title of host publication | 2016 IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2016 - Proceedings |
Editors | Kostas Diamantaras, Aurelio Uncini, Francesco A. N. Palmieri, Jan Larsen |
Publisher | IEEE Computer Society |
Volume | 2016-November |
ISBN (Electronic) | 9781509007462 |
ISBN (Print) | 978-1-5090-0747-9 |
Publication status | Published - 8 Nov 2016 |
Event | 26th IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2016 - Vietri sul Mare, Salerno, Italy Duration: 13 Sept 2016 → 16 Sept 2016 |
Conference
Conference | 26th IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2016 |
---|---|
Country/Territory | Italy |
City | Vietri sul Mare, Salerno |
Period | 13/09/16 → 16/09/16 |
Keywords
- Gaussian processes
- kernel least-mean squares
- online learning
- regression