Fast Forward Selection to Speed Up Sparse Gaussian Process Regression

Matthias Seeger, Christopher K. I. Williams, Neil Lawrence

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We present a method for the sparse greedy approximation of Bayesian Gaussian process regression, featuring a novel heuristic for very fast forward selection. Our method is essentially as fast as an equivalent one which selects the "support" patterns at random, yet it can outperform random selection on hard curve fitting tasks. More importantly, it leads to a sufficiently stable approximation of the log marginal likelihood of the training data, which can be optimised to adjust a large number of hyperparameters automatically. We demonstrate the model selection capabilities of the algorithm in a range of experiments. In line with the development of our method, we present a simple view on sparse approximations for GP models and their underlying assumptions and show relations to other methods.
Original language: English
Title of host publication: Proceedings of the Ninth International Workshop on Artificial Intelligence and Statistics
Number of pages: 8
Publication status: Published - 2003
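
To make the idea in the abstract concrete, the sketch below shows greedy forward selection of "support" patterns for sparse GP regression under a standard DTC/subset-of-regressors approximation. It is an illustrative sketch only, not the authors' algorithm: the selection score used here (reduction in training squared error over a small random candidate pool, with an assumed RBF kernel and the hypothetical helpers `dtc_train_mse` and `greedy_forward_selection`) stands in for the paper's much cheaper information-gain heuristic, and no marginal-likelihood optimisation of hyperparameters is performed.

```python
# Illustrative sketch (not the paper's code): greedy forward selection of an
# active set for sparse GP regression under the DTC/SoR approximation.
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def dtc_train_mse(X, y, active, noise, **kern):
    """Training MSE of the DTC/SoR predictor built from the active set."""
    Kmm = rbf_kernel(X[active], X[active], **kern)
    Knm = rbf_kernel(X, X[active], **kern)
    # SoR/DTC weights: (sigma^2 K_mm + K_mn K_nm)^{-1} K_mn y
    A = noise ** 2 * Kmm + Knm.T @ Knm
    w = np.linalg.solve(A + 1e-8 * np.eye(len(active)), Knm.T @ y)
    return float(np.mean((Knm @ w - y) ** 2))

def greedy_forward_selection(X, y, m, noise=0.1, n_candidates=30, seed=0, **kern):
    """Grow an active set of size m, greedily adding the candidate (from a
    small random pool) that most reduces training error."""
    rng = np.random.default_rng(seed)
    active, remaining = [], list(range(len(X)))
    while len(active) < m:
        pool = rng.choice(remaining, size=min(n_candidates, len(remaining)),
                          replace=False)
        scores = [dtc_train_mse(X, y, active + [int(j)], noise, **kern)
                  for j in pool]
        best = int(pool[int(np.argmin(scores))])
        active.append(best)
        remaining.remove(best)
    return active

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(200)
    active = greedy_forward_selection(X, y, m=15, lengthscale=0.7)
    print("selected support patterns:", active)
    print("training MSE:", dtc_train_mse(X, y, active, 0.1, lengthscale=0.7))
```

The random candidate pool keeps each selection step cheap; the paper's contribution is a far faster heuristic score, which makes the greedy method nearly as cheap as picking support patterns at random while remaining stable enough to optimise the approximate log marginal likelihood over many hyperparameters.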
