Gaussian Processes and Fast Matrix-Vector Multiplies

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

Gaussian processes (GPs) provide a flexible framework for probabilistic regression. The necessary computations involve standard matrix operations. There have been several attempts to accelerate these operations based on fast kernel matrix-vector multiplications. By focussing on the simplest GP computation, corresponding to test-time predictions in kernel ridge regression, we conclude that simple approximations based on clusterings in a kd-tree can never work well for simple regression problems. Analytical expansions can provide speedups, but current implementations are limited to the squared-exponential kernel and low-dimensional problems. We discuss future directions.
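The "simplest GP computation" referred to above, test-time prediction in kernel ridge regression, reduces to a single kernel matrix-vector multiply with precomputed weights; this is the operation that fast-MVM schemes aim to accelerate. A minimal NumPy sketch of the exact (dense) computation, with an illustrative squared-exponential kernel and toy data that are not taken from the paper:

```python
import numpy as np

def se_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel matrix between row-sets A and B.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * sq / lengthscale**2)

# Toy 1-D regression problem (sizes and noise level are illustrative).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)

noise_var = 0.1**2
K = se_kernel(X, X)
# Training-time weights: alpha = (K + sigma^2 I)^{-1} y.
alpha = np.linalg.solve(K + noise_var * np.eye(len(X)), y)

# Test-time prediction is one kernel matrix-vector multiply:
# mean(x*) = k(x*, X) @ alpha.  Fast-MVM methods approximate this product.
Xstar = np.linspace(-3, 3, 100)[:, None]
mean = se_kernel(Xstar, X) @ alpha
```

The dense multiply costs O(N) per test point; tree-based and expansion-based approximations trade exactness for sublinear cost, which is the trade-off the paper examines.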
Original language: English
Title of host publication: Numerical Mathematics in Machine Learning workshop at the 26th International Conference on Machine Learning (ICML 2009)
Number of pages: 4
Publication status: Published - 2009
Event: Numerical Mathematics in Machine Learning workshop at the 26th International Conference on Machine Learning (ICML 2009) - Montreal, Canada
Duration: 18 Jun 2009 → …
