Abstract
Gaussian processes (GPs) provide a flexible framework for probabilistic regression. The necessary computations involve standard matrix operations. There have been several attempts to accelerate these operations based on fast kernel matrix-vector multiplications. By focussing on the simplest GP computation, corresponding to test-time predictions in kernel ridge regression, we conclude that simple approximations based on clusterings in a kd-tree can never work well for simple regression problems. Analytical expansions can provide speedups, but current implementations are limited to the squared-exponential kernel and low-dimensional problems. We discuss future directions.
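The "simplest GP computation" the abstract refers to, test-time prediction in kernel ridge regression with a squared-exponential kernel, can be sketched in a few lines. This is a hedged illustration, not code from the paper; the function name `se_kernel`, the data, and the regularisation value `lam` are assumptions made for the example:

```python
import numpy as np

def se_kernel(A, B, lengthscale=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq_dists / lengthscale**2)

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2))                 # training inputs
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
lam = 1e-2                                       # ridge / noise term (assumed value)

# Training: solve (K + lam*I) alpha = y once, at O(n^3) cost.
K = se_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

# Test time: each prediction is one kernel matrix-vector multiply,
# f(x*) = k(x*, X) @ alpha -- the operation that fast-MVM methods
# (kd-tree clusterings, analytical expansions) try to accelerate.
X_star = rng.standard_normal((5, 2))             # test inputs
f_star = se_kernel(X_star, X) @ alpha
```

The test-time step is the bottleneck the abstract discusses: evaluated exactly it costs O(nm) kernel evaluations for m test points, and the paper's conclusion concerns when approximate matrix-vector multiplies can or cannot reduce this cost without losing accuracy.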
| Original language | English |
| --- | --- |
| Title of host publication | Numerical Mathematics in Machine Learning workshop at the 26th International Conference on Machine Learning (ICML 2009) |
| Number of pages | 4 |
| Publication status | Published - 2009 |
| Event | Numerical Mathematics in Machine Learning workshop at the 26th International Conference on Machine Learning (ICML 2009) - Montreal, Canada. Duration: 18 Jun 2009 → … |
Workshop
| Workshop | Numerical Mathematics in Machine Learning workshop at the 26th International Conference on Machine Learning (ICML 2009) |
| --- | --- |
| Country/Territory | Canada |
| City | Montreal |
| Period | 18/06/09 → … |