Using the Nyström Method to Speed Up Kernel Machines

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


A major problem for kernel-based predictors (such as Support Vector Machines and Gaussian processes) is that the amount of computation required to find the solution scales as O(n³), where n is the number of training examples. We show that an approximation to the eigendecomposition of the Gram matrix can be computed by the Nyström method (which is used for the numerical solution of eigenproblems). This is achieved by carrying out an eigendecomposition on a smaller system of size m < n, and then expanding the results back up to n dimensions. The computational complexity of a predictor using this approximation is O(m²n). We report experiments on the USPS and abalone data sets and show that we can set m ≪ n without any significant decrease in the accuracy of the solution.
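The procedure the abstract describes (eigendecompose an m × m subsampled Gram matrix, then extend the eigenvectors back to n dimensions) can be sketched as follows. This is a minimal illustration, not the authors' code; the RBF kernel, the uniform random choice of the m landmark points, and all function names are assumptions for the example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between rows of X and rows of Y (assumed kernel choice)."""
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def nystrom_eig(X, m, gamma=1.0, seed=0):
    """Nystrom approximation to the eigendecomposition of the n x n Gram matrix.

    Eigendecomposes only an m x m subproblem (O(m^3)) and extends the
    eigenvectors to n dimensions via the n x m cross-kernel block, so the
    overall cost is O(m^2 n) rather than O(n^3).
    """
    n = X.shape[0]
    rng = np.random.default_rng(seed)
    idx = rng.choice(n, size=m, replace=False)   # m landmark points, sampled uniformly
    C = rbf_kernel(X, X[idx], gamma)             # n x m block K[:, idx]
    W = C[idx, :]                                # m x m subsampled Gram matrix
    lam, U = np.linalg.eigh(W)                   # small eigenproblem, O(m^3)
    lam, U = lam[::-1], U[:, ::-1]               # sort eigenvalues descending
    keep = lam > 1e-10 * lam[0]                  # drop near-zero eigenvalues for stability
    lam, U = lam[keep], U[:, keep]
    # Nystrom extension: rescale eigenvalues and expand eigenvectors to n dims
    lam_tilde = (n / m) * lam
    U_tilde = np.sqrt(m / n) * (C @ U) / lam     # n x k approximate eigenvectors
    return lam_tilde, U_tilde

# Usage: reconstruct an approximate Gram matrix K ~ U diag(lam) U^T
X = np.random.default_rng(1).normal(size=(500, 3))
lam_t, U_t = nystrom_eig(X, m=50)
K_approx = (U_t * lam_t) @ U_t.T
K_full = rbf_kernel(X, X)
```

With a smooth kernel the spectrum of the Gram matrix decays rapidly, which is why a small m (here 50 of 500 points) can reproduce the full matrix closely, mirroring the paper's observation that m ≪ n loses little accuracy.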
Original language: English
Title of host publication: Advances in Neural Information Processing Systems 13 (NIPS 2000)
Editors: T.K. Leen, T.G. Dietterich, V. Tresp
Publisher: MIT Press
Number of pages: 7
Publication status: Published - 2001

