Accelerated Stochastic Matrix Inversion: General Theory and Speeding up BFGS Rules for Faster Second-Order Optimization

Robert M. Gower, Filip Hanzely, Peter Richtárik, Sebastian Stich

Research output: Contribution to conference › Poster › peer-review

Abstract

We present the first accelerated randomized algorithm for solving linear systems in Euclidean spaces. One essential problem of this type is the matrix inversion problem. In particular, our algorithm can be specialized to invert positive definite matrices in such a way that all iterates (approximate solutions) generated by the algorithm are positive definite matrices themselves. This opens the way for many applications in the fields of optimization and machine learning. As an application of our general theory, we develop the first accelerated (deterministic and stochastic) quasi-Newton updates. Our updates yield provably more aggressive approximations of the inverse Hessian and deliver speed-ups over classical non-accelerated rules in numerical experiments. Experiments with empirical risk minimization show that our rules can accelerate the training of machine learning models.
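The accelerated updates themselves are not reproduced in this record, but the classical (non-accelerated) BFGS rule they build on can be viewed as a randomized matrix-inversion routine, as the abstract describes: each update keeps the inverse-Hessian estimate positive definite while driving it toward the true inverse. The following NumPy sketch illustrates this baseline; the matrix A, the Gaussian sampling of directions, and the iteration count are illustrative choices, not the authors' experimental setup:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
# A random symmetric positive definite matrix to invert.
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)

H = np.eye(n)  # current estimate of A^{-1}; positive definite
for _ in range(200):
    s = rng.standard_normal(n)  # random direction
    y = A @ s                   # exact curvature pair for a quadratic: y = A s
    rho = 1.0 / (y @ s)         # positive, since A is positive definite
    V = np.eye(n) - rho * np.outer(s, y)
    # Classical BFGS inverse update; preserves symmetry and positive
    # definiteness, and enforces the secant condition H y = s.
    H = V @ H @ V.T + rho * np.outer(s, s)

err = np.linalg.norm(H - np.linalg.inv(A))
```

With random directions the estimate H converges linearly to A^{-1} in expectation; the paper's contribution is an accelerated variant of precisely this kind of iteration.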
Original language: English
Number of pages: 11
Publication status: Published - 12 Feb 2018
Event: Thirty-second Conference on Neural Information Processing Systems - Montreal, Canada
Duration: 3 Dec 2018 – 8 Dec 2018
https://nips.cc/

Conference

Conference: Thirty-second Conference on Neural Information Processing Systems
Abbreviated title: NIPS 2018
Country/Territory: Canada
City: Montreal
Period: 3/12/18 – 8/12/18
Internet address: https://nips.cc/

Keywords

  • math.OC
  • cs.NA
