Explicit Stabilised Gradient Descent for Faster Strongly Convex Optimisation

Armin Eftekhari, Bart Vandereycken, Gilles Vilmart, Konstantinos C Zygalakis

Research output: Contribution to journal › Article › peer-review

Abstract / Description of output

This paper introduces the Runge-Kutta Chebyshev descent method (RKCD) for strongly convex optimisation problems. This new algorithm is based on explicit stabilised integrators for stiff differential equations, a powerful class of numerical schemes that avoid the severe step size restriction faced by standard explicit integrators. For optimising quadratic and strongly convex functions, this paper proves that RKCD nearly achieves the optimal convergence rate of the conjugate gradient algorithm, and that the suboptimality of RKCD diminishes as the condition number of the quadratic function worsens. It is also established that this near-optimal rate is attained by a partitioned variant of RKCD applied to perturbations of quadratic functions. In addition, numerical experiments on general strongly convex problems show that RKCD outperforms Nesterov's accelerated gradient descent.
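The abstract only names the building blocks; as a rough, self-contained sketch (not the authors' reference implementation), the snippet below applies a damped Chebyshev recursion, the classical first-order explicit stabilised scheme, to the gradient flow x'(t) = -∇f(x(t)). The function names (`cheb_T`, `chebyshev_gradient_step`) and the choices of stage number `s`, damping `eta`, and step size `h` are illustrative assumptions; the paper derives principled parameter choices from the smoothness and strong convexity constants.

```python
import numpy as np

def cheb_T(j, x):
    """Chebyshev polynomial of the first kind T_j(x), evaluated for x >= 1."""
    return np.cosh(j * np.arccosh(x))

def cheb_T_prime(j, x):
    """Derivative T_j'(x) = j*sinh(j*theta)/sinh(theta), theta = arccosh(x), x > 1."""
    theta = np.arccosh(x)
    return j * np.sinh(j * theta) / np.sinh(theta)

def chebyshev_gradient_step(x0, grad_f, h, s=10, eta=0.05):
    """One damped Chebyshev (explicit stabilised) step of size h for x' = -grad_f(x)."""
    omega0 = 1.0 + eta / s**2
    omega1 = cheb_T(s, omega0) / cheb_T_prime(s, omega0)
    # Internal stages K_0 and K_1 of the three-term Chebyshev recurrence.
    k_prev, k = x0, x0 - h * (omega1 / omega0) * grad_f(x0)
    for j in range(2, s + 1):
        mu = 2.0 * omega0 * cheb_T(j - 1, omega0) / cheb_T(j, omega0)
        mu_t = 2.0 * omega1 * cheb_T(j - 1, omega0) / cheb_T(j, omega0)
        nu = -cheb_T(j - 2, omega0) / cheb_T(j, omega0)
        k, k_prev = mu * k + nu * k_prev - mu_t * h * grad_f(k), k
    return k  # K_s is the next iterate

# Toy usage on an ill-conditioned quadratic f(x) = 0.5 * x^T A x (minimiser x* = 0).
A = np.diag(np.linspace(1e-2, 1.0, 50))
x = np.ones(50)
for _ in range(100):
    x = chebyshev_gradient_step(x, lambda v: A @ v, h=50.0)
print(np.linalg.norm(x))  # distance to the minimiser shrinks
```

The toy quadratic at the end only shows that the recursion runs and contracts towards the minimiser; it does not reproduce the near-conjugate-gradient rates or the comparison with Nesterov's method established in the paper.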
Original language: English
Pages (from-to): 119-139
Journal: BIT Numerical Mathematics
Volume: 61
Early online date: 4 Jul 2020
DOIs
Publication status: Published - 31 Mar 2021
