Nonasymptotic analysis of Stochastic Gradient Hamiltonian Monte Carlo under local conditions for nonconvex optimization

Ömer Deniz Akyildiz, Sotirios Sabanis

Research output: Contribution to journal › Article › peer-review

Abstract

We provide a nonasymptotic analysis of the convergence of stochastic gradient Hamiltonian Monte Carlo (SGHMC) to a target measure in Wasserstein-2 distance without assuming log-concavity. By making the dimension dependence explicit, we provide a uniform convergence rate of order $\mathcal{O}(\eta^{1/4})$, where $\eta$ is the step-size. Our results shed light on the performance of SGHMC methods compared to their overdamped counterparts, e.g., stochastic gradient Langevin dynamics (SGLD). Furthermore, our results imply that SGHMC, when viewed as a nonconvex optimizer, converges to a global minimum at the best known rates.
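The abstract refers to the SGHMC iteration with step-size $\eta$. As a rough illustration of the kind of scheme being analysed, a minimal Python sketch of one common Euler-type SGHMC discretisation of underdamped (kinetic) Langevin dynamics follows; the function name `sghmc`, the friction parameter `gamma`, the default values, and the toy target are illustrative assumptions, not the paper's exact scheme or conditions.

```python
import numpy as np

def sghmc(stoch_grad, theta0, eta=1e-3, gamma=1.0, n_iters=10_000, rng=None):
    """Euler-type SGHMC iteration (a sketch, not the paper's exact scheme).

    stoch_grad(theta) is assumed to return an unbiased estimate of the
    gradient of the potential U(theta); eta is the step-size and gamma
    the friction parameter (both defaults are illustrative assumptions).
    """
    if rng is None:
        rng = np.random.default_rng()
    theta = np.asarray(theta0, dtype=float)
    v = np.zeros_like(theta)
    for _ in range(n_iters):
        theta = theta + eta * v                       # position update
        v = (v
             - eta * gamma * v                        # friction
             - eta * stoch_grad(theta)                # stochastic gradient
             + np.sqrt(2.0 * gamma * eta)             # injected Gaussian noise
               * rng.standard_normal(theta.shape))
    return theta

# Toy usage (hypothetical): Gaussian target U(theta) = ||theta||^2 / 2
# with artificially noisy gradient estimates.
rng = np.random.default_rng(0)
noisy_grad = lambda th: th + 0.1 * rng.standard_normal(th.shape)
sample = sghmc(noisy_grad, theta0=np.zeros(2), eta=1e-2, n_iters=5_000, rng=rng)
```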
Original language: English
Pages (from-to): 1-34
Journal: Journal of Machine Learning Research
Volume: 25
Issue number: 113
Publication status: Published - 31 Jan 2024

Keywords

  • math.OC
  • stat.CO
  • stat.ML
