Higher Order Langevin Monte Carlo Algorithm

Sotirios Sabanis, Ying Zhang

Research output: Contribution to journal › Article › peer-review


A new (unadjusted) Langevin Monte Carlo (LMC) algorithm with improved convergence rates in total variation and in Wasserstein distance is presented. These results are obtained in the context of sampling from a target distribution $\pi$ that has a density on $\mathbb{R}^d$ known up to a normalizing constant. Crucially, the Langevin SDE associated with the target distribution $\pi$ is assumed to have a locally Lipschitz drift coefficient whose second derivative is locally H\"{o}lder continuous with exponent $\beta \in (0,1]$. Non-asymptotic bounds are obtained for the convergence to stationarity of the new sampling method, with convergence rate $1+ \beta/2$ in Wasserstein distance, while the rate is shown to be 1 in total variation even in the absence of convexity. Finally, in the case of a Lipschitz gradient, explicit constants are provided.
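For context, the classical first-order unadjusted Langevin algorithm (ULA) that the paper improves upon discretizes the Langevin SDE $dX_t = -\nabla U(X_t)\,dt + \sqrt{2}\,dW_t$, whose invariant measure is $\pi \propto e^{-U}$. The sketch below is this baseline scheme, not the paper's higher-order algorithm; the function names and the Gaussian example target are illustrative assumptions.

```python
import numpy as np

def ula_sample(grad_U, theta0, step, n_steps, rng):
    """Unadjusted Langevin algorithm (baseline first-order scheme):
    theta_{k+1} = theta_k - step * grad_U(theta_k) + sqrt(2*step) * xi_k,
    where xi_k are i.i.d. standard Gaussian vectors."""
    theta = np.asarray(theta0, dtype=float)
    samples = np.empty((n_steps, theta.size))
    for k in range(n_steps):
        xi = rng.standard_normal(theta.size)
        theta = theta - step * grad_U(theta) + np.sqrt(2.0 * step) * xi
        samples[k] = theta
    return samples

# Illustrative target: pi = N(0, I) on R^2, i.e. U(x) = |x|^2 / 2, grad_U(x) = x.
rng = np.random.default_rng(0)
samples = ula_sample(lambda x: x, np.zeros(2), step=0.05, n_steps=20000, rng=rng)
```

After discarding a burn-in prefix, the empirical mean and variance of the chain approximate those of the target up to a discretization bias of order `step`; the paper's higher-order scheme achieves a faster rate of $1 + \beta/2$ in Wasserstein distance under the stated smoothness assumptions.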
Original language: English
Pages (from-to): 3805-3850
Number of pages: 46
Journal: Electronic Journal of Statistics
Issue number: 2
Early online date: 3 Oct 2019
Publication status: E-pub ahead of print - 3 Oct 2019


  • math.ST
  • stat.TH
  • 62L10, 65C05 (Primary)

