Hamiltonian Adaptive Importance Sampling

Ali Mousavi, Reza Monsefi, Victor Elvira

Research output: Contribution to journal › Article › peer-review

Abstract

Importance sampling (IS) is a powerful Monte Carlo (MC) methodology for approximating integrals, for instance in the context of Bayesian inference. In IS, samples are simulated from a so-called proposal distribution, and the choice of this proposal is key to achieving high performance. In adaptive IS (AIS) methods, a set of proposals is iteratively improved. AIS is a relevant and timely methodology, although many limitations remain to be overcome, e.g., the curse of dimensionality in high-dimensional and multi-modal problems. Meanwhile, the Hamiltonian Monte Carlo (HMC) algorithm has become increasingly popular in machine learning and statistics. HMC has several appealing features, such as its exploratory behavior, especially in high-dimensional targets where other methods struggle. In this paper, we introduce the novel Hamiltonian adaptive importance sampling (HAIS) method. HAIS implements a two-step adaptive process with parallel HMC chains that cooperate at each iteration. The proposed HAIS efficiently adapts a population of proposals, extracting the advantages of HMC. HAIS can be understood as a particular instance of the generic layered AIS family with an additional resampling step. HAIS achieves a significant performance improvement in high-dimensional problems w.r.t. state-of-the-art algorithms. We discuss the statistical properties of HAIS and show its high performance in two challenging examples.
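As a minimal illustration of the self-normalized IS estimator that underlies AIS methods like the one described above, the sketch below estimates an expectation under a one-dimensional Gaussian target using a deliberately mismatched Gaussian proposal. The target, proposal, and test function are illustrative assumptions, not the paper's HAIS algorithm or its experiments:

```python
import numpy as np

def self_normalized_is(f, log_p, log_q, sample_q, n_samples, rng):
    """Estimate E_p[f(X)] using samples drawn from a proposal q."""
    x = sample_q(n_samples, rng)          # simulate from the proposal
    log_w = log_p(x) - log_q(x)           # log importance weights
    w = np.exp(log_w - log_w.max())       # subtract max for numerical stability
    w /= w.sum()                          # self-normalize the weights
    return np.sum(w * f(x))

rng = np.random.default_rng(0)

# Illustrative target: standard Gaussian N(0, 1).
log_p = lambda x: -0.5 * x**2 - 0.5 * np.log(2 * np.pi)
# Illustrative proposal: N(1, 2^2), intentionally mismatched to the target.
log_q = lambda x: (-0.5 * ((x - 1.0) / 2.0) ** 2
                   - np.log(2.0) - 0.5 * np.log(2 * np.pi))
sample_q = lambda n, rng: rng.normal(1.0, 2.0, size=n)

# E_p[X^2] = 1 for a standard Gaussian; the estimate should be close.
est = self_normalized_is(lambda x: x**2, log_p, log_q, sample_q, 50_000, rng)
```

In an adaptive scheme, the proposal parameters (here the mean 1.0 and scale 2.0) would be updated across iterations based on the weighted samples; in HAIS that adaptation is driven by parallel HMC chains.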
Original language: English
Pages (from-to): 713-717
Number of pages: 5
Journal: IEEE Signal Processing Letters
Volume: 28
DOIs
Publication status: Published - 26 Mar 2021
