Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows

George Papamakarios, David C Sterratt, Iain Murray

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

We present Sequential Neural Likelihood (SNL), a new method for Bayesian inference in simulator models, where the likelihood is intractable but simulating data from the model is possible. SNL trains an autoregressive flow on simulated data in order to learn a model of the likelihood in the region of high posterior density. A sequential training procedure guides simulations and reduces simulation cost by orders of magnitude. We show that SNL is more robust, more accurate and requires less tuning than related neural-based methods, and we discuss diagnostics for assessing calibration, convergence and goodness-of-fit.
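The abstract's training loop (simulate at proposal parameters, fit a conditional density model of the likelihood, refocus the proposal on the surrogate posterior) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy simulator, the uniform prior, and the importance-resampling proposal step are assumptions for the sketch, and a simple linear-Gaussian conditional model stands in for the masked autoregressive flow used in the actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta):
    # Toy black-box simulator: x ~ N(theta, 1). SNL never evaluates
    # this likelihood directly; it only draws samples.
    return theta + rng.normal()

def prior_sample(n):
    # Uniform prior over [-5, 5] (an assumption for this sketch).
    return rng.uniform(-5.0, 5.0, size=n)

x_obs = 1.5                 # the observed data point
n_rounds, n_sims = 3, 200
thetas, xs = [], []

# The proposal starts at the prior and is refocused each round.
proposal = prior_sample(n_sims)

for r in range(n_rounds):
    # 1. Simulate at proposal parameters; keep ALL pairs across rounds.
    thetas.extend(proposal)
    xs.extend(simulator(t) for t in proposal)

    # 2. Fit a conditional density model q(x | theta) by maximum
    #    likelihood. A homoscedastic linear-Gaussian model stands in
    #    for the autoregressive flow here.
    T, X = np.array(thetas), np.array(xs)
    A = np.vstack([T, np.ones_like(T)]).T
    (a, b), *_ = np.linalg.lstsq(A, X, rcond=None)
    sigma = (X - (a * T + b)).std() + 1e-6

    # 3. Surrogate posterior ∝ prior(theta) * q(x_obs | theta); draw the
    #    next proposal from it by importance resampling over the prior.
    cand = prior_sample(5000)
    log_q = -0.5 * ((x_obs - (a * cand + b)) / sigma) ** 2
    w = np.exp(log_q - log_q.max())
    proposal = rng.choice(cand, size=n_sims, p=w / w.sum())

posterior_mean = proposal.mean()
```

With this toy simulator the true posterior is approximately N(x_obs, 1) under the flat prior, so the final proposal samples should concentrate near `x_obs`; the refocusing is what lets SNL spend simulations only in the region of high posterior density.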
Original language: English
Title of host publication: Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS) 2019
Place of publication: Naha, Okinawa, Japan
Publisher: PMLR
Pages: 837-848
Number of pages: 12
Publication status: Published - 18 Apr 2019
Event: 22nd International Conference on Artificial Intelligence and Statistics - Naha, Japan
Duration: 16 Apr 2019 - 18 Apr 2019
https://www.aistats.org/

Publication series

Name: Proceedings of Machine Learning Research
Publisher: PMLR
Volume: 89
ISSN (Electronic): 2640-3498

Conference

Conference: 22nd International Conference on Artificial Intelligence and Statistics
Abbreviated title: AISTATS 2019
Country/Territory: Japan
City: Naha
Period: 16/04/19 - 18/04/19

