Sequential Neural Methods for Likelihood-free Inference

Conor Durkan, George Papamakarios, Iain Murray

Research output: Contribution to conference › Paper › peer-review

Abstract

Likelihood-free inference refers to inference when a likelihood function cannot be explicitly evaluated, which is often the case for models based on simulators. While much of the literature is concerned with sample-based ‘Approximate Bayesian Computation’ methods, recent work suggests that approaches relying on deep neural conditional density estimators can obtain state-of-the-art results with fewer simulations. The neural approaches vary in how they choose which simulations to run and what they learn: an approximate posterior or a surrogate likelihood. This work provides some direct controlled comparisons between these choices.
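For intuition only, the sketch below (not taken from the paper) illustrates the generic sequential loop these neural methods share: draw parameters from a proposal, run the simulator, fit a conditional density estimator on the resulting pairs, and use the fit to focus the next round of simulations. The toy simulator, the linear-Gaussian "estimator" standing in for a neural conditional density estimator, and all names are illustrative assumptions. The posterior-targeting variant is shown; a surrogate-likelihood variant would instead model the simulated data given the parameters.

    # Conceptual sketch of a sequential likelihood-free inference loop.
    # Everything here is a stand-in: a real method would use a neural
    # conditional density estimator and correct for the proposal distribution.
    import numpy as np

    rng = np.random.default_rng(0)

    def simulator(theta):
        """Toy simulator: Gaussian data whose mean is the parameter."""
        return theta + rng.normal(scale=0.5)

    def fit_conditional_gaussian(inputs, targets):
        """Stand-in for a neural density estimator: a linear-Gaussian fit
        of targets given inputs via least squares."""
        X = np.column_stack([np.ones_like(inputs), inputs])
        coef, *_ = np.linalg.lstsq(X, targets, rcond=None)
        resid_std = (targets - X @ coef).std()
        return coef, resid_std

    x_obs = 1.3                            # observed data point
    proposal_mean, proposal_std = 0.0, 3.0  # start from a broad proposal

    for round_idx in range(3):
        # 1. Simulate with parameters drawn from the current proposal.
        thetas = rng.normal(proposal_mean, proposal_std, size=500)
        xs = np.array([simulator(t) for t in thetas])

        # 2. Posterior-targeting variant: model theta given x, then
        #    evaluate the fit at the observed data.
        coef, sigma = fit_conditional_gaussian(xs, thetas)
        post_mean = coef[0] + coef[1] * x_obs

        # 3. Reuse the current estimate as the next round's proposal,
        #    concentrating simulations near plausible parameters.
        proposal_mean, proposal_std = post_mean, sigma
        print(f"round {round_idx}: mean ~ {post_mean:.2f}, std ~ {sigma:.2f}")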
Original language: English
Pages: 1-9
Number of pages: 9
Publication status: Published - 2018
Event: Third workshop on Bayesian Deep Learning 2018, Montréal, Canada
Duration: 7 Dec 2018 – 7 Dec 2018
http://bayesiandeeplearning.org/

Conference

Conference: Third workshop on Bayesian Deep Learning 2018
Abbreviated title: NIPS 2018 Workshop
Country/Territory: Canada
City: Montréal
Period: 7/12/18 – 7/12/18
Internet address: http://bayesiandeeplearning.org/
