Is Learning Summary Statistics Necessary for Likelihood-free Inference?

Yanzhi Chen, Michael U. Gutmann, Adrian Weller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

Likelihood-free inference (LFI) is a set of techniques for inference in implicit statistical models. A longstanding question in LFI has been how to design or learn good summary statistics of data, but this might now seem unnecessary due to the advent of recent end-to-end (i.e. neural network-based) LFI methods. In this work, we rethink this question with a new method for learning summary statistics. We show that learning sufficient statistics may be easier than direct posterior inference, as the former problem can be reduced to a set of low-dimensional, easy-to-solve learning problems. This suggests explicitly decoupling summary statistics learning from posterior inference in LFI. Experiments on diverse inference tasks with different data types validate our hypothesis.
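As a generic illustration of the two-stage idea described above (not the paper's own method), the sketch below first learns a summary statistic by regressing parameters on simulated data, in the spirit of semi-automatic ABC, and then runs rejection ABC with that statistic. The toy implicit model (a Gaussian with unknown mean), the linear regressor, and the acceptance threshold are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=50):
    # Toy implicit model (assumed for illustration): n i.i.d. draws
    # from a Gaussian with unknown mean theta and unit variance.
    return rng.normal(theta, 1.0, size=n)

# Stage 1: learn a summary statistic, decoupled from posterior inference.
# Regress theta on raw simulated data: a single low-dimensional
# regression problem rather than full posterior estimation.
thetas = rng.uniform(-3, 3, size=2000)           # draws from a uniform prior
X = np.stack([simulate(t) for t in thetas])      # corresponding datasets
X1 = np.hstack([X, np.ones((len(X), 1))])        # add an intercept column
w, *_ = np.linalg.lstsq(X1, thetas, rcond=None)  # least-squares fit
summary = lambda x: np.append(x, 1.0) @ w        # learned 1-D statistic

# Stage 2: plain rejection ABC using the learned statistic.
x_obs = simulate(1.5)                            # "observed" data, true theta = 1.5
s_obs = summary(x_obs)
cand = rng.uniform(-3, 3, size=20000)            # candidate parameters from the prior
s_sim = np.array([summary(simulate(t)) for t in cand])
accepted = cand[np.abs(s_sim - s_obs) < 0.1]     # keep candidates with close statistics
print(accepted.mean())                           # posterior mean, close to the true theta
```

Because the statistic here is essentially a learned (shrunken) sample mean, the accepted parameters concentrate around the true value; the point of the decoupling is that Stage 1 can be solved and validated independently of whatever inference engine is used in Stage 2.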
Original language: English
Title of host publication: Proceedings of the 40th International Conference on Machine Learning
Publisher: PMLR
Pages: 4529-4544
Number of pages: 16
Volume: 202
Publication status: Published - 10 Jul 2023
Event: The Fortieth International Conference on Machine Learning - Honolulu, United States
Duration: 23 Jul 2023 - 29 Jul 2023
Conference number: 40
https://icml.cc/

Publication series

Name: Proceedings of Machine Learning Research
Publisher: MLResearchPress
ISSN (Electronic): 2640-3498

Conference

Conference: The Fortieth International Conference on Machine Learning
Abbreviated title: ICML 2023
Country/Territory: United States
City: Honolulu
Period: 23/07/23 - 29/07/23
Internet address: https://icml.cc/

