Neural Approximate Sufficient Statistics for Implicit Models

Yanzhi Chen, Dinghuai Zhang, Michael U. Gutmann, Aaron Courville, Zhanxing Zhu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

We consider the fundamental problem of how to automatically construct summary statistics for implicit generative models, where evaluating the likelihood function is intractable but sampling / simulating data from the model is possible. The idea is to frame the task of constructing sufficient statistics as learning a mutual-information-maximizing representation of the data. This representation is computed by a deep neural network trained with a joint statistic-posterior learning strategy. We apply our approach to both traditional approximate Bayesian computation (ABC) and recent neural likelihood methods, boosting their performance on a range of tasks.
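
As a rough illustration of the core idea described above (not the authors' implementation), the sketch below learns a summary network s(x) by maximizing an InfoNCE-style lower bound on the mutual information I(theta; s(x)) over simulated parameter-data pairs. The toy Gaussian simulator, the network sizes, and the choice of the InfoNCE estimator (rather than whichever mutual-information estimator the paper uses) are all assumptions made for illustration only.

    # Hedged sketch: learn summary statistics s(x) for an implicit model
    # by maximizing an InfoNCE lower bound on I(theta; s(x)).
    # Simulator, architectures, and hyperparameters are illustrative.
    import torch
    import torch.nn as nn

    def simulate(n):
        # Toy implicit model (assumption): theta ~ N(0, 1),
        # x | theta = 10 i.i.d. draws from N(theta, 1).
        theta = torch.randn(n, 1)
        x = theta + torch.randn(n, 10)
        return theta, x

    summary_net = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 2))
    critic = nn.Sequential(nn.Linear(2 + 1, 64), nn.ReLU(), nn.Linear(64, 1))
    opt = torch.optim.Adam(
        list(summary_net.parameters()) + list(critic.parameters()), lr=1e-3
    )

    for step in range(2000):
        theta, x = simulate(256)
        s = summary_net(x)  # candidate summary statistics
        n = theta.shape[0]
        # Score every (theta_i, s_j) pair: diagonal entries are jointly
        # sampled pairs, off-diagonal entries serve as negatives.
        theta_tiled = theta.unsqueeze(1).expand(n, n, 1)
        s_tiled = s.unsqueeze(0).expand(n, n, 2)
        scores = critic(torch.cat([s_tiled, theta_tiled], dim=-1)).squeeze(-1)
        # Negative InfoNCE bound (up to the constant log n).
        loss = -(scores.diag() - scores.logsumexp(dim=1)).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

In a likelihood-free pipeline, the trained s(x) would then serve as the summary statistic for ABC distance computations or as the conditioning input of a neural likelihood estimator; the paper's joint statistic-posterior training loop is not reproduced here.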
Original language: English
Title of host publication: Ninth International Conference on Learning Representations (ICLR 2021)
Number of pages: 14
Publication status: Published - 4 May 2021
Event: Ninth International Conference on Learning Representations 2021 - Virtual Conference
Duration: 4 May 2021 – 7 May 2021
https://iclr.cc/Conferences/2021/Dates

Conference

Conference: Ninth International Conference on Learning Representations 2021
Abbreviated title: ICLR 2021
City: Virtual Conference
Period: 4/05/21 – 7/05/21
Internet address: https://iclr.cc/Conferences/2021/Dates

Keywords

  • likelihood-free inference
  • Bayesian inference
  • mutual information
  • representation learning
