Abstract / Description of output
Approximate Bayesian computation (ABC) is a set of techniques for Bayesian inference when the likelihood is intractable but sampling from the model is possible. This work presents a simple yet effective ABC algorithm that combines two classical ABC approaches: regression ABC and sequential ABC. The key idea is that, rather than learning the posterior directly, we first target an auxiliary distribution that can be learned accurately by existing methods, and from it we then learn the desired posterior with the help of a Gaussian copula. During this process, the complexity of the model adapts to the data at hand. Experiments on a synthetic dataset as well as three real-world inference tasks demonstrate that the proposed method is fast, accurate, and easy to use.
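For readers who want a concrete picture of the Gaussian-copula step mentioned in the abstract, the sketch below fits a Gaussian copula to a set of posterior samples and then draws new samples from the fitted model. It is a minimal illustration under stated assumptions, not the paper's algorithm: the samples `theta` are synthetic stand-ins for ABC output, the marginals are handled with empirical CDFs and quantiles, and only NumPy/SciPy are assumed.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Stand-in for ABC posterior samples of a 2-D parameter (hypothetical data,
# not from the paper): a correlated Gaussian just to exercise the machinery.
theta = rng.multivariate_normal([0.0, 1.0], [[1.0, 0.6], [0.6, 1.0]], size=2000)
n, d = theta.shape

# Step 1: map each marginal to normal scores through its empirical CDF.
ranks = stats.rankdata(theta, axis=0)   # ranks 1..n per column
u = ranks / (n + 1.0)                   # empirical CDF values in (0, 1)
z = stats.norm.ppf(u)                   # normal scores

# Step 2: a Gaussian copula is fully determined by the correlation of the scores.
R = np.corrcoef(z, rowvar=False)

# Step 3: sample from the copula model by drawing correlated normal scores and
# pushing them back through the empirical marginal quantile functions.
z_new = rng.multivariate_normal(np.zeros(d), R, size=2000)
u_new = stats.norm.cdf(z_new)
theta_new = np.column_stack(
    [np.quantile(theta[:, j], u_new[:, j]) for j in range(d)]
)

print("estimated copula correlation:\n", np.round(R, 3))
```

In the proposed method the copula ties together marginals that have already been learned accurately; the empirical marginals used here are only a placeholder for whatever marginal estimates are available.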
Original language | English |
---|---|
Title of host publication | Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS 2019) |
Place of Publication | Naha, Okinawa, Japan |
Publisher | PMLR |
Pages | 1584-1592 |
Number of pages | 14 |
Volume | 89 |
Publication status | Published - 25 Apr 2019 |
Event | 22nd International Conference on Artificial Intelligence and Statistics, Naha, Japan, 16 Apr 2019 → 18 Apr 2019 (https://www.aistats.org/) |
Publication series
Name | Proceedings of Machine Learning Research |
---|---|
Publisher | PMLR |
Volume | 89 |
ISSN (Electronic) | 2640-3498 |
Conference
Conference | 22nd International Conference on Artificial Intelligence and Statistics |
---|---|
Abbreviated title | AISTATS 2019 |
Country/Territory | Japan |
City | Naha |
Period | 16/04/19 → 18/04/19 |
Internet address | https://www.aistats.org/ |