Abstract
Many generative models can be expressed as a differentiable function of random inputs drawn from some simple probability density. This framework includes both deep generative architectures such as Variational Autoencoders and a large class of procedurally defined simulator models. We present a method for performing efficient MCMC inference in such models when conditioning on observations of the model output. For some models this offers an asymptotically exact inference method where Approximate Bayesian Computation might otherwise be employed. We use the intuition that inference corresponds to integrating a density across the manifold corresponding to the set of inputs consistent with the observed outputs. This motivates the use of a constrained variant of Hamiltonian Monte Carlo which leverages the smooth geometry of the manifold to coherently move between inputs exactly consistent with observations. We validate the method by performing inference tasks in a diverse set of models.
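The core idea above, moving coherently between latent inputs that are exactly consistent with the observed outputs, can be illustrated with a small sketch. This is not the paper's implementation: the generator, step sizes, and the simplified RATTLE-style integrator (flat density on the manifold, Newton projection back onto the constraint set) are illustrative assumptions only.

```python
import numpy as np

def generator(u):
    # Toy differentiable generator: maps a 2-D latent input to a scalar
    # output. (Hypothetical stand-in for a simulator or decoder network.)
    return u[0]**2 + u[1]**2

def constraint(u, y_obs):
    # Inputs exactly consistent with the observation lie on the
    # manifold {u : generator(u) = y_obs}.
    return generator(u) - y_obs

def jacobian(u):
    # Gradient of the scalar generator output w.r.t. the latent input.
    return np.array([2.0 * u[0], 2.0 * u[1]])

def project(u, y_obs, tol=1e-10, max_iter=50):
    # Newton iterations projecting a proposed point back onto the
    # constraint manifold.
    for _ in range(max_iter):
        c = constraint(u, y_obs)
        if abs(c) < tol:
            break
        J = jacobian(u)
        u = u - J * c / (J @ J)
    return u

def constrained_leapfrog(u, p, y_obs, eps=0.05, n_steps=20):
    # Simplified RATTLE-style integrator: positions stay on the
    # manifold, momenta stay in its tangent space. For clarity the
    # target density on the manifold is taken as flat, so there is no
    # potential-gradient term in the momentum update.
    for _ in range(n_steps):
        u_new = project(u + eps * p, y_obs)
        p = (u_new - u) / eps            # momentum consistent with the move made
        u = u_new
        J = jacobian(u)
        p = p - J * (J @ p) / (J @ J)    # remove the normal component
    return u, p
```

With `generator` as above and observation `y_obs = 1.0`, the manifold is the unit circle; starting on it with a tangent momentum, the integrator traces along the circle while keeping the constraint satisfied to numerical tolerance. A full sampler would add a Metropolis accept step and momentum resampling.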
Original language | English |
---|---|
Title of host publication | Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS) 2017 |
Publisher | PMLR |
Pages | 499-508 |
Number of pages | 10 |
Publication status | Published - 22 Apr 2017 |
Event | 20th International Conference on Artificial Intelligence and Statistics, Fort Lauderdale, United States. Duration: 20 Apr 2017 → 22 Apr 2017. https://www.aistats.org/aistats2017/ |
Publication series
Name | Proceedings of Machine Learning Research |
---|---|
Publisher | PMLR |
Volume | 54 |
ISSN (Electronic) | 2640-3498 |
Conference
Conference | 20th International Conference on Artificial Intelligence and Statistics |
---|---|
Abbreviated title | AISTATS 2017 |
Country/Territory | United States |
City | Fort Lauderdale |
Period | 20/04/17 → 22/04/17 |
Internet address | https://www.aistats.org/aistats2017/ |