Conditional Sampling of Variational Autoencoders via Iterated Approximate Ancestral Sampling

Vaidotas Simkus, Michael U. Gutmann

Research output: Contribution to journal › Article › peer-review

Abstract

Conditional sampling of variational autoencoders (VAEs) is needed in various applications, such as missing data imputation, but is computationally intractable. A principled choice for asymptotically exact conditional sampling is Metropolis-within-Gibbs (MWG). However, we observe that the tendency of VAEs to learn a structured latent space, a commonly desired property, can cause the MWG sampler to get “stuck” far from the target distribution. This paper mitigates the limitations of MWG: we systematically outline the pitfalls in the context of VAEs, propose two original methods that address these pitfalls, and demonstrate improved performance of the proposed methods on a set of sampling tasks.
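As a rough illustration of the Metropolis-within-Gibbs scheme the abstract refers to, the sketch below imputes missing entries under a toy linear-Gaussian stand-in for a trained VAE: a Metropolis step on the latent z uses the (deliberately imperfect) amortized encoder as an independence proposal, then the missing pixels are resampled from the decoder. All parameters, the broadened encoder, and helper names here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "VAE": linear-Gaussian decoder x = W z + noise, standard normal prior on z.
# (Illustrative stand-in for a trained VAE.)
d_z, d_x = 2, 4
W = rng.normal(size=(d_x, d_z))
sigma_x = 0.1

def log_prior(z):
    return -0.5 * np.sum(z ** 2)

def log_lik(x, z):
    return -0.5 * np.sum((x - W @ z) ** 2) / sigma_x ** 2

# Amortized encoder q(z|x). For this toy model the exact posterior is Gaussian;
# we broaden its std to mimic an imperfect learned encoder.
S = np.linalg.inv(np.eye(d_z) + W.T @ W / sigma_x ** 2)

def enc(x):
    mu = S @ W.T @ x / sigma_x ** 2
    return mu, 1.5 * np.sqrt(np.diag(S))  # broadened std

def log_q(z, mu, std):
    return -0.5 * np.sum(((z - mu) / std) ** 2) - np.sum(np.log(std))

def mwg_impute(x, miss, n_iter=2000):
    """Metropolis-within-Gibbs imputation: Metropolis step on z with the
    encoder as independence proposal, then exact x_m | z from the decoder."""
    x = x.copy()
    x[miss] = 0.0  # crude initialization of the missing entries
    mu, std = enc(x)
    z = mu + std * rng.normal(size=d_z)
    for _ in range(n_iter):
        mu, std = enc(x)
        z_prop = mu + std * rng.normal(size=d_z)
        log_a = (log_lik(x, z_prop) + log_prior(z_prop) + log_q(z, mu, std)
                 - log_lik(x, z) - log_prior(z) - log_q(z_prop, mu, std))
        if np.log(rng.uniform()) < log_a:
            z = z_prop
        # Gibbs step: resample only the missing coordinates from the decoder.
        x[miss] = (W @ z)[miss] + sigma_x * rng.normal(size=miss.sum())
    return x

x_true = W @ rng.normal(size=d_z) + sigma_x * rng.normal(size=d_x)
miss = np.array([True, False, False, False])
x_imp = mwg_impute(x_true, miss)
```

The paper's observation is that when the encoder proposal is a poor match for the true posterior in a highly structured latent space, the acceptance step above rejects almost every proposal and the chain gets stuck; the proposed methods target exactly that failure mode.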
Original language: English
Pages (from-to): 1-35
Number of pages: 35
Journal: Transactions on Machine Learning Research
Issue number: 11
Publication status: Published - 8 Nov 2023

