Abstract / Description of output
Stochastic control-flow models (SCFMs) are a class of generative models that involve branching on choices from discrete random variables. Amortized gradient-based learning of SCFMs is challenging, as most approaches targeting discrete variables rely on their continuous relaxations—which can be intractable in SCFMs, since branching on relaxations requires evaluating all (exponentially many) branching paths. Tractable alternatives mainly combine REINFORCE with complex control-variate schemes to reduce the variance of naive estimators. Here, we revisit the reweighted wake-sleep (RWS) [5] algorithm and, through extensive evaluations, show that it outperforms current state-of-the-art methods in learning SCFMs. Further, in contrast to the importance weighted autoencoder, we observe that RWS learns better models and inference networks with increasing numbers of particles. Our results suggest that RWS is a competitive, often preferable, alternative for learning SCFMs.
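As a rough illustration of why RWS sidesteps the relaxation problem described above, the sketch below computes the two wake-phase losses (model and inference network) from K importance-weighted particles; because no reparameterization is needed, the latent may be discrete and drive arbitrary control flow. This is a minimal sketch under assumed interfaces—`guide.sample_and_log_prob` and `model.log_joint` are hypothetical names, not the paper's code—and the sleep-phase inference-network update is omitted.

```python
import math
import torch

def rws_losses(model, guide, x, num_particles=10):
    # Draw K particles from the inference network q(z | x); z may be a
    # discrete latent that drives control flow, since no reparameterization
    # (and hence no continuous relaxation) is required here.
    z, log_q = guide.sample_and_log_prob(x, num_particles)  # log_q: shape [K]
    log_p = model.log_joint(x, z)                            # log p(x, z): shape [K]

    # Self-normalized importance weights w_k ∝ p(x, z_k) / q(z_k | x),
    # detached so they act as constants in both losses.
    log_w = log_p - log_q
    w = torch.softmax(log_w.detach(), dim=0)

    # Wake-phase model update: maximize the multi-particle (IWAE-style)
    # lower bound on log p(x); detach log_q so only the model is trained.
    loss_theta = -(torch.logsumexp(log_p - log_q.detach(), dim=0)
                   - math.log(num_particles))

    # Wake-phase inference-network update: minimize a self-normalized
    # importance-sampling estimate of KL(p(z | x) || q(z | x)) w.r.t. q.
    loss_phi = -(w * log_q).sum()

    return loss_theta, loss_phi
```

In practice the two losses would be minimized with separate optimizers (or separate parameter groups) for the generative model and the inference network.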
Original language | English
---|---
Title of host publication | Proceedings of the 35th Conference on Uncertainty in Artificial Intelligence
Publisher | Association for Uncertainty in Artificial Intelligence (AUAI)
Number of pages | 11
Publication status | Published - 25 Jul 2019
Event | 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019, Tel Aviv, Israel, 22 Jul 2019 → 25 Jul 2019 (http://auai.org/uai2019/)
Conference
Conference | 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019
---|---
Abbreviated title | UAI 2019
Country/Territory | Israel
City | Tel Aviv
Period | 22/07/19 → 25/07/19
Internet address | http://auai.org/uai2019/
Profiles
- Siddharth N
  - School of Informatics, Reader in Explainable Artificial Intelligence
  - Artificial Intelligence and its Applications Institute
  - Data Science and Artificial Intelligence
  - Person: Academic, Research Active