Amortized Inference for Latent Feature Models Using Variational Russian Roulette

Kai Xu, Akash Srivastava, Charles Sutton

Research output: Contribution to conference › Paper › peer-review

Abstract / Description of output

The Indian buffet process (IBP) provides a principled prior distribution for inferring the number of latent features in a dataset. Traditionally, inference for these models is slow when applied to large datasets, which motivates the use of amortized neural inference methods. However, previous work on variational inference for these models requires a truncated approximation, in which the maximum number of features is fixed in advance. To address this problem, we present a new dynamic variational posterior by introducing auxiliary variables into the stick-breaking construction of the IBP. We describe how to estimate the evidence lower bound, which contains an infinite sum, using Russian roulette sampling.
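A rough sketch of the Russian roulette idea the abstract refers to: an infinite sum S = sum_{k>=1} a_k can be estimated without bias by drawing a random truncation point K with survival probabilities P(K >= k) > 0 and dividing each computed term by its survival probability. The Python sketch below uses a geometric stopping rule on a toy series; the function name russian_roulette_sum and the continuation probability cont_prob are illustrative assumptions, not the paper's actual estimator for the ELBO.

import numpy as np

def russian_roulette_sum(term_fn, cont_prob=0.9, rng=None):
    # Unbiased single-sample estimate of sum_{k=1}^inf term_fn(k).
    # After computing term k we continue with probability cont_prob,
    # so P(reaching term k) = cont_prob ** (k - 1); dividing each term
    # by this survival probability keeps the estimator unbiased:
    # E[ sum_{k<=K} a_k / P(K >= k) ] = sum_k a_k.
    rng = np.random.default_rng() if rng is None else rng
    total, survival, k = 0.0, 1.0, 1
    while True:
        total += term_fn(k) / survival
        if rng.random() >= cont_prob:  # roulette fires: truncate the series here
            return total
        survival *= cont_prob
        k += 1

# Toy check: sum_{k>=1} 2^{-k} = 1; the average of many estimates recovers it.
estimates = [russian_roulette_sum(lambda k: 0.5 ** k) for _ in range(10_000)]
print(np.mean(estimates))  # approximately 1.0

Averaging many such estimates recovers the true sum; the estimator's variance depends on how quickly the survival probabilities decay relative to the terms of the series.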
Original language: English
Number of pages: 11
Publication status: Published - 2018
Event: Bayesian Nonparametrics workshop at the thirty-second Conference on Neural Information Processing Systems - Montreal, Canada
Duration: 7 Dec 2018 - 7 Dec 2018
https://sites.google.com/view/nipsbnp2018/

Workshop

Workshop: Bayesian Nonparametrics workshop at the thirty-second Conference on Neural Information Processing Systems
Abbreviated title: BNP@NIPS 2018
Country/Territory: Canada
City: Montreal
Period: 7/12/18 - 7/12/18
Internet address: https://sites.google.com/view/nipsbnp2018/
