Edinburgh Research Explorer

Amortized Inference for Latent Feature Models Using Variational Russian Roulette

Research output: Contribution to conference › Paper

Related Edinburgh Organisations

Open Access permissions

Open

Original language: English
Number of pages: 11
Publication status: Published - 2018
Event: Bayesian Nonparametrics workshop at the thirty-second Conference on Neural Information Processing Systems - Montreal, Canada
Duration: 7 Dec 2018 - 7 Dec 2018
https://sites.google.com/view/nipsbnp2018/

Workshop

Workshop: Bayesian Nonparametrics workshop at the thirty-second Conference on Neural Information Processing Systems
Abbreviated title: BNP@NIPS 2018
Country: Canada
City: Montreal
Period: 7/12/18 - 7/12/18
Internet address: https://sites.google.com/view/nipsbnp2018/

Abstract

The Indian buffet process (IBP) provides a principled prior distribution for inferring the number of latent features in a dataset. Traditionally, inference for these models is slow on large datasets, which motivates the use of amortized neural inference methods. However, previous work on variational inference for these models requires a truncated approximation, in which the maximum number of features is fixed in advance. To address this problem, we present a new dynamic variational posterior, obtained by introducing auxiliary variables into the stick-breaking construction of the IBP. We describe how to estimate the evidence lower bound, which contains infinite sums, using Russian roulette sampling.
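The paper applies Russian roulette sampling to the IBP's evidence lower bound; as a generic illustration only (not the authors' implementation), the idea of an unbiased single-sample estimate of an infinite series can be sketched as follows. The function name and geometric continuation probability `p` are illustrative assumptions.

```python
import random

def russian_roulette_sum(term, p=0.5, rng=random.random):
    """Unbiased single-sample estimate of sum_{k=1}^inf term(k).

    Illustrative sketch: after evaluating each term we continue with
    probability p, so the probability of reaching term k is p**(k-1).
    Dividing each term by its survival probability keeps the estimator
    unbiased (term(k) must decay fast enough for finite variance).
    """
    total = 0.0
    k = 1
    survive = 1.0  # P(reaching term k) = p**(k-1)
    while True:
        total += term(k) / survive
        if rng() >= p:  # stop with probability 1 - p
            break
        survive *= p
        k += 1
    return total
```

Averaging many such estimates of, say, the geometric series `sum 0.5**k = 1` converges to the true value, even though each individual estimate truncates the series at a random finite point.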


ID: 80259335