Approximating Posterior Distributions in Belief Networks Using Mixtures

Christopher M. Bishop, Neil Lawrence, Tommi Jaakkola, Michael I. Jordan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Exact inference in densely connected Bayesian networks is computationally intractable, and so there is considerable interest in developing effective approximation schemes. One approach which has been adopted is to bound the log likelihood using a mean-field approximating distribution. While this leads to a tractable algorithm, the mean-field distribution is assumed to be factorial and hence unimodal. In this paper we demonstrate the feasibility of using a richer class of approximating distributions based on mixtures of mean-field distributions. We derive an efficient algorithm for updating the mixture parameters and apply it to the problem of learning in sigmoid belief networks. Our results demonstrate a systematic improvement over simple mean-field theory as the number of mixture components is increased.
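The abstract's key limitation of mean-field theory, and the motivation for mixtures, can be illustrated with a small sketch (this is an illustration of the general idea, not the paper's update algorithm; the toy distributions and means below are invented for the example): a fully factorized Bernoulli distribution over binary units is unimodal, whereas a mixture of such factorial distributions can place mass on two well-separated posterior modes.

```python
def factorial_prob(state, means):
    """P(state) under a fully factorized (mean-field) Bernoulli distribution."""
    p = 1.0
    for s, m in zip(state, means):
        p *= m if s == 1 else (1.0 - m)
    return p

def mixture_prob(state, weights, component_means):
    """P(state) under a mixture of factorial distributions."""
    return sum(w * factorial_prob(state, mu)
               for w, mu in zip(weights, component_means))

# Hypothetical bimodal posterior with modes at (1,1,1) and (0,0,0).
# A single factorial fit centred between them spreads mass over all states:
single = [0.5, 0.5, 0.5]
# A two-component mixture can concentrate mass on both modes at once:
weights = [0.5, 0.5]
components = [[0.9, 0.9, 0.9], [0.1, 0.1, 0.1]]

p_single = factorial_prob((1, 1, 1), single)          # 0.125
p_mix = mixture_prob((1, 1, 1), weights, components)  # 0.365
```

Here the mixture assigns roughly three times more probability to the mode than the single factorial distribution does, which is the qualitative effect the paper exploits: each mixture component remains tractable and factorial, but their combination is multimodal.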
Original language: English
Title of host publication: Proceedings of the 1997 Conference on Advances in Neural Information Processing Systems 10
Publisher: MIT Press
Pages: 416-422
Number of pages: 7
ISBN (Print): 0-262-10076-2
Publication status: Published - 1998
