Abstract
Bayesian inference is now widely established as one of the principal foundations for machine learning.
In practice, exact inference is rarely possible, and so a variety of approximation techniques
have been developed, one of the most widely used being a deterministic framework called variational
inference. In this paper we introduce Variational Message Passing (VMP), a general-purpose
algorithm for applying variational inference to Bayesian Networks. Like belief propagation, VMP
proceeds by sending messages between nodes in the network and updating posterior beliefs using
local operations at each node. Each such update increases a lower bound on the log evidence
(unless already at a local maximum). In contrast to belief propagation, VMP can be applied to a
very general class of conjugate-exponential models because it uses a factorised variational approximation.
Furthermore, by introducing additional variational parameters, VMP can be applied to
models containing non-conjugate distributions. The VMP framework also allows the lower bound
to be evaluated, and this can be used both for model comparison and for detection of convergence.
Variational message passing has been implemented in the form of a general-purpose inference engine
called VIBES (‘Variational Inference for BayEsian networkS’) which allows models to be
specified graphically and then solved variationally without recourse to coding.
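As a concrete illustration of the updates the abstract describes (a minimal sketch of our own, not the paper's code; the toy model, priors, and variable names are assumptions chosen for exposition), the snippet below runs factorised variational updates for a Gaussian with unknown mean and precision. Each update consumes only expectations passed from neighbouring factors, in the spirit of VMP's local message passing, and the lower bound on the log evidence is evaluated after every sweep, rising monotonically and serving as the convergence check.

```python
# Hedged sketch of factorised (mean-field) variational updates for the toy model
#   x_n ~ N(mu, 1/tau),  mu ~ N(m0, 1/lam0),  tau ~ Gamma(a0, b0)
# under q(mu, tau) = q(mu) q(tau). This is coordinate-ascent variational
# inference on a two-node model, not the VIBES engine itself.
import numpy as np
from scipy.special import digamma, gammaln

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=0.5, size=100)      # synthetic data (assumption)
N, sx, sxx = x.size, x.sum(), (x ** 2).sum()

m0, lam0, a0, b0 = 0.0, 1e-2, 1e-2, 1e-2          # broad conjugate priors
a, b = a0, b0                                      # q(tau) = Gamma(a, b)
m, lam = m0, lam0                                  # q(mu)  = N(m, 1/lam)

def elbo():
    """Lower bound L(q) <= ln p(x); increases at every local update."""
    E_tau = a / b
    E_lntau = digamma(a) - np.log(b)
    E_sq = sxx - 2 * m * sx + N * (m ** 2 + 1 / lam)   # E[sum (x_n - mu)^2]
    ll = 0.5 * N * (E_lntau - np.log(2 * np.pi)) - 0.5 * E_tau * E_sq
    lp_mu = 0.5 * (np.log(lam0) - np.log(2 * np.pi)) \
            - 0.5 * lam0 * ((m - m0) ** 2 + 1 / lam)
    lp_tau = a0 * np.log(b0) - gammaln(a0) + (a0 - 1) * E_lntau - b0 * E_tau
    h_mu = 0.5 * (1 + np.log(2 * np.pi) - np.log(lam))  # entropy of q(mu)
    h_tau = a - np.log(b) + gammaln(a) + (1 - a) * digamma(a)
    return ll + lp_mu + lp_tau + h_mu + h_tau

bound = -np.inf
for it in range(100):
    # "Message" into mu's update: the expectation E[tau] under q(tau).
    E_tau = a / b
    lam = lam0 + N * E_tau
    m = (lam0 * m0 + E_tau * sx) / lam
    # "Message" into tau's update: E[sum (x_n - mu)^2] under q(mu).
    E_sq = sxx - 2 * m * sx + N * (m ** 2 + 1 / lam)
    a = a0 + 0.5 * N
    b = b0 + 0.5 * E_sq
    new_bound = elbo()
    assert new_bound >= bound - 1e-9, "bound must not decrease"
    if new_bound - bound < 1e-8:                      # convergence via the bound
        break
    bound = new_bound

print(f"q(mu) mean = {m:.3f}, q(tau) mean = {a / b:.3f}, ELBO = {bound:.3f}")
```

The monotone-bound assertion reflects the abstract's guarantee that each local update increases the lower bound unless already at a local maximum; the same bound value would also be the quantity used for model comparison.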
| Original language | English |
| --- | --- |
| Pages (from-to) | 661-694 |
| Number of pages | 34 |
| Journal | Journal of Machine Learning Research |
| Volume | 6 |
| Publication status | Published - 2005 |