A model-learner pattern for Bayesian reasoning

Andrew D. Gordon, Mihhail Aizatulin, Johannes Borgström, Guillaume Claret, Thore Graepel, Aditya V. Nori, Sriram K. Rajamani, Claudio Russo

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

A Bayesian model is based on a pair of probability distributions, known as the prior and sampling distributions. A wide range of fundamental machine learning tasks, including regression, classification, clustering, and many others, can all be seen as Bayesian models. We propose a new probabilistic programming abstraction, a typed Bayesian model, which is based on a pair of probabilistic expressions for the prior and sampling distributions. A sampler for a model is an algorithm to compute synthetic data from its sampling distribution, while a learner for a model is an algorithm for probabilistic inference on the model. Models, samplers, and learners form a generic programming pattern for model-based inference. They support the uniform expression of common tasks including model testing, and generic compositions such as mixture models, evidence-based model averaging, and mixtures of experts. A formal semantics supports reasoning about model equivalence and implementation correctness. By developing a series of examples and three learner implementations based on exact inference, factor graphs, and Markov chain Monte Carlo, we demonstrate the broad applicability of this new programming pattern.
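To make the pattern concrete, below is a minimal sketch in Python rather than the paper's typed formulation: a model is represented as a pair of a prior and a sampling (generative) distribution, a sampler draws synthetic data from it, and a learner performs inference over its parameters. The names (Model, sampler, importance_learner) and the use of self-normalised importance sampling are illustrative assumptions, not the paper's API or its exact-inference, factor-graph, or MCMC learners.

```python
# Illustrative sketch of the model-learner pattern (assumed Python rendering,
# not the paper's typed formulation). A model pairs a prior over parameters w
# with a sampling distribution over outputs y given w and an input x.
import random
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Model:
    prior: Callable[[], Any]                       # draw parameters w ~ prior
    gen: Callable[[Any, Any], Any]                 # draw output y ~ sampling dist given (w, x)
    likelihood: Callable[[Any, Any, Any], float]   # density/mass of y given (w, x)

def sampler(model: Model, x: Any):
    """Sampler: compute synthetic data from the model's sampling distribution."""
    w = model.prior()
    return w, model.gen(w, x)

def importance_learner(model: Model, xs, ys, n: int = 10_000) -> float:
    """A naive learner: posterior mean of w via self-normalised importance sampling."""
    total, weighted = 0.0, 0.0
    for _ in range(n):
        w = model.prior()
        weight = 1.0
        for x, y in zip(xs, ys):
            weight *= model.likelihood(w, x, y)
        total += weight
        weighted += weight * w
    return weighted / total if total > 0 else float("nan")

# Example: a coin whose bias w has a uniform prior and a Bernoulli sampling distribution.
coin = Model(
    prior=lambda: random.random(),
    gen=lambda w, _x: 1 if random.random() < w else 0,
    likelihood=lambda w, _x, y: w if y == 1 else 1.0 - w,
)

if __name__ == "__main__":
    _, synthetic = sampler(coin, None)   # model testing: generate synthetic data
    data = [1, 1, 0, 1, 1, 0, 1, 1]
    print("posterior mean bias ~", importance_learner(coin, [None] * len(data), data))
```

The point of the pattern, as the abstract describes, is this decoupling: the same Model value can be passed both to a sampler (for testing against synthetic data) and to interchangeable learners, and models can be composed (e.g. into mixtures) independently of the inference algorithm.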
Original language: English
Title of host publication: Proceedings of the 40th annual ACM SIGPLAN-SIGACT symposium on Principles of programming languages
Place of Publication: New York, NY, USA
Publisher: ACM
Pages: 403-416
Number of pages: 14
ISBN (Print): 978-1-4503-1832-7
DOIs
Publication status: Published - 2013

Keywords / Materials (for Non-textual outputs)

  • Bayesian reasoning, machine learning, model-learner pattern, probabilistic programming
