Adaptor Grammars: A Framework for Specifying Compositional Nonparametric Bayesian Models

Mark Johnson, Thomas L. Griffiths, Sharon Goldwater

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

This paper introduces adaptor grammars, a class of probabilistic models of language that generalize probabilistic context-free grammars (PCFGs). Adaptor grammars augment the probabilistic rules of PCFGs with "adaptors" that can induce dependencies among successive uses. With a particular choice of adaptor, based on the Pitman-Yor process, nonparametric Bayesian models of language using Dirichlet processes and hierarchical Dirichlet processes can be written as simple grammars. We present a general-purpose inference algorithm for adaptor grammars, making it easy to define and use such models, and illustrate how several existing nonparametric Bayesian models can be expressed within this framework.
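The "adaptor" the abstract refers to caches previously generated subtrees so that frequent structures are reused, producing the rich-get-richer behavior of the Pitman-Yor process. As a minimal sketch (not the paper's inference algorithm), the following hypothetical Python function draws samples from a Pitman-Yor process via the two-parameter Chinese restaurant process, with discount `a` and concentration `b`; `base_draw` stands in for the underlying PCFG's distribution over subtrees:

```python
import random

def pitman_yor_draws(base_draw, a, b, n, rng):
    """Draw n values from a Pitman-Yor process over base_draw.

    a: discount parameter (0 <= a < 1); b: concentration (b > -a).
    Each draw either reuses a cached value ("table") with probability
    proportional to (count - a), or calls base_draw for a fresh value
    with probability proportional to (b + a * num_tables).
    """
    tables, counts = [], []  # cached values and their reuse counts
    out = []
    total = 0  # total draws so far
    for _ in range(n):
        k = len(tables)
        # New table with probability (b + a*k) / (total + b)
        if rng.random() * (total + b) < b + a * k:
            tables.append(base_draw(rng))
            counts.append(1)
            out.append(tables[-1])
        else:
            # Existing table j with probability proportional to counts[j] - a
            r = rng.random() * (total - a * k)
            for j, c in enumerate(counts):
                r -= c - a
                if r <= 0:
                    counts[j] += 1
                    out.append(tables[j])
                    break
            else:  # guard against floating-point underflow at the boundary
                counts[-1] += 1
                out.append(tables[-1])
        total += 1
    return out
```

With a = 0 this reduces to the Dirichlet process's Chinese restaurant process, which is how the DP-based models mentioned in the abstract arise as a special case.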
Original language: English
Title of host publication: Advances in Neural Information Processing Systems 19
Editors: B. Schölkopf, J. Platt, T. Hoffman
Place of Publication: Cambridge, MA
Publisher: MIT Press
Pages: 641-648
Number of pages: 8
Publication status: Published - 2007
