The Shared Logistic Normal Distribution for Grammar Induction

Shay Cohen, Noah A. Smith

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We present a family of priors over probabilistic grammar weights, called the shared logistic normal distribution. This family extends the partitioned logistic normal distribution, enabling factored covariance between the probabilities of different derivation events in the probabilistic grammar and providing a new way to encode prior knowledge about an unknown grammar. We describe a variational EM algorithm for learning a probabilistic grammar based on this family of priors. We then experiment with unsupervised dependency grammar induction and show significant improvements using our model, both for monolingual learning and for bilingual learning with a non-parallel, multilingual corpus.
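
To make the construction concrete, here is a minimal Python sketch of the generative step the abstract describes: each multinomial's weights arise from averaging normal "experts", some of which are shared across multinomials, followed by a softmax. The sizes, means, identity covariances, and the helper sample_grammar_weights are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # Numerically stable softmax: map a real vector onto the simplex.
    z = np.exp(x - x.max())
    return z / z.sum()

# Hypothetical setup: two grammar multinomials over 3 derivation events
# each, plus one shared normal "expert" whose samples are combined into
# both, softly tying their weights.
mu_local = [np.zeros(3), np.zeros(3)]   # per-multinomial Gaussian means
mu_shared = np.zeros(3)                 # shared expert mean

def sample_grammar_weights():
    # Draw one normal sample per expert (identity covariance for simplicity).
    eta_shared = rng.normal(mu_shared, 1.0)
    weights = []
    for mu in mu_local:
        eta_local = rng.normal(mu, 1.0)
        # Combine the local and shared experts by averaging, then apply
        # the logistic (softmax) transformation to get a distribution.
        weights.append(softmax((eta_local + eta_shared) / 2.0))
    return weights

theta = sample_grammar_weights()
print(theta)  # two correlated multinomials over 3 derivation events
```

Because both multinomials draw on the shared expert, their sampled weights covary; this is, in sketch form, the factored covariance across derivation events that the prior is designed to encode.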
Original language: English
Title of host publication: Proceedings of the NIPS Workshop on Speech and Language: Unsupervised Latent-Variable Models
Number of pages: 9
Publication status: Published - 2008
