A Bayesian Mixture Model for PoS Induction Using Multiple Features

Christos Christodoulopoulos, Sharon Goldwater, Mark Steedman

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

In this paper we present a fully unsupervised syntactic class induction system formulated as a Bayesian multinomial mixture model in which each word type is constrained to belong to a single class. By using a mixture model rather than a sequence model (e.g., an HMM), we are able to easily add multiple kinds of features, both at the type level (morphology features) and at the token level (context and alignment features, the latter from parallel corpora). Using only context features, our system yields results comparable to the state of the art, and far better than those of a similar model without the one-class-per-type constraint. The additional features provide added benefit, and our final system outperforms the best published results on most of the 25 corpora tested.
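To illustrate the type-level mixture idea described in the abstract (this is a minimal sketch, not the authors' implementation), the following collapsed Gibbs sampler assigns each word type a single class under a Dirichlet-multinomial mixture over context-word counts. The toy context counts, the number of classes K, and the hyperparameters alpha and beta are all invented for the example; the real system additionally incorporates morphology and alignment features.

```python
import math
import random
from collections import defaultdict

random.seed(0)

# Toy data: a bag of context-word counts per word TYPE. In a real setting
# these would be collected from left/right neighbours in a corpus; the
# numbers here are invented purely for illustration.
contexts = {
    "dog":  {"the": 4, "a": 3, "barks": 2},
    "cat":  {"the": 5, "a": 2, "purrs": 2},
    "runs": {"dog": 3, "cat": 2, "fast": 3},
    "eats": {"dog": 2, "cat": 3, "food": 3},
}

K = 2          # number of syntactic classes (assumed hyperparameter)
alpha = 1.0    # symmetric Dirichlet prior on class proportions
beta = 0.5     # symmetric Dirichlet prior on class-conditional features

vocab = sorted({f for feats in contexts.values() for f in feats})
V = len(vocab)

# One class per word type: z maps each type to its current class.
z = {w: random.randrange(K) for w in contexts}

def counts_excluding(word):
    """Class sizes and per-class feature counts with `word` held out."""
    size = [0] * K
    feat = [defaultdict(int) for _ in range(K)]
    tot = [0] * K
    for w, k in z.items():
        if w == word:
            continue
        size[k] += 1
        for f, c in contexts[w].items():
            feat[k][f] += c
            tot[k] += c
    return size, feat, tot

def log_post(word, k, size, feat, tot):
    """log P(z_word = k | rest): class prior plus the collapsed
    Dirichlet-multinomial predictive over the word's feature tokens."""
    lp = math.log(size[k] + alpha)
    n = 0
    for f, c in contexts[word].items():
        for j in range(c):
            lp += math.log(feat[k][f] + j + beta)
            lp -= math.log(tot[k] + n + V * beta)
            n += 1
    return lp

def gibbs_sweep():
    """Resample the class of every word type once."""
    for w in contexts:
        size, feat, tot = counts_excluding(w)
        lps = [log_post(w, k, size, feat, tot) for k in range(K)]
        m = max(lps)
        probs = [math.exp(lp - m) for lp in lps]
        r = random.uniform(0, sum(probs))
        acc = 0.0
        for k, p in enumerate(probs):
            acc += p
            if r <= acc:
                z[w] = k
                break
        else:  # guard against floating-point underflow at the boundary
            z[w] = K - 1

for _ in range(50):
    gibbs_sweep()

print(z)
```

Because the class variable attaches to the word type rather than to each token, words with similar context distributions ("dog"/"cat" vs. "runs"/"eats" above) tend to be grouped into the same class, which is the one-class-per-type constraint the abstract credits for the improvement over an unconstrained mixture.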
Original language: English
Title of host publication: Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing
Place of publication: Edinburgh, Scotland, UK
Publisher: Association for Computational Linguistics
Pages: 638-647
Number of pages: 10
Publication status: Published - 1 Jul 2011
