Learning OT constraint rankings using a maximum entropy model

Sharon Goldwater, Mark Johnson

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

A weakness of standard Optimality Theory is its inability to account for grammars with free variation. We describe here the Maximum Entropy model, a general statistical model, and show how it can be applied in a constraint-based linguistic framework to model and learn grammars with free variation, as well as categorical grammars. We report the results of using the MaxEnt model for learning two different grammars: one with variation, and one without. Our results are as good as those of a previous probabilistic version of OT, the Gradual Learning Algorithm (Boersma, 1997), and we argue that our model is more general and mathematically well-motivated.
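The core idea the abstract describes can be sketched as follows: in a MaxEnt grammar, each output candidate receives a score equal to the negated weighted sum of its constraint violations, and candidate probabilities are the softmax of those scores, so free variation falls out naturally when two candidates have similar scores. This is a minimal illustrative sketch; the candidates, constraint set, and weights below are hypothetical examples, not data or parameters from the paper.

```python
import math

def maxent_probabilities(candidates, weights):
    """Compute MaxEnt probabilities over output candidates.

    candidates: dict mapping candidate -> list of constraint violation
    counts (one entry per constraint); weights: one weight per constraint.
    A candidate's score is the negated weighted violation sum; the
    probability distribution is the softmax of these scores.
    """
    scores = {cand: -sum(w * v for w, v in zip(weights, viols))
              for cand, viols in candidates.items()}
    z = sum(math.exp(s) for s in scores.values())  # normalising constant
    return {cand: math.exp(s) / z for cand, s in scores.items()}

# Hypothetical two-candidate tableau with two constraints
# (e.g. a markedness constraint and a faithfulness constraint):
candidates = {"output-a": [0, 1],   # violates constraint 2 once
              "output-b": [1, 0]}   # violates constraint 1 once
weights = [2.0, 1.0]                # illustrative weights only
probs = maxent_probabilities(candidates, weights)
```

With these illustrative weights, "output-a" (score -1) outranks "output-b" (score -2), but both receive nonzero probability, which is how the model captures free variation; a categorical grammar corresponds to weights that push one candidate's probability toward 1.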
Original language: English
Title of host publication: Proceedings of the Workshop on Variation within Optimality Theory
Number of pages: 10
Publication status: Published - 2003


