Abstract / Description of output
A weakness of standard Optimality Theory is its inability to account for grammars with free variation. We describe here the Maximum Entropy model, a general statistical model, and show how it can be applied in a constraint-based linguistic framework to model and learn grammars with free variation, as well as categorical grammars. We report the results of using the MaxEnt model for learning two different grammars: one with variation, and one without. Our results are as good as those of a previous probabilistic version of OT, the Gradual Learning Algorithm (Boersma, 1997), and we argue that our model is more general and mathematically well-motivated.
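The core idea the abstract describes can be sketched briefly: a MaxEnt grammar assigns each candidate a score equal to the negative weighted sum of its constraint violations, and a probability via a softmax over those scores. The snippet below is a minimal illustrative sketch, not the authors' implementation; the candidate names, violation counts, and weights are invented for the example.

```python
import math

def maxent_probs(candidates, weights):
    """Return the MaxEnt probability of each candidate.

    candidates: dict mapping candidate name -> list of violation counts,
                one per constraint (hypothetical illustrative data).
    weights:    list of non-negative constraint weights.

    Each candidate's score is the negative weighted sum of its
    violations; probabilities are the softmax of the scores.
    """
    scores = {c: -sum(w * v for w, v in zip(weights, viols))
              for c, viols in candidates.items()}
    z = sum(math.exp(s) for s in scores.values())  # normalizing constant
    return {c: math.exp(s) / z for c, s in scores.items()}

# Hypothetical tableau: two candidates, each violating one of two constraints.
cands = {"cand1": [1, 0], "cand2": [0, 1]}

# Equal weights and symmetric violations model free variation (50/50);
# as one weight grows, the distribution approaches a categorical grammar.
probs = maxent_probs(cands, weights=[2.0, 2.0])
# -> {'cand1': 0.5, 'cand2': 0.5}
```

With unequal weights, e.g. `weights=[5.0, 1.0]`, nearly all probability mass shifts to the candidate violating the lightly weighted constraint, which is how the same model covers both variable and categorical grammars.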
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings of the Workshop on Variation within Optimality Theory |
| Pages | 111-120 |
| Number of pages | 10 |
| Publication status | Published - 2003 |