Empirical Risk Minimization for Probabilistic Grammars: Sample Complexity and Hardness of Learning

S. B. Cohen, N. A. Smith

Research output: Contribution to journal › Article › peer-review

Abstract / Description of output

Probabilistic grammars are generative statistical models that are useful for modeling compositional and sequential structures. They are used ubiquitously in computational linguistics. We present a framework, reminiscent of structural risk minimization, for empirical risk minimization of probabilistic grammars using the log-loss. We derive sample complexity bounds in this framework that apply to both the supervised and the unsupervised settings. By making assumptions about the underlying distribution that are appropriate for natural language scenarios, we are able to derive distribution-dependent sample complexity bounds for probabilistic grammars. We also give simple algorithms for carrying out empirical risk minimization using this framework in both the supervised and unsupervised settings. In the unsupervised case, we show that the problem of minimizing empirical risk is NP-hard. We therefore suggest an approximate algorithm, similar to expectation-maximization, to minimize the empirical risk.
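For intuition only, the sketch below illustrates supervised empirical risk minimization with the log-loss on a hypothetical toy treebank: when rule probabilities are unconstrained, the minimizer of the empirical log-loss is the relative-frequency estimate of each grammar rule. This is a minimal illustration under that assumption; it does not implement the paper's structural-risk-style framework, its distribution-dependent bounds, or the EM-like algorithm for the unsupervised case, and the data and function names (toy_treebank, estimate_rule_probabilities, empirical_log_loss) are invented for this example.

```python
from collections import defaultdict
import math

# Hypothetical toy treebank: each tree is a list of (lhs, rhs) rule applications.
toy_treebank = [
    [("S", ("NP", "VP")), ("NP", ("she",)), ("VP", ("runs",))],
    [("S", ("NP", "VP")), ("NP", ("he",)), ("VP", ("sleeps",))],
]

def estimate_rule_probabilities(treebank):
    """Relative-frequency estimates of rule probabilities: the minimizer of the
    empirical log-loss when the probabilities are otherwise unconstrained."""
    counts = defaultdict(lambda: defaultdict(int))
    for tree in treebank:
        for lhs, rhs in tree:
            counts[lhs][rhs] += 1
    probs = {}
    for lhs, rhs_counts in counts.items():
        total = sum(rhs_counts.values())
        for rhs, c in rhs_counts.items():
            probs[(lhs, rhs)] = c / total
    return probs

def empirical_log_loss(treebank, probs):
    """Empirical risk under the log-loss: average negative log-probability
    of the observed derivations."""
    total = 0.0
    for tree in treebank:
        for lhs, rhs in tree:
            total -= math.log(probs[(lhs, rhs)])
    return total / len(treebank)

probs = estimate_rule_probabilities(toy_treebank)
print(empirical_log_loss(toy_treebank, probs))
```

In the unsupervised setting the derivations are latent, so the same objective must be minimized over a marginal likelihood; as the abstract notes, that problem is NP-hard and is approached with an approximate, expectation-maximization-like procedure rather than a closed-form estimate.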
Original language: English
Pages (from-to): 479-526
Number of pages: 48
Journal: Computational Linguistics
Volume: 38
Issue number: 3
Publication status: Published - 2012
