Stochastic Context-Free Grammars, Regular Languages, and Newton's Method

Kousha Etessami, Alistair Stewart, Mihalis Yannakakis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We study the problem of computing the probability that a given stochastic context-free grammar (SCFG), G, generates a string in a given regular language L(D) (given by a DFA, D). This basic problem has a number of applications in statistical natural language processing, and it is also a key necessary step towards quantitative ω-regular model checking of stochastic context-free processes (equivalently, 1-exit recursive Markov chains, or stateless probabilistic pushdown processes).
We show that the probability that G generates a string in L(D) can be computed to within arbitrary desired precision in polynomial time (in the standard Turing model of computation), under a rather mild assumption about the SCFG, G, and with no extra assumption about D. We show that this assumption is satisfied for SCFGs whose rule probabilities are learned via the well-known inside-outside (EM) algorithm for maximum-likelihood estimation (a standard method for constructing SCFGs in statistical NLP and biological sequence analysis). Thus, for these SCFGs the algorithm always runs in P-time.
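To make the setting concrete, the following sketch illustrates the standard approach to this problem: build the product of the SCFG with the DFA, introducing variables x[A, q, q'] for the probability that nonterminal A derives a string taking D from state q to q', and solve the resulting monotone polynomial fixed-point system x = P(x) for its least fixed point by Newton's method started at 0. The toy grammar, DFA, and variable names below are illustrative assumptions, not taken from the paper; the paper's contribution concerns bounding the bit-size of the arithmetic so that this computation provably runs in polynomial time in the Turing model, which the sketch does not address.

import numpy as np

# Illustrative toy instance (not from the paper):
#   SCFG with one nonterminal S:  S -> S S (prob 0.4),  S -> a (prob 0.6)
#   DFA D over {a} accepting strings of even length:
#     delta(q0, a) = q1, delta(q1, a) = q0, start and accepting state q0.
# Product variables x[(q, q')] = Pr[S derives w with delta(q, w) = q'] satisfy
#   x[q][q'] = 0.4 * sum_r x[q][r] * x[r][q'] + 0.6 * [delta(q, a) == q'],
# and the desired probability is x[q0][q0].

STATES = [0, 1]
delta = {0: 1, 1: 0}            # delta(q, 'a')
p_branch, p_term = 0.4, 0.6

def idx(q, qp):
    return 2 * q + qp           # flatten the pair (q, q') into one vector index

def P(x):
    """Right-hand side of the monotone polynomial system x = P(x)."""
    y = np.zeros(4)
    for q in STATES:
        for qp in STATES:
            quad = sum(x[idx(q, r)] * x[idx(r, qp)] for r in STATES)
            y[idx(q, qp)] = p_branch * quad + p_term * (delta[q] == qp)
    return y

def jacobian(x):
    """Jacobian dP/dx at x (the system is quadratic, so this is exact)."""
    J = np.zeros((4, 4))
    for q in STATES:
        for qp in STATES:
            for r in STATES:
                J[idx(q, qp), idx(q, r)] += p_branch * x[idx(r, qp)]
                J[idx(q, qp), idx(r, qp)] += p_branch * x[idx(q, r)]
    return J

# Newton's method for the least fixed point, started at x = 0:
#   x_{k+1} = x_k + (I - P'(x_k))^{-1} (P(x_k) - x_k)
x = np.zeros(4)
for _ in range(30):
    x = x + np.linalg.solve(np.eye(4) - jacobian(x), P(x) - x)

print("Pr[G generates a string in L(D)] ~", x[idx(0, 0)])

For this toy instance the iteration converges to x[q0][q0] = 0.25, the probability that the grammar derives a string of even length; since each variable counts only finite derivations, the values also encode the grammar's termination probabilities as a by-product.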
Original language: English
Title of host publication: 40th International Colloquium, ICALP 2013, Riga, Latvia, July 8-12, 2013, Proceedings, Part II
Publisher: Springer-Verlag Berlin Heidelberg
Pages: 199-211
Number of pages: 13
ISBN (Electronic): 978-3-642-39212-2
ISBN (Print): 978-3-642-39211-5
DOIs
Publication status: Published - 2013

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer Berlin Heidelberg
Volume: 7966
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349
