Inducing Probabilistic CCG Grammars from Logical Form with Higher-Order Unification

Tom Kwiatkowski, Luke Zettlemoyer, Sharon Goldwater, Mark Steedman

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

This paper addresses the problem of learning to map sentences to logical form, given training data consisting of natural language sentences paired with logical representations of their meaning. Previous approaches have been designed for particular natural languages or specific meaning representations; here we present a more general method. The approach induces a probabilistic CCG grammar that represents the meaning of individual words and defines how these meanings can be combined to analyze complete sentences. We use higher-order unification to define a hypothesis space containing all grammars consistent with the training data, and develop an online learning algorithm that efficiently searches this space while simultaneously estimating the parameters of a log-linear parsing model. Experiments demonstrate high accuracy on benchmark data sets in four languages with two different meaning representations.
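The restricted higher-order unification at the heart of the grammar induction step can be pictured as "splitting" a logical form into a functor and an argument that recombine to the original expression. The sketch below illustrates that splitting idea on a toy nested-tuple representation of logical forms. It is only an illustrative approximation under simplifying assumptions, not the paper's procedure (which also introduces lambda abstraction and assigns CCG categories); all names such as `splits`, `apply`, and the placeholder variable are hypothetical.

```python
# Illustrative sketch of the "splitting" view of restricted higher-order
# unification: given a logical form h, enumerate pairs (functor, argument)
# whose application reproduces h. The functor is built by abstracting one
# subterm of h into a placeholder variable. Representation and names here
# are assumptions for illustration only.

VAR = "?x"  # placeholder variable introduced by abstraction


def subterms(term, path=()):
    """Yield (path, subterm) pairs for every subterm of a nested-tuple term."""
    yield path, term
    if isinstance(term, tuple):
        for i, child in enumerate(term):
            yield from subterms(child, path + (i,))


def replace(term, path, value):
    """Return a copy of `term` with the subterm at `path` replaced by `value`."""
    if not path:
        return value
    i, rest = path[0], path[1:]
    return term[:i] + (replace(term[i], rest, value),) + term[i + 1:]


def splits(logical_form):
    """Enumerate (functor, argument) splits: the functor is the logical form
    with one subterm abstracted into VAR; the argument is that subterm."""
    for path, sub in subterms(logical_form):
        if path:  # skip the trivial split that abstracts the whole term
            yield replace(logical_form, path, VAR), sub


def apply(functor, argument):
    """Substitute the argument back for the placeholder variable."""
    if functor == VAR:
        return argument
    if isinstance(functor, tuple):
        return tuple(apply(t, argument) for t in functor)
    return functor


if __name__ == "__main__":
    lf = ("likes", "john", "mary")  # likes(john, mary) as a nested tuple
    for f, g in splits(lf):
        assert apply(f, g) == lf  # every split recombines to the original form
        print(f, "+", g)
```

In the full approach, each such split proposes candidate lexical entries whose usefulness is then weighed by the log-linear parsing model during online learning.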
Original language: English
Title of host publication: Proceedings of the Conference on Empirical Methods in Natural Language Processing
Place of publication: Cambridge, MA
Publisher: Association for Computational Linguistics (ACL)
Pages: 1223-1233
Number of pages: 11
ISBN (Print): 978-1-932432-86-2
Publication status: Published - 2010
