Computational Grammar Acquisition from CHILDES Data Using a Probabilistic Parsing Model

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

In this work we propose a universal model of syntactic acquisition that assumes the learner is exposed to pairs consisting of strings of word-candidates and contextually-afforded meaning representations. Previous attempts to model the learning of syntax (Siskind 1992, 1995, 1996; Villavicencio 2002; Yang 2002; Buttery 2003) have tended to adopt a "parameter setting" approach (Hyams 1986; Gibson & Wexler 1995; Fodor 1998). However, recent work on the related task of inducing a grammar from a corpus of paired English sentences and database queries (Zettlemoyer & Collins 2005, 2007; Wong & Mooney 2007; Lu et al. 2008) has shown that it is possible to learn grammars without this "switch-like" mechanism by using the structure of the meaning representation to bootstrap the syntactic learning procedure. The present paper shows that these related methods can be generalized to provide a universal model of child language acquisition. Our model is designed to be psycholinguistically plausible: the initialisation of the grammar is language-independent and should be able to learn any plausible word order, and the model learns in a sequential manner from sentence-meaning pairs. For the purposes of this paper, we present only the case of learning from unambiguous sentence-meaning pairs. However, the principles used will extend to the case of learning in the face of spurious distracting meaning candidates that are contextually supported but irrelevant to the utterance.
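The learning setting described above can be sketched in a few lines of Python. This is not the authors' probabilistic parsing model; it is a minimal illustration, under assumed toy data, of sequential learning from unambiguous sentence-meaning pairs: each utterance (a word string) arrives with a set of meaning predicates, and the learner accumulates word-predicate co-occurrence counts from which lexical mappings can be read off.

```python
# Minimal sketch (NOT the paper's model) of sequential learning from
# unambiguous (sentence, meaning) pairs: word-predicate co-occurrence
# counts are updated one utterance at a time, as in incremental learning.
from collections import defaultdict


class PairLearner:
    def __init__(self):
        # word -> predicate -> co-occurrence count
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, sentence, meaning):
        """Update counts from one (word string, meaning-predicate set) pair."""
        for word in sentence.split():
            for pred in meaning:
                self.counts[word][pred] += 1

    def best_predicate(self, word):
        """Most frequently co-occurring meaning predicate for a word."""
        preds = self.counts.get(word)
        if not preds:
            return None
        return max(preds, key=preds.get)


# Hypothetical child-directed-speech-style pairs (invented for illustration).
learner = PairLearner()
learner.observe("the dog runs", {"dog", "run"})
learner.observe("the dog sleeps", {"dog", "sleep"})
learner.observe("a cat runs", {"cat", "run"})
print(learner.best_predicate("dog"))   # -> dog
print(learner.best_predicate("runs"))  # -> run
```

A full model of the kind the abstract describes would additionally assign probabilities to syntactic rules and use the structure of the meaning representation to constrain parsing, but the sequential, pair-driven update loop is the same.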
Original language: English
Title of host publication: Psychocomputational Models of Human Language Acquisition (PsychoCompLA 2009)
Number of pages: 4
Publication status: Published - 2009


