Incremental, Predictive Parsing with Psycholinguistically Motivated Tree-Adjoining Grammar

Vera Demberg, Frank Keller, Alexander Koller

Research output: Contribution to journal › Article › peer-review

Abstract / Description of output

Psycholinguistic research shows that key properties of the human sentence processor are incrementality, connectedness (partial structures contain no unattached nodes), and prediction (upcoming syntactic structure is anticipated). However, there is currently no broad-coverage parsing model with these properties. In this article, we present the first broad-coverage probabilistic parser for PLTAG, a variant of TAG which supports all three requirements. We train our parser on a TAG-transformed version of the Penn Treebank and show that it achieves performance comparable to existing TAG parsers that are incremental but not predictive. We also use our PLTAG model to predict human reading times, demonstrating a better fit on the Dundee eye-tracking corpus than a standard surprisal model.
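The surprisal baseline mentioned in the abstract scores each word by the negative log of its conditional probability given the preceding context; higher surprisal is taken to predict longer reading times. A minimal sketch of that scoring (the probabilities below are hypothetical, not from the article's model):

```python
import math

def surprisal(prob: float) -> float:
    # Surprisal of a word: -log2 P(word | preceding context), in bits.
    # High surprisal (low-probability continuations) predicts longer
    # reading times in surprisal-based models.
    return -math.log2(prob)

# Hypothetical conditional probabilities P(w_i | context) for illustration.
probs = {"the": 0.5, "horse": 0.01, "raced": 0.05}
for word, p in probs.items():
    print(f"{word}: {surprisal(p):.2f} bits")
```

The PLTAG model in the article refines this idea by deriving its word-by-word probabilities from an incremental, predictive parser rather than from a string-level language model.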
Original language: English
Pages (from-to): 1025-1066
Number of pages: 42
Journal: Computational Linguistics
Volume: 39
Issue number: 4
Early online date: 20 Mar 2013
DOIs
Publication status: Published - Dec 2013

