The Inside-Outside Recursive Neural Network model for Dependency Parsing

Phong Le, Willem Zuidema

Research output: Chapter in Book/Report/Conference proceeding (Conference contribution)

Abstract / Description of output

We propose the first implementation of an infinite-order generative dependency model. The model is based on a new recursive neural network architecture, the Inside-Outside Recursive Neural Network. This architecture allows information to flow not only bottom-up, as in traditional recursive neural networks, but also top-down. This is achieved by computing content as well as context representations for any constituent, and letting these representations interact. Experimental results on the English section of the Universal Dependency Treebank show that the infinite-order model achieves a perplexity seven times lower than the traditional third-order model using counting, and tends to choose more accurate parses in k-best lists. In addition, reranking with this model achieves state-of-the-art unlabelled attachment scores and unlabelled exact match scores.
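The core idea in the abstract — a bottom-up "inside" (content) representation and a top-down "outside" (context) representation for every constituent — can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's actual model: the hidden size, weight matrices, tanh nonlinearity, and binary-tree assumption are all placeholder assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4  # illustrative hidden size (placeholder, not the paper's setting)

# Hypothetical parameters; the paper's actual parameterisation differs.
W_inside = rng.standard_normal((D, 2 * D)) * 0.1
W_outside = rng.standard_normal((D, 2 * D)) * 0.1

class Node:
    def __init__(self, word_vec=None, children=()):
        self.word_vec = word_vec       # leaf embedding
        self.children = list(children)
        self.inside = None             # content representation (bottom-up)
        self.outside = None            # context representation (top-down)

def compute_inside(node):
    """Bottom-up pass: a node's content comes from its children."""
    if not node.children:
        node.inside = node.word_vec
    else:
        for c in node.children:
            compute_inside(c)
        # Combine the two children (binary tree assumed for brevity).
        combined = np.concatenate([node.children[0].inside,
                                   node.children[1].inside])
        node.inside = np.tanh(W_inside @ combined)
    return node.inside

def compute_outside(node, parent_outside, sibling_inside):
    """Top-down pass: a node's context mixes its parent's context
    with its sibling's content -- this is where the two directions interact."""
    combined = np.concatenate([parent_outside, sibling_inside])
    node.outside = np.tanh(W_outside @ combined)
    if len(node.children) == 2:
        left, right = node.children
        compute_outside(left, node.outside, right.inside)
        compute_outside(right, node.outside, left.inside)

# Tiny example tree: ((w1 w2) w3)
leaves = [Node(rng.standard_normal(D)) for _ in range(3)]
root = Node(children=[Node(children=leaves[:2]), leaves[2]])

compute_inside(root)
root.outside = np.zeros(D)  # empty context at the root
compute_outside(root.children[0], root.outside, root.children[1].inside)
compute_outside(root.children[1], root.outside, root.children[0].inside)
# Every node now carries both a content and a context vector.
```

After both passes, each constituent has an inside vector summarising what it contains and an outside vector summarising the rest of the sentence around it, which is what lets a generative model condition on unbounded (infinite-order) context.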
Original language: English
Title of host publication: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Place of Publication: Doha, Qatar
Publisher: Association for Computational Linguistics
Pages: 729-739
Number of pages: 11
DOIs
Publication status: Published - Oct 2014
Event: 2014 Conference on Empirical Methods in Natural Language Processing - Doha, Qatar
Duration: 25 Oct 2014 - 29 Oct 2014
http://emnlp2014.org/

Conference

Conference: 2014 Conference on Empirical Methods in Natural Language Processing
Abbreviated title: EMNLP 2014
Country/Territory: Qatar
City: Doha
Period: 25/10/14 - 29/10/14
Internet address: http://emnlp2014.org/
