Unsupervised Dependency Parsing: Let’s Use Supervised Parsers

Phong Le, Willem Zuidema

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We present a self-training approach to unsupervised dependency parsing that reuses existing supervised and unsupervised parsing algorithms. Our approach, called ‘iterated reranking’ (IR), starts with dependency trees generated by an unsupervised parser, and iteratively improves these trees using the richer probability models used in supervised parsing, which are in turn trained on these trees. Our system achieves accuracy 1.8% higher than the state-of-the-art parser of Spitkovsky et al. (2013) on the WSJ corpus.
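The iterated-reranking loop described above can be sketched in a few lines. This is a toy illustration, not the authors' implementation: the parser, training routine, and k-best candidate generator are hypothetical stand-ins passed in as functions, and the demo data is invented.

```python
from collections import Counter

def iterated_reranking(sentences, init_parse, train, k_best, rounds=3):
    """Self-training 'iterated reranking' (IR) sketch: seed with trees from
    an unsupervised parser, then alternate (a) training a richer model on
    the current trees and (b) reranking k-best candidates with it."""
    trees = [init_parse(s) for s in sentences]      # step 1: unsupervised seed trees
    for _ in range(rounds):
        model = train(sentences, trees)             # step 2: train on current trees
        # step 3: keep the candidate tree the trained model scores highest
        trees = [max(k_best(s), key=lambda t: model(s, t)) for s in sentences]
    return trees

# --- toy demonstration (all hypothetical) ---
# A tree is a tuple of head indices, 1-based, with 0 meaning ROOT.
sents = [["the", "dog", "barks"], ["a", "cat", "sleeps"]]
chain_parse = lambda s: tuple(range(len(s)))        # stand-in unsupervised parser

def train(sentences, trees):
    # "model": score a tree by how often its (head word, dependent word)
    # pairs were observed in the current training trees
    counts = Counter((s[h - 1] if h else "ROOT", s[i])
                     for s, t in zip(sentences, trees)
                     for i, h in enumerate(t))
    return lambda s, t: sum(counts[(s[h - 1] if h else "ROOT", s[i])]
                            for i, h in enumerate(t))

def k_best(s):
    # two hand-made candidate trees per sentence
    return [tuple(range(len(s))), (2, 3, 0)]

result = iterated_reranking(sents, chain_parse, train, k_best)
```

In the real system the seed trees, the supervised probability model, and the k-best lists come from full parsers; the loop structure, however, is the essence of the self-training idea.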
Original language: English
Title of host publication: Human Language Technologies: The 2015 Annual Conference of the North American Chapter of the ACL
Place of publication: Denver, Colorado
Publisher: Association for Computational Linguistics
Pages: 651-661
Number of pages: 11
Publication status: Published - May 2015
Event: 2015 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies - Denver, United States
Duration: 31 May 2015 - 5 Jun 2015
http://naacl.org/naacl-hlt-2015/

Conference

Conference: 2015 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Abbreviated title: NAACL HLT 2015
Country/Territory: United States
City: Denver
Period: 31/05/15 - 5/06/15
