Abstract
We present a self-training approach to unsupervised dependency parsing that reuses existing supervised and unsupervised parsing algorithms. Our approach, called ‘iterated reranking’ (IR), starts with dependency trees generated by an unsupervised parser and iteratively improves these trees using the richer probability models of supervised parsers, which are in turn trained on these trees. Our system achieves accuracy 1.8% higher than the state-of-the-art parser of Spitkovsky et al. (2013) on the WSJ corpus.
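For illustration, the iterated reranking loop described in the abstract can be read as the following self-training procedure. This is a minimal sketch under assumed parser interfaces: `unsup_parser`, `SupervisedParser`, and their methods are hypothetical placeholders, not the authors' actual implementation.

```python
# Minimal sketch of the iterated reranking (IR) loop, under assumed
# parser interfaces; not the paper's actual implementation.

def iterated_reranking(sentences, unsup_parser, SupervisedParser,
                       num_iterations=5, k=10):
    """Self-training by iterated reranking (hypothetical interfaces).

    sentences        -- list of tokenised input sentences
    unsup_parser     -- object with .parse(sentence) -> dependency tree
    SupervisedParser -- class with .train(sentences, trees) and
                        .k_best(sentence, k) -> list of (tree, score) pairs
    """
    # Step 0: seed trees produced by an existing unsupervised parser.
    trees = [unsup_parser.parse(s) for s in sentences]

    for _ in range(num_iterations):
        # Train a richer supervised model on the current trees,
        # treating them as if they were gold annotations.
        model = SupervisedParser()
        model.train(sentences, trees)

        # Rerank: keep the highest-scoring candidate tree per sentence.
        trees = [max(model.k_best(s, k), key=lambda ts: ts[1])[0]
                 for s in sentences]

    return trees
```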
Original language | English |
---|---|
Title of host publication | Human Language Technologies: The 2015 Annual Conference of the North American Chapter of the ACL |
Place of Publication | Denver, Colorado |
Publisher | Association for Computational Linguistics |
Pages | 651-661 |
Number of pages | 11 |
DOIs | |
Publication status | Published - May 2015 |
Event | 2015 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies - Denver, United States; Duration: 31 May 2015 → 5 Jun 2015; http://naacl.org/naacl-hlt-2015/ |
Conference
Conference | 2015 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies |
---|---|
Abbreviated title | NAACL HLT 2015 |
Country/Territory | United States |
City | Denver |
Period | 31/05/15 → 5/06/15 |
Internet address | http://naacl.org/naacl-hlt-2015/ |