Dependency Parsing as Head Selection

Xingxing Zhang, Jianpeng Cheng, Maria Lapata

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Conventional graph-based dependency parsers guarantee a tree structure both during training and inference. Instead, we formalize dependency parsing as the problem of independently selecting the head of each word in a sentence. Our model, which we call DENSE (as shorthand for Dependency Neural Selection), produces a distribution over possible heads for each word using features obtained from a bidirectional recurrent neural network. Without enforcing structural constraints during training, DENSE generates (at inference time) trees for the overwhelming majority of sentences, while non-tree outputs can be adjusted with a maximum spanning tree algorithm. We evaluate DENSE on four languages (English, Chinese, Czech, and German) with varying degrees of non-projectivity. Despite the simplicity of the approach, our parsers are on par with the state of the art.
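The head-selection step described in the abstract can be sketched as follows. This is a minimal illustration with toy scores, not the paper's BiLSTM-based scorer; the function name, the score-matrix layout, and the random features are assumptions made here for demonstration:

```python
import math
import random

def select_heads(scores):
    """Independently pick the most probable head for each word.

    scores[h][d] is the score of candidate head h (0 = ROOT,
    1..n = words) for the (d+1)-th word; a word may not head itself.
    Returns one head index per word.
    """
    n = len(scores[0])
    heads = []
    for d in range(n):
        # Mask the self-attachment score, then softmax over heads.
        col = [scores[h][d] if h != d + 1 else float("-inf")
               for h in range(n + 1)]
        m = max(col)
        exp = [math.exp(s - m) for s in col]
        z = sum(exp)
        probs = [e / z for e in exp]
        heads.append(max(range(n + 1), key=probs.__getitem__))
    return heads  # may contain cycles; repair with a maximum spanning tree

# Toy usage: a 3-word sentence with random head/dependent scores.
random.seed(0)
scores = [[random.gauss(0, 1) for _ in range(3)] for _ in range(4)]
print(select_heads(scores))
```

Because each head is chosen independently, the output is not guaranteed to be a tree; as the abstract notes, the rare non-tree outputs can be adjusted afterwards with a maximum spanning tree algorithm.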
Original language: English
Title of host publication: Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 1, Long Papers
Publisher: Association for Computational Linguistics (ACL)
Number of pages: 12
ISBN (Print): 978-1-945626-34-0
Publication status: Published - 7 Apr 2017
Event: 15th EACL 2017 Software Demonstrations - Valencia, Spain
Duration: 3 Apr 2017 – 7 Apr 2017


Conference: 15th EACL 2017 Software Demonstrations
Abbreviated title: EACL 2017


