Neural Discontinuous Constituency Parsing

Milos Stanojevic, Raquel G. Alhama

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

One of the most pressing issues in discontinuous constituency transition-based parsing is that the information relevant to a parsing decision may be located in any part of the stack or the buffer. In this paper, we propose a solution to this problem by replacing the structured perceptron model with a recursive neural model that computes a global representation of the configuration, thereby allowing even the most remote parts of the configuration to influence parsing decisions. We also provide a detailed analysis of how this representation should be built out of sub-representations of its core elements (words, trees and stack). Additionally, we investigate how different types of swap oracles influence the results. Our model is the first neural discontinuous constituency parser, and it outperforms all previously published models on three out of four datasets, while on the fourth it places second by a small margin.
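The core idea in the abstract — recursively composing sub-representations of words and trees, then folding them into one global representation of the parser configuration — can be sketched as follows. This is an illustrative toy in plain Python, not the paper's actual architecture: the random embeddings, the averaging-plus-tanh composition function, and the fold over stack and buffer items are all simplifying assumptions made for the example.

```python
import math
import random

random.seed(0)
DIM = 4  # toy embedding dimension (assumption, not from the paper)

def rand_vec():
    return [random.uniform(-0.1, 0.1) for _ in range(DIM)]

# Hypothetical toy embeddings: one random vector per word and per phrase label.
word_emb = {w: rand_vec() for w in ["a", "hearing", "is", "scheduled", "today"]}
label_emb = {l: rand_vec() for l in ["NP", "VP", "S"]}

def compose(label, children):
    """Build a tree-node vector from its children's vectors (toy composition)."""
    avg = [sum(c[i] for c in children) / len(children) for i in range(DIM)]
    return [math.tanh(avg[i] + label_emb[label][i]) for i in range(DIM)]

def tree_vec(tree):
    # A tree is either a word (str) or a pair (label, [subtrees]).
    if isinstance(tree, str):
        return word_emb[tree]
    label, kids = tree
    return compose(label, [tree_vec(k) for k in kids])

def config_vec(stack, buffer):
    """Fold every stack tree and buffer word into one global configuration
    vector, so even the most remote items can influence the decision."""
    h = [0.0] * DIM
    for item in [tree_vec(t) for t in stack] + [word_emb[w] for w in buffer]:
        h = [math.tanh(h[i] + item[i]) for i in range(DIM)]
    return h

# Example configuration: two partial trees on the stack, one word buffered.
stack = [("NP", ["a", "hearing"]), ("VP", ["is", "scheduled"])]
buffer = ["today"]
g = config_vec(stack, buffer)
print(len(g))  # one DIM-dimensional vector summarizes the whole configuration
```

In the paper this role is played by a recursive neural network over the configuration; a classifier over `g` would then score the next transition (shift, reduce, or swap).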
Original language: English
Title of host publication: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Place of publication: Copenhagen, Denmark
Publisher: Association for Computational Linguistics (ACL)
Number of pages: 11
Publication status: Published - 11 Sept 2017
Event: EMNLP 2017: Conference on Empirical Methods in Natural Language Processing - Copenhagen, Denmark
Duration: 7 Sept 2017 - 11 Sept 2017


Conference: EMNLP 2017: Conference on Empirical Methods in Natural Language Processing
Abbreviated title: EMNLP 2017


