Neural Discontinuous Constituency Parsing

Milos Stanojevic, Raquel G. Alhama

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

One of the most pressing issues in discontinuous constituency transition-based parsing is that the information relevant to a parsing decision may be located anywhere in the stack or the buffer. In this paper, we propose a solution to this problem by replacing the structured perceptron model with a recursive neural model that computes a global representation of the configuration, thereby allowing even the most remote parts of the configuration to influence the parsing decisions. We also provide a detailed analysis of how this representation should be built out of sub-representations of its core elements (words, trees and stack). Additionally, we investigate how different types of swap oracles influence the results. Our model is the first neural discontinuous constituency parser, and it outperforms all previously published models on three out of four datasets, while on the fourth it ranks second by a small margin.
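To make the idea of a globally composed configuration representation concrete, the sketch below is a purely illustrative toy (not the paper's architecture or code): it recursively composes sub-representations of words, trees, the stack, and the buffer into a single configuration vector, so that every element can influence the next decision. All names, dimensions, and parameter matrices (DIM, W_tree, W_seq, compose, embed_word, config_vector) are invented for the example.

# Illustrative sketch only; hypothetical names, not the authors' implementation.
import numpy as np

DIM = 8
rng = np.random.default_rng(0)

# Hypothetical parameters: compose two child vectors into a parent vector,
# and fold a sequence (stack or buffer) into a single summary vector.
W_tree = rng.standard_normal((DIM, 2 * DIM)) * 0.1
W_seq = rng.standard_normal((DIM, 2 * DIM)) * 0.1

def embed_word(word):
    """Deterministic toy word embedding (stands in for learned embeddings)."""
    state = np.random.default_rng(abs(hash(word)) % (2**32))
    return state.standard_normal(DIM)

def compose(left, right, W):
    """Recursive composition: merge two sub-representations into one vector."""
    return np.tanh(W @ np.concatenate([left, right]))

def tree_vector(tree):
    """A tree is either a word (leaf) or a tuple of subtrees."""
    if isinstance(tree, str):
        return embed_word(tree)
    vec = tree_vector(tree[0])
    for child in tree[1:]:
        vec = compose(vec, tree_vector(child), W_tree)
    return vec

def sequence_vector(items):
    """Fold a sequence of vectors (stack or buffer contents) into one vector."""
    vec = np.zeros(DIM)
    for item in items:
        vec = compose(vec, item, W_seq)
    return vec

def config_vector(stack, buffer):
    """Global configuration representation: every stack tree and buffer word
    contributes, so even remote parts can affect the parsing decision."""
    stack_vec = sequence_vector(tree_vector(t) for t in stack)
    buffer_vec = sequence_vector(embed_word(w) for w in buffer)
    return compose(stack_vec, buffer_vec, W_seq)

# Example: two partial trees on the stack, three words left in the buffer.
stack = [("saw", "her"), "yesterday"]
buffer = ["with", "a", "telescope"]
print(config_vector(stack, buffer).shape)  # (8,)

A real parser would learn these parameters and feed the configuration vector to a classifier over transitions (shift, swap, reduce, label); the sketch only shows the compositional structure the abstract describes.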
Original language: English
Title of host publication: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Place of Publication: Copenhagen, Denmark
Publisher: Association for Computational Linguistics (ACL)
Pages: 1666-1676
Number of pages: 11
DOIs
Publication status: Published - 11 Sept 2017
Event: EMNLP 2017: Conference on Empirical Methods in Natural Language Processing - Copenhagen, Denmark
Duration: 7 Sept 2017 - 11 Sept 2017
http://emnlp2017.net/index.html
http://emnlp2017.net/

Conference

Conference: EMNLP 2017: Conference on Empirical Methods in Natural Language Processing
Abbreviated title: EMNLP 2017
Country/Territory: Denmark
City: Copenhagen
Period: 7/09/17 - 11/09/17
Internet address
