Semantic Graph Parsing with Recurrent Neural Network DAG Grammars

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Semantic parses are directed acyclic graphs (DAGs), so semantic parsing should be modeled as graph prediction. But predicting graphs presents difficult technical challenges, so it is simpler and more common to predict the linearized graphs found in semantic parsing datasets using well-understood sequence models. The cost of this simplicity is that the predicted strings may not be well-formed graphs. We present recurrent neural network DAG grammars, a graph-aware sequence model that guarantees only well-formed graphs are produced while sidestepping many difficulties of graph prediction. We test our model on the Parallel Meaning Bank, a multilingual semantic graphbank. Our approach yields competitive results in English and establishes the first results for German, Italian and Dutch.
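As a minimal illustration of the failure mode the abstract describes (not code from the paper, and using a generic bracketed linearization rather than the Parallel Meaning Bank's actual format): an unconstrained sequence model emits tokens one at a time, and nothing prevents it from producing a string whose brackets do not balance, i.e. a string that decodes to no graph at all. A toy necessary-condition check:

```python
def is_wellformed(linearized: str) -> bool:
    """Toy check: a linearized graph string can only be well-formed
    if its parentheses balance (necessary, not sufficient)."""
    depth = 0
    for ch in linearized:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:  # a close with no matching open
                return False
    return depth == 0

# A plain sequence model may emit either string below;
# only the first balances and could decode to a graph.
print(is_wellformed("(want :ARG0 (boy) :ARG1 (go))"))  # True
print(is_wellformed("(want :ARG0 (boy :ARG1 (go))"))   # False
```

A grammar-constrained decoder, as in the paper's approach, rules out the second kind of string by construction instead of checking it after the fact.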
Original language: English
Title of host publication: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing
Publisher: Association for Computational Linguistics
Pages: 2769–2778
Number of pages: 13
ISBN (Print): 978-1-950737-90-1
Publication status: Published - 4 Nov 2019
Event: 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing - Hong Kong, Hong Kong
Duration: 3 Nov 2019 – 7 Nov 2019
https://www.emnlp-ijcnlp2019.org/

Conference

Conference: 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing
Abbreviated title: EMNLP-IJCNLP 2019
Country: Hong Kong
City: Hong Kong
Period: 3/11/19 – 7/11/19
Internet address: https://www.emnlp-ijcnlp2019.org/
