Deep Graph Convolutional Encoders for Structured Data to Text Generation

Diego Marcheggiani, Laura Perez-Beltrachini

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Most previous work on neural text generation from graph-structured data relies on standard sequence-to-sequence methods. These approaches linearise the input graph so it can be fed to a recurrent neural network. In this paper, we propose an alternative encoder based on graph convolutional networks that directly exploits the input structure. We report results on two graph-to-sequence datasets that empirically show the benefits of explicitly encoding the input graph structure.
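The core idea in the abstract — encoding nodes by aggregating over graph neighbours rather than over a linearised sequence — can be illustrated with a minimal sketch. This is an assumption-laden toy (a basic mean-aggregation GCN layer with self-loops), not the paper's exact model, which may use edge-label-specific weights, gating, or other refinements:

```python
import numpy as np

def gcn_layer(adj, node_feats, weight):
    """One graph convolutional layer (illustrative sketch): each node
    averages its neighbours' features (plus its own, via self-loops),
    applies a shared linear transform, then a ReLU non-linearity."""
    n = adj.shape[0]
    a_hat = adj + np.eye(n)                  # add self-loops
    deg = a_hat.sum(axis=1, keepdims=True)   # per-node degree for mean
    h = (a_hat / deg) @ node_feats @ weight  # normalised neighbourhood aggregation
    return np.maximum(h, 0.0)                # ReLU

# Toy input graph: 3 nodes, edges 0-1 and 1-2, 4-dimensional node features.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
feats = np.ones((3, 4))
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 8))

hidden = gcn_layer(adj, feats, w)
print(hidden.shape)  # one layer maps (3, 4) node features to (3, 8)
```

Stacking several such layers yields a "deep" graph encoder whose node representations can then condition a sequence decoder, in place of the linearise-then-RNN pipeline the abstract argues against.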
Original language: English
Title of host publication: Proceedings of the 11th International Conference on Natural Language Generation
Place of Publication: Tilburg University, The Netherlands
Publisher: Association for Computational Linguistics
Pages: 1-9
Number of pages: 9
Publication status: Published - Nov 2018
Event: 11th International Conference on Natural Language Generation - Tilburg, Netherlands
Duration: 5 Nov 2018 - 8 Nov 2018
https://inlg2018.uvt.nl/

Conference

Conference: 11th International Conference on Natural Language Generation
Abbreviated title: INLG 2018
Country/Territory: Netherlands
City: Tilburg
Period: 5/11/18 - 8/11/18
