Single Document Summarization as Tree Induction

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

In this paper we conceptualize single-document extractive summarization as a tree induction problem. In contrast to previous approaches (Marcu, 1999; Yoshida et al., 2014), which relied on linguistically motivated document representations to generate summaries, our model induces a multi-root dependency tree while predicting the output summary. Each root node in the tree is a summary sentence, and the subtrees attached to it are sentences whose content relates to or explains the summary sentence. We design a new iterative refinement algorithm: it induces trees by repeatedly refining the structures predicted by previous iterations. We demonstrate experimentally on two benchmark datasets that our summarizer [1] performs competitively against state-of-the-art methods.
[1] Our code is publicly available at https://github.com/nlpyang/SUMO.
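
The abstract describes the model only at a high level. As a rough illustration of the idea, the sketch below (plain NumPy) scores each sentence both as a potential root (summary sentence) and as a dependent of other sentences, refines the sentence representations over a few iterations using the induced soft tree, and finally reads off the highest-scoring roots as the summary. The function names, the randomly initialised scoring matrices, and the simple softmax attachment are illustrative assumptions, not the authors' implementation; see the linked SUMO repository for the actual code.

# Hypothetical sketch of multi-root tree induction with iterative refinement.
# Not the SUMO implementation; scoring and refinement steps are assumptions.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def induce_summary_tree(sent_vecs, num_iters=3, num_summary_sents=3):
    """sent_vecs: (n, d) array of sentence representations from some encoder."""
    n, d = sent_vecs.shape
    rng = np.random.default_rng(0)
    W_root = rng.normal(scale=0.1, size=(d,))    # scores a sentence as a root (summary sentence)
    W_arc = rng.normal(scale=0.1, size=(d, d))   # scores head -> dependent attachments

    h = sent_vecs.copy()
    for _ in range(num_iters):
        root_scores = h @ W_root                 # (n,): how "summary-like" each sentence is
        arc_scores = h @ W_arc @ h.T             # (n, n): arc_scores[i, j] = score of j attaching to head i
        np.fill_diagonal(arc_scores, -1e9)       # forbid self-attachment

        # Each sentence attaches softly either to the virtual root or to another sentence.
        attach = softmax(np.vstack([root_scores, arc_scores]), axis=0)  # (n+1, n)
        root_prob, head_prob = attach[0], attach[1:]                    # (n,), (n, n)

        # Refinement step (an assumption): mix each sentence's vector with the expected
        # content of its soft children, so the next iteration sees structure-aware vectors.
        children_context = head_prob @ h         # (n, d)
        h = 0.5 * h + 0.5 * children_context

    # Sentences with the highest root probability become the summary.
    summary_idx = np.argsort(-root_prob)[:num_summary_sents]
    return sorted(summary_idx.tolist()), root_prob

if __name__ == "__main__":
    fake_doc = np.random.default_rng(1).normal(size=(10, 16))  # 10 sentences, toy embeddings
    summary, scores = induce_summary_tree(fake_doc)
    print("summary sentence indices:", summary)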
Original language: English
Title of host publication: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics
Editors: Jill Burstein, Christy Doran, Thamar Solorio
Place of Publication: Minneapolis, Minnesota
Publisher: Association for Computational Linguistics (ACL)
Pages: 1745–1755
Number of pages: 11
Volume: 1
DOIs
Publication status: Published - 7 Jun 2019
Event: 2019 Annual Conference of the North American Chapter of the Association for Computational Linguistics - Minneapolis, United States
Duration: 2 Jun 2019 – 7 Jun 2019
https://naacl2019.org/

Conference

Conference: 2019 Annual Conference of the North American Chapter of the Association for Computational Linguistics
Abbreviated title: NAACL-HLT 2019
Country/Territory: United States
City: Minneapolis
Period: 2/06/19 – 7/06/19
Internet address: https://naacl2019.org/
