Edinburgh Research Explorer

Jointly Extracting and Compressing Documents with Summary State Representations

Research output: Chapter in Book/Report/Conference proceeding (Conference contribution)

Open Access permissions: Open

Documents

https://aclweb.org/anthology/papers/N/N19/N19-1397/
Original language: English
Title of host publication: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics
Editors: Jill Burstein, Christy Doran, Thamar Solorio
Place of Publication: Minneapolis, Minnesota
Publisher: Association for Computational Linguistics (ACL)
Pages: 3955–3966
Number of pages: 12
Volume: 1
Publication status: Published - 7 Jun 2019
Event: 2019 Annual Conference of the North American Chapter of the Association for Computational Linguistics - Minneapolis, United States
Duration: 2 Jun 2019 – 7 Jun 2019
https://naacl2019.org/

Conference

Conference: 2019 Annual Conference of the North American Chapter of the Association for Computational Linguistics
Abbreviated title: NAACL-HLT 2019
Country: United States
City: Minneapolis
Period: 2/06/19 – 7/06/19
Internet address: https://naacl2019.org/

Abstract

We present a new neural model for text summarization that first extracts sentences from a document and then compresses them. The proposed model offers a balance that sidesteps the difficulties in abstractive methods while generating more concise summaries than extractive methods. In addition, our model dynamically determines the length of the output summary based on the gold summaries it observes during training, and does not require the length constraints typical of extractive summarization. The model achieves state-of-the-art results on the CNN/DailyMail and Newsroom datasets, improving over current extractive and abstractive methods. Human evaluations demonstrate that our model generates concise and informative summaries. We also make available a new dataset of oracle compressive summaries derived automatically from the CNN/DailyMail reference summaries. [1]
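
To make the extract-then-compress pipeline in the abstract concrete, here is a minimal, self-contained sketch of the general idea: repeatedly extract a sentence, compress it word by word, update a running summary state, and stop dynamically when no sentence scores highly enough. This is not the authors' implementation (their released code is linked in footnote [1] below); the SummaryState class, the two placeholder scorers, and the STOP_THRESHOLD constant are hypothetical stand-ins for the learned components described in the paper.

    from dataclasses import dataclass, field

    STOP_THRESHOLD = 0.5  # hypothetical stopping score; gives the summary a dynamic length

    @dataclass
    class SummaryState:
        """Running record of what has already been added to the summary."""
        selected_words: list = field(default_factory=list)

        def contains(self, word: str) -> bool:
            return word.lower() in (w.lower() for w in self.selected_words)

    def score_sentence(sentence: list, state: SummaryState) -> float:
        """Placeholder extraction scorer: prefer sentences with many novel words."""
        if not sentence:
            return 0.0
        novel = sum(1 for w in sentence if not state.contains(w))
        return novel / len(sentence)

    def keep_word(word: str, state: SummaryState) -> bool:
        """Placeholder compression decision: drop short or already-used tokens."""
        return len(word) > 2 and not state.contains(word)

    def summarize(document: list) -> str:
        """Extract the best-scoring sentence and compress it word by word,
        stopping once no remaining sentence scores above STOP_THRESHOLD."""
        state = SummaryState()
        remaining = list(document)
        summary_sentences = []
        while remaining:
            best_score, best_sentence = max(
                ((score_sentence(s, state), s) for s in remaining), key=lambda x: x[0]
            )
            if best_score < STOP_THRESHOLD:  # dynamic length: no fixed sentence budget
                break
            remaining.remove(best_sentence)
            kept = [w for w in best_sentence if keep_word(w, state)]
            state.selected_words.extend(kept)
            if kept:
                summary_sentences.append(" ".join(kept))
        return ". ".join(summary_sentences)

    if __name__ == "__main__":
        document = [
            "the proposed model first extracts sentences and then compresses them".split(),
            "a summary state representation is updated as the summary grows".split(),
            "results are reported on the CNN DailyMail and Newsroom datasets".split(),
        ]
        print(summarize(document))

Running the script prints a compressed selection of the toy document's sentences; in the actual model the two placeholder scorers would be neural decisions conditioned on the summary state representation.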

[1] Our dataset and code are available at https://github.com/Priberam/exconsumm
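
The abstract also mentions oracle compressive summaries derived automatically from the reference summaries. The exact construction is defined in the paper and the released dataset; purely as an illustration of the general idea, the toy sketch below greedily deletes a word whenever the deletion does not lower unigram overlap with the reference. The function names (unigram_f1, oracle_compression) and the greedy deletion rule are assumptions for illustration, not the released procedure.

    def unigram_f1(candidate: set, reference: set) -> float:
        """Unigram F1 between two bags of words (sets here, for brevity)."""
        if not candidate or not reference:
            return 0.0
        overlap = len(candidate & reference)
        if overlap == 0:
            return 0.0
        precision = overlap / len(candidate)
        recall = overlap / len(reference)
        return 2 * precision * recall / (precision + recall)

    def oracle_compression(sentence: list, reference_words: set) -> list:
        """Greedily drop each word whose removal does not lower unigram F1
        against the reference summary."""
        kept = list(sentence)
        for word in list(sentence):
            trial = [w for w in kept if w != word]
            if unigram_f1(set(trial), reference_words) >= unigram_f1(set(kept), reference_words):
                kept = trial
        return kept

    if __name__ == "__main__":
        reference = set("model extracts and compresses sentences".split())
        sentence = "the proposed model first extracts and then compresses sentences".split()
        print(" ".join(oracle_compression(sentence, reference)))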


ID: 85694561