Sentence Compression Beyond Word Deletion

Trevor Cohn, Mirella Lapata

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper we generalise the sentence compression task. Rather than simply shorten a sentence by deleting words or constituents, as in previous work, we rewrite it using additional operations such as substitution, reordering, and insertion. We present a new corpus that is suited to our task and a discriminative tree-to-tree transduction model that can naturally account for structural and lexical mismatches. The model incorporates a novel grammar extraction method, uses a language model for coherent output, and can be easily tuned to a wide range of compression-specific loss functions.
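The contrast drawn in the abstract can be sketched in a toy form: deletion-only compression can only drop words, while the richer operation set (substitution, reordering, insertion) can also paraphrase. This is an illustrative sketch only, not the paper's model, which is a discriminative tree-to-tree transducer operating over syntax trees; the function names and the flat token-level operations here are hypothetical simplifications.

```python
# Toy contrast between deletion-only compression and compression by
# rewriting. Hypothetical sketch; the paper's actual model is a
# discriminative tree-to-tree transducer, not a token-edit system.

def compress_by_deletion(tokens, keep):
    """Deletion-only compression: keep a subset of tokens, in order."""
    return [t for i, t in enumerate(tokens) if i in keep]

def compress_by_rewriting(tokens, ops):
    """Apply a sequence of edit operations: delete, substitute,
    insert, or reorder (first-occurrence semantics throughout)."""
    out = list(tokens)
    for op, *args in ops:
        if op == "delete":
            out.remove(args[0])
        elif op == "substitute":
            old, new = args
            out[out.index(old)] = new
        elif op == "insert":
            pos, word = args
            out.insert(pos, word)
        elif op == "reorder":
            word, pos = args
            out.remove(word)
            out.insert(pos, word)
    return out

sentence = "officials said that the plan was rejected yesterday".split()

# Deletion can only remove words...
short = compress_by_deletion(sentence, {0, 1, 3, 4, 5, 6})
# ...while rewriting can also substitute, e.g. a pronoun for a noun.
rewritten = compress_by_rewriting(sentence, [
    ("delete", "that"),
    ("delete", "yesterday"),
    ("substitute", "officials", "They"),
])

print(" ".join(short))      # officials said the plan was rejected
print(" ".join(rewritten))  # They said the plan was rejected
```

The point of the sketch is that the rewriting variant's output need not be a subsequence of the input, which is exactly the generalisation the abstract describes.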
Original language: English
Title of host publication: Proceedings of the 22nd International Conference on Computational Linguistics (Coling 2008)
Publisher: Association for Computational Linguistics
Pages: 137-144
Number of pages: 8
Publication status: Published - 2008

