As Little as Possible, as Much as Necessary: Detecting Over- and Undertranslations with Contrastive Conditioning

Jannis Vamvas, Rico Sennrich

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

Omission and addition of content is a typical issue in neural machine translation. We propose a method for detecting such phenomena with off-the-shelf translation models. Using contrastive conditioning, we compare the likelihood of a full sequence under a translation model to the likelihood of its parts, given the corresponding source or target sequence. This allows us to pinpoint superfluous words in the translation and untranslated words in the source even in the absence of a reference translation. The accuracy of our method is comparable to a supervised method that requires a custom quality estimation model.
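The core comparison described above — scoring a full sequence against its parts under a translation model — can be illustrated with a toy sketch. Here `log_prob` is a stand-in for a real NMT model's sequence log-likelihood (implemented as a hypothetical unigram lexicon purely for illustration; the source/target pair and probabilities are invented, not from the paper):

```python
import math

# Hypothetical source/target pair: "very" is superfluous in the translation.
SOURCE = ["das", "haus"]
TARGET = ["the", "very", "house"]

# Toy lexical probabilities standing in for P(target word | source).
LEXICON = {
    "the": 0.4,
    "house": 0.5,
    "very": 0.01,  # unlikely given the source -> candidate overtranslation
}

def log_prob(source, target):
    """Stand-in for log P(target | source) under a translation model."""
    return sum(math.log(LEXICON.get(w, 1e-6)) for w in target)

def overtranslation_scores(source, target):
    """Compare the full sequence's likelihood with the likelihood of each
    partial sequence that omits one target word. A large likelihood gain
    from omitting a word flags it as potentially superfluous."""
    full = log_prob(source, target)
    scores = {}
    for i, word in enumerate(target):
        partial = target[:i] + target[i + 1:]
        scores[word] = log_prob(source, partial) - full
    return scores

scores = overtranslation_scores(SOURCE, TARGET)
flagged = max(scores, key=scores.get)
print(flagged)  # "very": its removal yields the largest likelihood gain
```

Undertranslation detection works symmetrically in the paper, scoring partial source sequences given the target; this sketch shows only the overtranslation direction.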
Original language: English
Title of host publication: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Editors: Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Publisher: Association for Computational Linguistics
Pages: 490-500
Number of pages: 11
Volume: 2
ISBN (Print): 978-1-955917-22-3
Publication status: Published - 16 May 2022
Event: 60th Annual Meeting of the Association for Computational Linguistics - The Convention Centre Dublin, Dublin, Ireland
Duration: 22 May 2022 - 27 May 2022
https://www.2022.aclweb.org

Conference

Conference: 60th Annual Meeting of the Association for Computational Linguistics
Abbreviated title: ACL 2022
Country/Territory: Ireland
City: Dublin
Period: 22/05/22 - 27/05/22
Internet address: https://www.2022.aclweb.org
