Context and Copying in Neural Machine Translation

Rebecca Knowles, Philipp Koehn

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Neural machine translation systems with subword vocabularies are capable of translating or copying unknown words. In this work, we show that they learn to copy words based both on the context in which the words appear and on features of the words themselves. In contexts that are particularly copy-prone, they even copy words that they have already learned they should translate. We examine the influence of context and subword features on this and other types of copying behavior.
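The copying behavior described above rests on subword segmentation: a word never seen in training can still be emitted verbatim if its pieces belong to a subword vocabulary shared by source and target. A minimal sketch, assuming a toy greedy longest-match segmenter (illustrative only, not the paper's actual preprocessing; the `@@` continuation marker follows the subword-nmt convention):

```python
def segment(word, vocab):
    """Split `word` into subword units drawn from `vocab`, longest match
    first, falling back to single characters. All but the final unit get
    the '@@' word-internal marker used by subword-nmt."""
    pieces = []
    i = 0
    while i < len(word):
        # Try the longest candidate substring still in the vocabulary.
        for j in range(len(word), i, -1):
            if word[i:j] in vocab or j - i == 1:
                pieces.append(word[i:j])
                i = j
                break
    return [p + "@@" for p in pieces[:-1]] + [pieces[-1]]

# A small hypothetical subword vocabulary shared by source and target.
vocab = {"Know", "les", "trans", "lation"}

# The surname "Knowles" was never seen as a whole word, but its subwords
# are in-vocabulary, so a decoder can reproduce (copy) it piece by piece.
print(segment("Knowles", vocab))  # → ['Know@@', 'les']
```

Because the same subword units exist on both sides, copying a name and translating a regular word use the identical output mechanism, which is why context can tip the model toward one or the other.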
Original language: English
Title of host publication: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Place of publication: Brussels, Belgium
Publisher: Association for Computational Linguistics
Pages: 3034-3041
Number of pages: 8
Publication status: Published - 31 Oct 2018
Event: 2018 Conference on Empirical Methods in Natural Language Processing - Square Meeting Center, Brussels, Belgium
Duration: 31 Oct 2018 - 4 Nov 2018
Internet address: http://emnlp2018.org/

Conference

Conference: 2018 Conference on Empirical Methods in Natural Language Processing
Abbreviated title: EMNLP 2018
Country: Belgium
City: Brussels
Period: 31/10/18 - 4/11/18
Internet address: http://emnlp2018.org/
