Toward Making the Most of Context in Neural Machine Translation

Zaixiang Zheng, Yue Xiang, Shujian Huang, Jiajun Chen, Alexandra Birch-Mayne

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Document-level machine translation outperforms sentence-level models by a small margin, but has not been widely adopted. We argue that previous research did not make clear use of the global context, and propose a new document-level NMT framework that deliberately models the local context of each sentence while remaining aware of the global context of the document in both source and target languages. We specifically design the model to handle documents containing any number of sentences, including single sentences. This unified approach allows our model to be trained elegantly on standard datasets without the need to train separately on sentence-level and document-level data. Experimental results demonstrate that our model outperforms Transformer baselines and previous document-level NMT models by substantial margins of up to 2.1 BLEU over state-of-the-art baselines. We also provide analyses showing that the benefit of context extends far beyond the two or three neighboring sentences that previous studies have typically incorporated.
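The abstract describes modeling each sentence's local context with awareness of a document-level global context. One common way to realize such a combination (a minimal sketch only, not the paper's exact architecture; the parameter names, the mean-pooled global context, and the gating scheme are all assumptions) is a gated fusion of local token states with a global document vector:

```python
import numpy as np

rng = np.random.default_rng(0)

d = 8  # toy hidden size
# Three sentences of a document, already encoded: one (tokens, d) matrix each.
sent_encodings = [rng.standard_normal((n, d)) for n in (5, 7, 3)]

# Hypothetical gate parameters (illustrative names, not from the paper).
W_l = rng.standard_normal((d, d))
W_g = rng.standard_normal((d, d))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Global context: here simply the mean over all token states in the document.
global_ctx = np.concatenate(sent_encodings, axis=0).mean(axis=0)

def fuse(local):
    """Gate each token's local state against the document-level context."""
    gate = sigmoid(local @ W_l + global_ctx @ W_g)   # per-dimension gate in (0, 1)
    return gate * local + (1.0 - gate) * global_ctx  # convex combination

fused = [fuse(h) for h in sent_encodings]
```

Note that a single-sentence "document" degenerates gracefully under this scheme: the global context is then derived from that sentence alone, which is consistent with the paper's claim of handling any number of sentences, including one.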
Original language: English
Title of host publication: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, IJCAI-20
Editors: Christian Bessiere
Publisher: International Joint Conferences on Artificial Intelligence Organization
Number of pages: 7
ISBN (Print): 978-0-9992411-6-5
Publication status: Published - 17 Jul 2020
Event: 29th International Joint Conference on Artificial Intelligence - Yokohama, Japan
Duration: 11 Jul 2020 - 17 Jul 2020
Conference number: 29


Conference: 29th International Joint Conference on Artificial Intelligence
Abbreviated title: IJCAI 2020
