Iterative Back-Translation for Neural Machine Translation

Cong Duy Vu Hoang, Philipp Koehn, Gholamreza Haffari, Trevor Cohn

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

We present iterative back-translation, a method for generating increasingly better synthetic parallel data from monolingual data to train neural machine translation systems. Our proposed method is very simple yet effective and highly applicable in practice. We demonstrate improvements in neural machine translation quality in both high- and low-resource scenarios, including the best reported BLEU scores for the WMT 2017 German↔English tasks.
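The core idea from the abstract can be sketched as a loop in which two translation models, one per direction, repeatedly back-translate monolingual data for each other and are retrained on the resulting synthetic pairs. The sketch below is a hypothetical illustration, not the authors' code: `train` and the returned "models" are toy table-lookup stand-ins for real NMT training and decoding.

```python
# Toy sketch of iterative back-translation. train() stands in for real NMT
# training: it memorises the given pairs and falls back to identity for
# unseen inputs. All names here are illustrative assumptions.

def train(parallel_pairs):
    table = dict(parallel_pairs)
    return lambda sentence: table.get(sentence, sentence)

def back_translate(model, monolingual):
    # Translate target-side monolingual sentences with the reverse-direction
    # model, yielding (synthetic source, real target) training pairs.
    return [(model(t), t) for t in monolingual]

def iterative_back_translation(bitext_fwd, mono_src, mono_tgt, rounds=2):
    # bitext_fwd: real (source, target) pairs; mono_src / mono_tgt:
    # monolingual sentences in the source / target language.
    bitext_bwd = [(t, s) for s, t in bitext_fwd]
    fwd = train(bitext_fwd)   # source -> target
    bwd = train(bitext_bwd)   # target -> source
    for _ in range(rounds):
        # Each direction is retrained on real bitext plus synthetic pairs
        # produced by the (progressively improving) opposite direction.
        synth_for_fwd = back_translate(bwd, mono_tgt)  # (synth src, real tgt)
        synth_for_bwd = back_translate(fwd, mono_src)  # (synth tgt, real src)
        fwd = train(bitext_fwd + synth_for_fwd)
        bwd = train(bitext_bwd + synth_for_bwd)
    return fwd, bwd
```

In a real system each `train` call would fine-tune or retrain an NMT model and `back_translate` would run beam-search decoding; the loop structure (alternating directions, regenerating synthetic data each round) is what the paper's title refers to.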
Original language: English
Title of host publication: Proceedings of the 2nd Workshop on Neural Machine Translation and Generation
Place of Publication: Melbourne, Australia
Publisher: Association for Computational Linguistics
Number of pages: 7
Publication status: Published - 20 Jul 2018
Event: 2nd Workshop on Neural Machine Translation and Generation - Melbourne, Australia
Duration: 15 Jul 2018 → 20 Jul 2018


Conference: 2nd Workshop on Neural Machine Translation and Generation
Abbreviated title: WNMT 2018


