Edinburgh Neural Machine Translation Systems for WMT 16

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We participated in the WMT 2016 shared news translation task by building neural translation systems for four language pairs, each trained in both directions: English↔Czech, English↔German, English↔Romanian and English↔Russian. Our systems are based on an attentional encoder-decoder, using BPE subword segmentation for open-vocabulary translation with a fixed vocabulary. We experimented with using automatic back-translations of the monolingual News corpus as additional training data, pervasive dropout, and target-bidirectional models. All reported methods give substantial improvements, and we see improvements of 4.3–11.2 BLEU over our baseline systems. In the human evaluation, our systems were the (tied) best constrained system for 7 out of 8 translation directions in which we participated.
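The BPE subword segmentation mentioned in the abstract learns a fixed number of merge operations by repeatedly fusing the most frequent adjacent symbol pair in a word-frequency dictionary. Below is a minimal sketch of that merge-learning step, not the paper's actual implementation; the function names and the toy vocabulary are illustrative, and the `</w>` end-of-word marker follows the common BPE convention.

```python
from collections import Counter

def get_pair_counts(vocab):
    """Count adjacent symbol pairs, weighted by word frequency."""
    pairs = Counter()
    for symbols, freq in vocab.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(pair, vocab):
    """Replace every occurrence of `pair` with its concatenation."""
    merged = {}
    for symbols, freq in vocab.items():
        out, i = [], 0
        while i < len(symbols):
            if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged

def learn_bpe(word_freqs, num_merges):
    """Learn up to `num_merges` BPE merge operations from a word-frequency dict."""
    # Split each word into characters plus an end-of-word marker.
    vocab = {tuple(w) + ('</w>',): f for w, f in word_freqs.items()}
    merges = []
    for _ in range(num_merges):
        pairs = get_pair_counts(vocab)
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # most frequent adjacent pair
        vocab = merge_pair(best, vocab)
        merges.append(best)
    return merges

# Toy example: "es" is the most frequent pair, then "es"+"t".
merges = learn_bpe({"low": 5, "lower": 2, "newest": 6, "widest": 3}, 2)
```

At translation time, the learned merge list is applied in order to segment unseen words into subword units, which is what allows open-vocabulary translation with a fixed vocabulary.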
Original language: English
Title of host publication: Proceedings of the First Conference on Machine Translation, Volume 2: Shared Task Papers
Place of publication: Berlin, Germany
Publisher: Association for Computational Linguistics
Pages: 371-376
Number of pages: 6
Publication status: Published - 12 Aug 2016
Event: First Conference on Machine Translation - Berlin, Germany
Duration: 11 Aug 2016 - 12 Aug 2016
http://www.statmt.org/wmt16/

Conference

Conference: First Conference on Machine Translation
Abbreviated title: WMT16
Country/Territory: Germany
City: Berlin
Period: 11/08/16 - 12/08/16
