Abstract / Description of output
In this work, we explore multiple neural architectures adapted to the task of automatic post-editing of machine translation output. We focus on neural end-to-end models that combine both inputs mt (raw MT output) and src (source-language input) in a single neural architecture, modeling {mt, src} → pe directly. In addition, we investigate the influence of hard-attention models, which seem well suited to monolingual tasks, as well as combinations of both ideas. We report results on the data sets provided during the WMT-2016 shared task on automatic post-editing and demonstrate that dual-attention models, which incorporate all available data in the APE scenario in a single model, improve on the best shared-task system and on all other results published after the shared task. Dual-attention models combined with hard attention remain competitive despite applying fewer changes to the input.
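The core of the dual-attention idea described above is a decoder that attends separately over the src and mt encoder states and combines the two contexts. A minimal sketch of one such decoding step, using plain dot-product attention in pure Python (the function names and dimensions are illustrative, not the authors' implementation):

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of scores
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def attention(query, states):
    # dot-product attention: score each encoder state against the
    # decoder query, normalise, and return the weighted-sum context
    weights = softmax([dot(s, query) for s in states])
    context = [sum(w * s[i] for w, s in zip(weights, states))
               for i in range(len(query))]
    return context, weights

def dual_attention_step(dec_state, src_states, mt_states):
    # attend over src and mt encodings independently, then
    # concatenate the two contexts for the decoder's next prediction
    ctx_src, _ = attention(dec_state, src_states)
    ctx_mt, _ = attention(dec_state, mt_states)
    return ctx_src + ctx_mt

dec = [1.0, 0.0]                       # toy decoder hidden state
src = [[1.0, 0.0], [0.0, 1.0]]         # toy src encoder states
mt = [[0.5, 0.5]]                      # toy mt encoder states
ctx = dual_attention_step(dec, src, mt)
```

Hard attention, by contrast, would replace the soft weighted sum with a (near-)discrete choice of a single encoder state, which is what lets those models copy input tokens largely unchanged.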
Original language | English
---|---
Title of host publication | The 8th International Joint Conference on Natural Language Processing (IJCNLP 2017)
Publisher | Asian Federation of Natural Language Processing
Pages | 120-129
Number of pages | 10
Volume | 1
Publication status | Published - 1 Dec 2017
Event | The 8th International Joint Conference on Natural Language Processing, Taipei, Taiwan, Province of China, 27 Nov 2017 → 1 Dec 2017, http://ijcnlp2017.org/site/page.aspx?pid=901&sid=1133&lang=en
Conference
Conference | The 8th International Joint Conference on Natural Language Processing
---|---
Abbreviated title | IJCNLP 2017
Country/Territory | Taiwan, Province of China
City | Taipei
Period | 27/11/17 → 1/12/17
Internet address | http://ijcnlp2017.org/site/page.aspx?pid=901&sid=1133&lang=en
Fingerprint
Dive into the research topics of 'An Exploration of Neural Sequence-to-Sequence Architectures for Automatic Post-Editing'.

Projects (2 finished)
- Translation for Massive Open Online Courses - TraMooc
  Koehn, P. & Birch-Mayne, A.
  1/02/15 → 31/01/18
  Project: Research