Abstract
The quality of statistical machine translation performed with phrase-based approaches can be increased by permuting the words in the source sentences into an order that resembles that of the target language. We propose a class of recurrent neural models which exploit source-side dependency syntax features to reorder the words into a target-like order. We evaluate these models on the German-to-English language pair, showing significant improvements over a phrase-based Moses baseline and obtaining quality similar or superior to that of hand-coded syntactic reordering rules.
| Original language | English |
|---|---|
| Title of host publication | Proceedings of SSST-9, Ninth Workshop on Syntax, Semantics and Structure in Statistical Translation |
| Publisher | Association for Computational Linguistics (ACL) |
| Pages | 10-20 |
| Number of pages | 11 |
| ISBN (Print) | 978-1-941643-41-9 |
| Publication status | Published - Jun 2015 |