Non-projective Dependency-based Pre-Reordering with Recurrent Neural Network for Machine Translation

Antonio Miceli Barone, Giuseppe Attardi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


The quality of statistical machine translation performed with phrase-based approaches can be increased by permuting the words in the source sentences into an order that resembles that of the target language. We propose a class of recurrent neural models which exploit source-side dependency syntax features to reorder the words into a target-like order. We evaluate these models on the German-to-English language pair, showing significant improvements over a phrase-based Moses baseline and obtaining quality similar or superior to that of hand-coded syntactic reordering rules.
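The pre-reordering idea described in the abstract can be illustrated with a toy sketch: given a predicted target-like position for each source token (here hard-coded; in the paper these come from a recurrent neural model over dependency features), the source sentence is permuted before translation. The `preorder` function and the example positions below are hypothetical illustrations, not the paper's actual model.

```python
# Toy illustration of source-side pre-reordering (not the paper's model):
# permute the source tokens by their predicted target-like positions
# before handing the sentence to the phrase-based translation system.

def preorder(tokens, predicted_positions):
    """Sort source tokens by their predicted target-like positions."""
    order = sorted(range(len(tokens)), key=lambda i: predicted_positions[i])
    return [tokens[i] for i in order]

# German subordinate clause "dass er das Buch gelesen hat"
# (literally "that he the book read has"), reordered toward
# English-like word order "that he has read the book".
tokens = ["dass", "er", "das", "Buch", "gelesen", "hat"]
positions = [0, 1, 4, 5, 3, 2]  # hypothetical model outputs
print(preorder(tokens, positions))
# → ['dass', 'er', 'hat', 'gelesen', 'das', 'Buch']
```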
Original language: English
Title of host publication: Proceedings of SSST-9, Ninth Workshop on Syntax, Semantics and Structure in Statistical Translation
Publisher: Association for Computational Linguistics (ACL)
Number of pages: 11
ISBN (Print): 978-1-941643-41-9
Publication status: Published - Jun 2015

