A Verb Lexicon Model with Source-side Syntactic Context for String-to-Tree Machine Translation

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

String-to-tree MT systems translate verbs without lexical or syntactic context on the source side and with only limited target-side context. This lack of context is one reason why verb translation recall is as low as 45.5%.
We propose a verb lexicon model trained with a feedforward neural network that predicts the target verb conditioned on a wide source-side context. We show that a syntactic context extracted from the dependency parse of the source sentence improves the model’s accuracy by 1.5% over a baseline trained on a window context.
When used as an extra feature for re-ranking the n-best list produced by the string-to-tree MT system, the verb lexicon model improves verb translation recall by more than 7%.
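As a rough illustration of the kind of model the abstract describes, the sketch below shows a feed-forward classifier that embeds source-side context tokens (window words plus dependency-based neighbours such as the verb's head, subject, and object) and scores candidate target verbs. It is a minimal hypothetical example in PyTorch, not the authors' implementation; all class names, dimensions, and the context layout are assumptions.

```python
# Hypothetical sketch of a feed-forward verb lexicon model (not the paper's code).
import torch
import torch.nn as nn

class VerbLexiconModel(nn.Module):
    def __init__(self, src_vocab_size, tgt_verb_vocab_size,
                 context_size, emb_dim=128, hidden_dim=512):
        super().__init__()
        # One shared embedding table for all source-side context positions.
        self.embed = nn.Embedding(src_vocab_size, emb_dim)
        self.hidden = nn.Linear(context_size * emb_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, tgt_verb_vocab_size)

    def forward(self, context_ids):
        # context_ids: (batch, context_size) word ids for the source verb's
        # window and its syntactic neighbours from the dependency parse.
        emb = self.embed(context_ids).view(context_ids.size(0), -1)
        h = torch.tanh(self.hidden(emb))
        return self.out(h)  # unnormalised scores over target verbs

# Usage: score target verbs for one source verb with a 7-token context,
# then use the log-probability of each hypothesis' verb as a re-ranking feature.
model = VerbLexiconModel(src_vocab_size=50_000, tgt_verb_vocab_size=20_000,
                         context_size=7)
scores = model(torch.randint(0, 50_000, (1, 7)))
log_probs = torch.log_softmax(scores, dim=-1)
```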
Original language: English
Title of host publication: Proceedings of the International Workshop on Spoken Language Translation (IWSLT)
Number of pages: 9
Publication status: Published - 1 Dec 2016
Event: 13th International Workshop on Spoken Language Translation 2016 - Seattle, United States
Duration: 8 Dec 2016 - 9 Dec 2016
https://workshop2016.iwslt.org/

Conference

Conference: 13th International Workshop on Spoken Language Translation 2016
Abbreviated title: IWSLT 2016
Country/Territory: United States
City: Seattle
Period: 8/12/16 - 9/12/16
Internet address: https://workshop2016.iwslt.org/
