A Latent Morphology Model for Open-Vocabulary Neural Machine Translation

Duygu Ataman, Wilker Aziz, Alexandra Birch

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

Translation into morphologically-rich languages challenges neural machine translation (NMT) models with extremely sparse vocabularies, where atomic treatment of surface forms is unrealistic. This problem is typically addressed either by pre-processing words into subword units or by performing translation directly at the level of characters. The former relies on word segmentation algorithms optimized using corpus-level statistics with no regard to the translation task. The latter learns directly from translation data but requires rather deep architectures. In this paper, we propose to translate words by modeling word formation through a hierarchical latent variable model which mimics the process of morphological inflection. Our model generates words one character at a time by composing two latent representations: a continuous one, aimed at capturing the lexical semantics, and a set of (approximately) discrete features, aimed at capturing the morphosyntactic function, which are shared among different surface forms. Our model achieves better accuracy in translation into three morphologically-rich languages than conventional open-vocabulary NMT methods, while also demonstrating a better generalization capacity under low- to mid-resource settings.
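The generative idea in the abstract — compose a continuous lexical vector with approximately discrete morphosyntactic features, then decode characters from the combined latent state — can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: all dimensions are made up, and the use of Gumbel-softmax to relax the discrete features is an illustrative assumption (the paper only says the features are "(approximately) discrete").

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax(logits, tau=0.5):
    """Approximately discrete sample: add Gumbel noise, then a low-temperature softmax."""
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))  # Gumbel(0, 1) noise
    y = (logits + g) / tau
    y = y - y.max(axis=-1, keepdims=True)                 # numerical stability
    e = np.exp(y)
    return e / e.sum(axis=-1, keepdims=True)              # rows near one-hot

# Hypothetical sizes, chosen for illustration only.
D_LEX, N_FEAT, K, VOCAB = 8, 4, 3, 30   # lexical dim, #features, classes/feature, #chars

lexeme = rng.normal(size=D_LEX)                    # continuous latent: lexical semantics
feat_logits = rng.normal(size=(N_FEAT, K))         # scores per morphosyntactic feature
features = gumbel_softmax(feat_logits).reshape(-1) # ~one-hot morphosyntactic features

z = np.concatenate([lexeme, features])             # composed latent state for one word
W = rng.normal(size=(VOCAB, z.size)) * 0.1         # toy character-decoder projection
scores = W @ z
p_char = np.exp(scores - scores.max())
p_char /= p_char.sum()                             # distribution over the next character
```

In the actual model a recurrent character decoder would consume `z` at every time step; the single softmax here just shows how the two latent representations jointly condition character generation.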
Original language: English
Title of host publication: Proceedings of the International Conference on Learning Representations 2020
Pages: 1-15
Number of pages: 15
Publication status: Published - 30 Apr 2020
Event: Eighth International Conference on Learning Representations - Millennium Hall, virtual conference (formerly Addis Ababa, Ethiopia)
Duration: 26 Apr 2020 – 30 Apr 2020
https://iclr.cc/Conferences/2020

Conference

Conference: Eighth International Conference on Learning Representations
Abbreviated title: ICLR 2020
Country/Territory: Ethiopia
City: Virtual conference, formerly Addis Ababa
Period: 26/04/20 – 30/04/20
Internet address: https://iclr.cc/Conferences/2020
