Neural Machine Translation of Rare Words with Subword Units

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Neural machine translation (NMT) models typically operate with a fixed vocabulary, so the translation of rare and unknown words is an open problem. Previous work addresses this problem through back-off dictionaries. In this paper, we introduce a simpler and more effective approach, making the NMT model capable of open-vocabulary translation by encoding rare and unknown words as sequences of subword units, based on the intuition that various word classes are translatable via smaller units than words, for instance names (via character copying or transliteration), compounds (via compositional translation), and cognates and loanwords (via phonological and morphological transformations). We discuss the suitability of different word segmentation techniques, including simple character n-gram models and a segmentation based on the byte pair encoding compression algorithm, and empirically show that subword models improve over a back-off dictionary baseline for the WMT 15 translation tasks English→German and English→Russian by 1.1 and 1.3 BLEU, respectively.
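The byte pair encoding segmentation mentioned in the abstract iteratively merges the most frequent pair of adjacent symbols in a corpus vocabulary. A minimal sketch of the merge-learning loop is below; the toy vocabulary, the `</w>` end-of-word marker convention, and the merge count of 10 are illustrative assumptions, not the paper's released implementation:

```python
import re
from collections import Counter

def get_pair_stats(vocab):
    """Count frequencies of adjacent symbol pairs across the vocabulary."""
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for i in range(len(symbols) - 1):
            pairs[(symbols[i], symbols[i + 1])] += freq
    return pairs

def merge_pair(pair, vocab):
    """Replace every occurrence of the symbol pair with the merged symbol."""
    bigram = re.escape(' '.join(pair))
    pattern = re.compile(r'(?<!\S)' + bigram + r'(?!\S)')
    return {pattern.sub(''.join(pair), word): freq for word, freq in vocab.items()}

# Toy vocabulary: each word is a sequence of characters separated by spaces,
# with an end-of-word marker </w> (values are word frequencies).
vocab = {'l o w </w>': 5, 'l o w e r </w>': 2,
         'n e w e s t </w>': 6, 'w i d e s t </w>': 3}

num_merges = 10  # in practice this is a hyperparameter in the tens of thousands
merges = []
for _ in range(num_merges):
    stats = get_pair_stats(vocab)
    if not stats:
        break
    best = max(stats, key=stats.get)  # most frequent adjacent pair
    vocab = merge_pair(best, vocab)
    merges.append(best)
```

With this toy input, frequent subwords such as `est</w>` are learned early, and whole frequent words like `newest</w>` eventually become single symbols, while rarer words remain split into subword units.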
Original language: English
Title of host publication: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics
Place of publication: Berlin, Germany
Publisher: Association for Computational Linguistics (ACL)
Pages: 1715-1725
Number of pages: 11
Volume: 1: Long Papers
ISBN (Print): 978-1-945626-00-5
DOIs
Publication status: Published - 12 Aug 2016
Event: 54th Annual Meeting of the Association for Computational Linguistics - Berlin, Germany
Duration: 7 Aug 2016 - 12 Aug 2016
https://mirror.aclweb.org/acl2016/

Conference

Conference: 54th Annual Meeting of the Association for Computational Linguistics
Abbreviated title: ACL 2016
Country: Germany
City: Berlin
Period: 7/08/16 - 12/08/16
Internet address

