Widening the Representation Bottleneck in Neural Machine Translation with Lexical Shortcuts

Denis Emelin, Ivan Titov, Rico Sennrich

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

The transformer is a state-of-the-art neural translation model that uses attention to iteratively refine lexical representations with information drawn from the surrounding context. Lexical features are fed into the first layer and propagated through a deep network of hidden layers. We argue that the need to represent and propagate lexical features in each layer limits the model’s capacity for learning and representing other information relevant to the task. To alleviate this bottleneck, we introduce gated shortcut connections between the embedding layer and each subsequent layer within the encoder and decoder.

This enables the model to access relevant lexical content dynamically, without expending limited resources on storing it within intermediate states. We show that the proposed modification yields consistent improvements over a baseline transformer on standard WMT translation tasks in 5 translation directions (0.9 BLEU on average) and reduces the amount of lexical information passed along the hidden layers. We furthermore evaluate different ways of integrating lexical connections into the transformer architecture and present ablation experiments exploring the effect of the proposed shortcuts on model behavior.[1]

[1] Our code is publicly available to aid the reproduction of the reported results: https://github.com/demelin/transformer_lexical_shortcuts
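As a rough illustration of the mechanism the abstract describes, the sketch below shows one way a gated shortcut could combine the embedding layer's output with a deeper layer's hidden states. This is a minimal sketch, assuming a sigmoid interpolation gate; the class and parameter names are hypothetical, and the paper's exact formulation (the released TensorFlow implementation in the linked repository is authoritative) may differ, e.g. in applying the shortcut to the keys and values of each layer's self-attention.

```python
import torch
import torch.nn as nn


class LexicalShortcutGate(nn.Module):
    """Gated shortcut between the embedding layer and a deeper layer.

    Hypothetical sketch: the gating equation is an assumption, not the
    paper's verified formulation.
    """

    def __init__(self, d_model: int):
        super().__init__()
        # One projection for the embeddings, one for the hidden states.
        self.proj_emb = nn.Linear(d_model, d_model, bias=False)
        self.proj_hid = nn.Linear(d_model, d_model, bias=False)

    def forward(self, emb: torch.Tensor, hid: torch.Tensor) -> torch.Tensor:
        # Per-dimension gate in (0, 1): how much lexical content to re-inject.
        gate = torch.sigmoid(self.proj_emb(emb) + self.proj_hid(hid))
        # Convex combination of lexical features and the current hidden state,
        # so a layer can retrieve word-level information on demand rather than
        # carrying it through every intermediate representation.
        return gate * emb + (1.0 - gate) * hid


# Toy usage: batch of 2 sentences, 5 tokens each, model width 8.
emb = torch.randn(2, 5, 8)   # embedding-layer output
hid = torch.randn(2, 5, 8)   # some layer's hidden states
out = LexicalShortcutGate(8)(emb, hid)
print(out.shape)             # torch.Size([2, 5, 8])
```

Because the gate is computed per dimension, the model can pass lexical features through some dimensions while preserving contextual information in others, which is what allows it to avoid dedicating hidden-state capacity to storing word identity.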
Original language: English
Title of host publication: Proceedings of the Fourth Conference on Machine Translation (WMT), Volume 1
Subtitle of host publication: Research Papers
Place of Publication: Florence, Italy
Publisher: Association for Computational Linguistics (ACL)
Pages: 102–115
Number of pages: 14
Volume: 1
ISBN (Print): 978-1-950737-27-7
DOIs
Publication status: Published - 2 Aug 2019
Event: ACL 2019 Fourth Conference on Machine Translation - Florence, Italy
Duration: 1 Aug 2019 – 2 Aug 2019
http://www.statmt.org/wmt19/

Conference

Conference: ACL 2019 Fourth Conference on Machine Translation
Abbreviated title: WMT19
Country/Territory: Italy
City: Florence
Period: 1/08/19 – 2/08/19
Internet address: http://www.statmt.org/wmt19/
