Edinburgh Research Explorer

On the Importance of Word Boundaries in Character-level Neural Machine Translation

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Open Access permissions: Open

Documents

Final published version, 964 KB, PDF document

Licence: Creative Commons: Attribution (CC-BY)

https://www.aclweb.org/anthology/D19-5619
Original language: English
Title of host publication: Proceedings of the 3rd Workshop on Neural Generation and Translation
Place of Publication: Hong Kong
Publisher: Association for Computational Linguistics
Pages: 187-193
Number of pages: 7
ISBN (Electronic): 978-1-950737-83-3
DOIs
Publication status: Published - 4 Nov 2019
Event: The 3rd Workshop on Neural Generation and Translation: at EMNLP-IJCNLP 2019 - Hong Kong, Hong Kong
Duration: 4 Nov 2019 → 4 Nov 2019
https://sites.google.com/view/wngt19/home

Workshop

Workshop: The 3rd Workshop on Neural Generation and Translation
Abbreviated title: WNGT 2019
Country: Hong Kong
City: Hong Kong
Period: 4/11/19 → 4/11/19
Internet address

Abstract

Neural Machine Translation (NMT) models generally perform translation using a fixed-size lexical vocabulary, which is an important bottleneck to their generalization capability and overall translation quality. The standard approach to overcoming this limitation is to segment words into subword units, typically using external tools with arbitrary heuristics, resulting in vocabulary units that are not optimized for the translation task. Recent studies have shown that the same approach can be extended to perform NMT directly at the level of characters, which can deliver translation accuracy on par with subword-based models; on the other hand, this requires relatively deeper networks. In this paper, we propose a more computationally efficient solution for character-level NMT which implements a hierarchical decoding architecture where translations are generated first at the level of words and then at the level of characters. We evaluate different methods for open-vocabulary NMT on machine translation from English into five languages with distinct morphological typologies, and show that the hierarchical decoding model can reach higher translation accuracy than the subword-level NMT model using significantly fewer parameters, while demonstrating better capacity for learning longer-distance contextual and grammatical dependencies than the standard character-level NMT model.
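The hierarchical decoding idea described in the abstract can be illustrated with a toy sketch: a word-level decoder advances one hidden state per target word, and a character-level decoder expands each word state into a character sequence terminated by an end-of-word symbol. All names, dimensions, and the simple tanh recurrences below are illustrative assumptions for exposition, not the authors' actual architecture (which would be a trained neural model with attention).

```python
import numpy as np

# Toy character inventory with an end-of-word marker (hypothetical).
CHAR_VOCAB = list("abcdefgh") + ["<eow>"]
EOW = CHAR_VOCAB.index("<eow>")

rng = np.random.default_rng(0)
H = 16  # hidden size (illustrative)

# Randomly initialised toy parameters; a real model would learn these.
W_word = rng.standard_normal((H, H)) * 0.1            # word-level recurrence
W_char = rng.standard_normal((H, H)) * 0.1            # char-level recurrence
E_char = rng.standard_normal((len(CHAR_VOCAB), H)) * 0.1  # char embeddings
W_out = rng.standard_normal((H, len(CHAR_VOCAB))) * 0.1   # char output layer

def decode_word(word_state, max_chars=10):
    """Character-level decoder: spell out one word from its word state."""
    h = word_state
    prev = np.zeros(H)  # stands in for a start-of-word embedding
    chars = []
    for _ in range(max_chars):
        h = np.tanh(h @ W_char + prev)
        logits = h @ W_out
        c = int(np.argmax(logits))  # greedy character choice
        if c == EOW:
            break
        chars.append(CHAR_VOCAB[c])
        prev = E_char[c]
    return "".join(chars)

def hierarchical_decode(n_words=3):
    """Word-level decoder: one state per word, each expanded to characters."""
    h = np.ones(H) * 0.1  # stand-in for an encoder context vector
    words = []
    for _ in range(n_words):
        h = np.tanh(h @ W_word)       # advance the word-level state
        words.append(decode_word(h))  # expand it character by character
    return " ".join(words)

print(hierarchical_decode())
```

The key design point the sketch mirrors is that the character-level loop never has to model inter-word dependencies: long-distance context lives in the word-level recurrence, which is what lets the hierarchical model stay shallower than a flat character-level decoder.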



ID: 131155788