IITP-MT at WAT2018: Transformer-based Multilingual Indic-English Neural Machine Translation System

Sukanta Sen, Kamal Kumar Gupta, Asif Ekbal, Pushpak Bhattacharyya

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


This paper describes the systems submitted by the IITP-MT team to the WAT 2018 multilingual Indic languages shared task. We submit two multilingual neural machine translation (NMT) systems (Indic-to-English and English-to-Indic) based on the Transformer architecture, following the many-to-one and one-to-many approaches of Johnson et al. (2017). We also train separate bilingual models as baselines for all translation directions involving English. We evaluate the models using BLEU and find that a single multilingual NMT model outperforms the separate bilingual models (by up to 14.81 BLEU) when the target language is English. However, when English is the source language, the multilingual NMT model improves over separate bilingual models only for low-resource language pairs (by up to 11.60 BLEU) and degrades for relatively high-resource language pairs.
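The many-to-one and one-to-many setups of Johnson et al. (2017) mentioned in the abstract rely on a simple preprocessing step: an artificial token identifying the target language is prepended to each source sentence, so a single model can be trained on the concatenation of all language pairs. A minimal sketch of that step is below; the token format (`<2en>`, `<2hi>`, ...) is illustrative and not taken from the paper.

```python
# Sketch of Johnson et al. (2017)-style multilingual preprocessing:
# prepend an artificial target-language token to each source sentence
# so one NMT model can handle several translation directions.
# Token names (<2en>, <2hi>, ...) are assumptions for illustration.

def tag_source(sentence: str, target_lang: str) -> str:
    """Prepend a target-language token to a source-side sentence."""
    return f"<2{target_lang}> {sentence}"

# One-to-many (English-to-Indic): tag English sources with the Indic target.
print(tag_source("how are you ?", "hi"))   # <2hi> how are you ?

# Many-to-one (Indic-to-English): all Indic sources share the <2en> tag,
# and the tagged corpora are simply concatenated for training.
print(tag_source("aap kaise hain ?", "en"))  # <2en> aap kaise hain ?
```

In the many-to-one direction the tag is technically redundant (the target is always English), which is consistent with the abstract's finding that gains are largest when translating into English from the pooled Indic data.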
Original language: English
Title of host publication: Proceedings of the 32nd Pacific Asia Conference on Language, Information and Computation: 5th Workshop on Asian Translation
Place of Publication: Hong Kong
Publisher: Association for Computational Linguistics
Number of pages: 5
Publication status: Published - 1 Dec 2018
Event: The 5th Workshop on Asian Translation @ PACLIC 32 - Hong Kong
Duration: 3 Dec 2018 - 3 Dec 2018


Workshop: The 5th Workshop on Asian Translation @ PACLIC 32
Abbreviated title: WAT 2018
Country/Territory: Hong Kong
