Neural Machine Translation by Minimising the Bayes-risk with Respect to Syntactic Translation Lattices

Published version
Peer-reviewed

Change log

Authors

de Gispert, A 
Hasler, E 
Byrne, W 

Abstract

We present a novel scheme to combine neural machine translation (NMT) with traditional statistical machine translation (SMT). Our approach borrows ideas from linearised lattice minimum Bayes-risk decoding for SMT. The NMT score is combined with the Bayes-risk of the translation according to the SMT lattice. This makes our approach much more flexible than n-best list or lattice rescoring, as the neural decoder is not restricted to the SMT search space. We show an efficient and simple way to integrate risk estimation into the NMT decoder which is suitable for word-level as well as subword-unit-level NMT. We test our method on English-German and Japanese-English and report significant gains over lattice rescoring on several data sets for both single and ensembled NMT. The MBR decoder produces entirely new hypotheses far beyond simply rescoring the SMT search space or fixing UNKs in the NMT output.
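The combination the abstract describes can be illustrated with a minimal sketch: the hypothesis score is a weighted sum of the NMT log-probability and a linearised risk term built from n-gram posteriors extracted from the SMT lattice. The function names, the weight values, and the posterior dictionary below are illustrative assumptions, not the paper's actual implementation.

```python
def ngrams(tokens, max_order=4):
    """Enumerate all n-grams of a hypothesis up to max_order."""
    for n in range(1, max_order + 1):
        for i in range(len(tokens) - n + 1):
            yield tuple(tokens[i:i + n])

def combined_score(hyp_tokens, nmt_logprob, ngram_posteriors,
                   nmt_weight=0.5, ngram_weights=(0.1, 0.2, 0.3, 0.4)):
    """Combine the NMT score with a linearised Bayes-risk term.

    ngram_posteriors: dict mapping n-gram tuples to their posterior
    probability under the SMT lattice (the evidence space); n-grams
    absent from the lattice contribute nothing, so the decoder is
    free to propose hypotheses outside the SMT search space.
    """
    risk_term = sum(
        ngram_weights[len(g) - 1] * ngram_posteriors.get(g, 0.0)
        for g in ngrams(hyp_tokens)
    )
    return nmt_weight * nmt_logprob + risk_term
```

In this sketch a higher score is better: hypotheses whose n-grams carry high lattice posterior are rewarded, while the NMT model remains free to generate words the lattice never saw.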

Description

Keywords

cs.CL

Journal Title

Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics

Conference Name

15th Conference of the European Chapter of the Association for Computational Linguistics

Journal ISSN

Volume Title

2, Short Papers

Publisher

Association for Computational Linguistics

Sponsorship

This work was supported by the U.K. Engineering and Physical Sciences Research Council (EPSRC grant EP/L027623/1).