Using Context in Neural Machine Translation Training Objectives

Accepted version
Peer-reviewed

Type

Conference Object

Authors

Stahlberg, Felix 
Byrne, Bill 

Abstract

We present Neural Machine Translation (NMT) training using document-level metrics with batch-level documents. Previous sequence-objective approaches to NMT training focus exclusively on sentence-level metrics like sentence BLEU which do not correspond to the desired evaluation metric, typically document BLEU. Meanwhile research into document-level NMT training focuses on data or model architecture rather than training procedure. We find that each of these lines of research has a clear space in it for the other, and propose merging them with a scheme that allows a document-level evaluation metric to be used in the NMT training objective. We first sample pseudo-documents from sentence samples. We then approximate the expected document BLEU gradient with Monte Carlo sampling for use as a cost function in Minimum Risk Training (MRT). This two-level sampling procedure gives NMT performance gains over sequence MRT and maximum-likelihood training. We demonstrate that training is more robust for document-level metrics than with sequence metrics. We further demonstrate improvements on NMT with TER and Grammatical Error Correction (GEC) using GLEU, both metrics used at the document level for evaluations.
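
For orientation, the lines below sketch the training objective the abstract describes. This is a minimal reconstruction from the standard Minimum Risk Training formulation, not the authors' exact derivation; the notation, the score-function gradient estimator, and the factorisation of pseudo-document probability over batch sentences are assumptions.

% Sentence-level MRT risk over a batch of S source sentences x_s with references y_s^ref:
\mathcal{R}_{\mathrm{sent}}(\theta) = \sum_{s=1}^{S} \mathbb{E}_{\mathbf{y} \sim p_\theta(\cdot \mid \mathbf{x}_s)}\big[\Delta(\mathbf{y}, \mathbf{y}_s^{\mathrm{ref}})\big]

% Document-level variant: a pseudo-document D = (y_1, ..., y_S) is formed by drawing one
% sampled translation per sentence in the batch; \Delta_doc is a document-level metric
% (e.g. document BLEU) against the reference document D^ref:
\mathcal{R}_{\mathrm{doc}}(\theta) = \mathbb{E}_{D \sim p_\theta(\cdot \mid X)}\big[\Delta_{\mathrm{doc}}(D, D^{\mathrm{ref}})\big],
\qquad
\log p_\theta(D \mid X) = \sum_{s=1}^{S} \log p_\theta(\mathbf{y}_s \mid \mathbf{x}_s)

% Monte Carlo (score-function) estimate of the gradient from N sampled pseudo-documents:
\nabla_\theta \mathcal{R}_{\mathrm{doc}}(\theta) \approx \frac{1}{N} \sum_{n=1}^{N} \Delta_{\mathrm{doc}}(D_n, D^{\mathrm{ref}})\, \nabla_\theta \log p_\theta(D_n \mid X)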

Keywords

cs.CL

Conference Name

58th Annual Meeting of the Association for Computational Linguistics (ACL 2020)

Rights

All rights reserved

Sponsorship

EPSRC (1750003)