Measuring Uncertainty in Neural Machine Translation with Similarity-Sensitive Entropy

Accepted version
Peer-reviewed

Authors

Cheng, J 

Abstract

Uncertainty estimation is an important diagnostic tool for statistical models and is often used to assess the confidence of model predictions. Previous work shows that neural machine translation (NMT) is an intrinsically uncertain task, where there are often multiple correct and semantically equivalent translations, and that well-trained NMT models produce good translations despite spreading probability mass among many semantically similar hypotheses. These findings suggest that popular uncertainty measures based on token- and sequence-level entropy, which capture surface-form diversity, may be poor proxies for the more useful quantity of interest: semantic diversity. We propose to adapt similarity-sensitive Shannon entropy (S3E), a concept borrowed from theoretical ecology, to NMT. By demonstrating significantly improved correlation between S3E and task performance on quality estimation and named entity recall, we show that S3E is a useful framework for measuring uncertainty in NMT.
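The quantity named in the abstract, similarity-sensitive Shannon entropy, has a standard closed form in the theoretical-ecology literature (Leinster–Cobbold diversity as the order q → 1): H = −Σᵢ pᵢ log((Zp)ᵢ), where Z is a pairwise similarity matrix. A minimal sketch, assuming p is a distribution over sampled translations and Z holds pairwise semantic similarities in [0, 1] with Z[i][i] = 1; the paper's exact similarity function and estimator may differ:

```python
from math import log

def s3e(p, Z):
    """Similarity-sensitive Shannon entropy, H = -sum_i p_i * log((Z p)_i).

    p: probability of each sampled translation (sums to 1).
    Z: pairwise similarity matrix, Z[i][j] in [0, 1], Z[i][i] = 1.
    With Z the identity matrix, this reduces to ordinary Shannon entropy;
    with all entries equal to 1 (all translations semantically identical),
    it is 0 regardless of how probability mass is spread.
    """
    n = len(p)
    # (Z p)_i: total probability mass "similar to" translation i.
    Zp = [sum(Z[i][j] * p[j] for j in range(n)) for i in range(n)]
    return -sum(pi * log(zpi) for pi, zpi in zip(p, Zp) if pi > 0)
```

This illustrates the motivation above: surface-form entropy alone penalizes a model for spreading mass over paraphrases, whereas S3E discounts that spread by how semantically similar the hypotheses are.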

Journal Title

EACL 2024 - 18th Conference of the European Chapter of the Association for Computational Linguistics, Proceedings of the Conference

Conference Name

18th Conference of the European Chapter of the Association for Computational Linguistics

Sponsorship
Huawei donation