Analogy Training Multilingual Encoders

Accepted version
Peer-reviewed

Type

Conference Object

Authors

Garneau, N 
Hartmann, M 
Sandholm, A 
Ruder, S 
Vulić, I 

Abstract

Language encoders encode words and phrases in ways that capture their local semantic relatedness, but are known to be globally inconsistent. Global inconsistency can seemingly be corrected for, in part, by leveraging signals from knowledge bases, but previous results are partial and limited to monolingual English encoders. We extract a large-scale multilingual, multi-word analogy dataset from Wikidata for diagnosing and correcting global inconsistencies, and implement a four-way Siamese BERT architecture for grounding multilingual BERT (mBERT) in Wikidata through analogy training. We show that analogy training not only improves the global consistency of mBERT and the isomorphism of its language-specific subspaces, but also leads to significant gains on downstream tasks such as bilingual dictionary induction and sentence retrieval.
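
The four-way Siamese setup described above can be pictured as one shared mBERT encoder applied to the four members of an analogy quadruple (a : b :: c : d) drawn from a Wikidata relation, trained so that the two relation offsets agree. The following is a minimal, hypothetical sketch using PyTorch and Hugging Face Transformers; the mean pooling, MSE offset loss, and toy quadruple are illustrative assumptions and not the authors' exact objective.

```python
# Hypothetical sketch of a four-way Siamese analogy objective over mBERT.
# The pooling, loss, and example quadruple are assumptions for illustration,
# not the implementation reported in the paper.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)  # one encoder shared by all four "towers"


def embed(phrases):
    """Mean-pool the last hidden states into one vector per phrase."""
    batch = tokenizer(phrases, padding=True, truncation=True, return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state        # (batch, seq, dim)
    mask = batch["attention_mask"].unsqueeze(-1)       # (batch, seq, 1)
    return (hidden * mask).sum(1) / mask.sum(1)        # (batch, dim)


def analogy_loss(a, b, c, d):
    """Encourage the relation offsets to match: e(a) - e(b) ~ e(c) - e(d)."""
    ea, eb, ec, ed = (embed(x) for x in (a, b, c, d))
    return F.mse_loss(ea - eb, ec - ed)


# Toy analogy quadruple (a : b :: c : d) in the style of a Wikidata relation.
loss = analogy_loss(["Paris"], ["France"], ["Berlin"], ["Germany"])
loss.backward()  # gradients flow into the shared mBERT weights
```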

Description

Keywords

Journal Title

35th AAAI Conference on Artificial Intelligence, AAAI 2021

Conference Name

Proceedings of the 35th AAAI Conference on Artificial Intelligence (AAAI 2021)

Journal ISSN

2159-5399
2374-3468

Volume Title

35

Publisher

Rights

All rights reserved

Sponsorship

European Research Council (648909)