
Elastic weight consolidation for better bias inoculation

Accepted version
Peer-reviewed

Type

Conference Object

Authors

Thorne, James 

Abstract

The biases present in training datasets have been shown to affect models for sentence-pair classification tasks such as natural language inference (NLI) and fact verification. While fine-tuning models on additional data has been used to mitigate these biases, a common issue is catastrophic forgetting of the original training dataset. In this paper, we show that elastic weight consolidation (EWC) allows fine-tuning of models to mitigate biases while being less susceptible to catastrophic forgetting. In our evaluation on fact verification and NLI stress tests, we show that fine-tuning with EWC dominates standard fine-tuning, yielding models with lower levels of forgetting on the original (biased) dataset for equivalent gains in accuracy on the fine-tuning (unbiased) dataset.
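
For reference, the EWC technique named in the abstract (Kirkpatrick et al., 2017) regularizes fine-tuning with a quadratic penalty that anchors each parameter to its value after training on the original dataset, weighted by an estimate of that parameter's Fisher information. Below is a minimal PyTorch-style sketch of that penalty; it is not the authors' implementation, and the function names, the data loader, and the weighting coefficient lam are illustrative assumptions.

    import torch

    def diagonal_fisher(model, data_loader, loss_fn):
        # Estimate the diagonal Fisher information on the original
        # training data by averaging squared gradients of the loss.
        fisher = {n: torch.zeros_like(p)
                  for n, p in model.named_parameters() if p.requires_grad}
        model.eval()
        for inputs, targets in data_loader:
            model.zero_grad()
            loss_fn(model(inputs), targets).backward()
            for n, p in model.named_parameters():
                if p.grad is not None:
                    fisher[n] += p.grad.detach() ** 2
        return {n: f / len(data_loader) for n, f in fisher.items()}

    def ewc_penalty(model, fisher, theta_star, lam):
        # Quadratic penalty anchoring each parameter to theta_star
        # (a detached copy of the weights learned on the original,
        # biased dataset), weighted by its Fisher estimate.
        penalty = 0.0
        for n, p in model.named_parameters():
            if n in fisher:
                penalty = penalty + (fisher[n] * (p - theta_star[n]) ** 2).sum()
        return 0.5 * lam * penalty

During fine-tuning on the additional (unbiased) data, the training objective would then be the task loss plus ewc_penalty(model, fisher, theta_star, lam), where lam trades off adaptation to the new data against forgetting of the original dataset.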

Journal Title

Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2021)

Conference Name

16th Conference of the European Chapter of the Association for Computational Linguistics

Sponsorship
European Commission Horizon 2020 (H2020) ERC (865958)
Engineering and Physical Sciences Research Council (1905693)