dc.contributor.author: Liu, Q
dc.contributor.author: Liu, Fangyu
dc.contributor.author: Collier, Nigel
dc.contributor.author: Korhonen, Anna-Leena
dc.contributor.author: Vulić, I
dc.date.accessioned: 2021-11-25T00:30:08Z
dc.date.available: 2021-11-25T00:30:08Z
dc.identifier.isbn: 9781955917056
dc.identifier.uri: https://www.repository.cam.ac.uk/handle/1810/331050
dc.description.abstract: Recent work has indicated that pretrained language models (PLMs) such as BERT and RoBERTa can be transformed into effective sentence and word encoders even via simple self-supervised techniques. Inspired by this line of work, in this paper we propose a fully unsupervised approach to improving word-in-context (WiC) representations in PLMs, achieved via a simple and efficient WiC-targeted fine-tuning procedure: MirrorWiC. The proposed method leverages only raw texts sampled from Wikipedia, assuming no sense-annotated data, and learns context-aware word representations within a standard contrastive learning setup. We experiment with a series of standard and comprehensive WiC benchmarks across multiple languages. Our fully unsupervised MirrorWiC models obtain substantial gains over off-the-shelf PLMs across all monolingual, multilingual, and cross-lingual setups. Moreover, on some standard WiC benchmarks, MirrorWiC is even on par with supervised models fine-tuned with in-task data and sense labels.
dc.publisher: Association for Computational Linguistics
dc.rights: Attribution 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: cs.CL
dc.title: MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models
dc.type: Conference Object
prism.publicationName: CoNLL 2021 - 25th Conference on Computational Natural Language Learning, Proceedings
dc.identifier.doi: 10.17863/CAM.78495
dcterms.dateAccepted: 2021-08-31
rioxxterms.versionofrecord: 10.17863/CAM.78495
rioxxterms.version: VoR
rioxxterms.licenseref.uri: http://www.rioxx.net/licenses/all-rights-reserved
rioxxterms.licenseref.startdate: 2021-08-31
dc.contributor.orcid: Liu, Fangyu [0000-0001-7038-3623]
dc.contributor.orcid: Collier, Nigel [0000-0002-7230-4164]
dc.publisher.url: https://aclanthology.org/2021.conll-1.44
rioxxterms.type: Conference Paper/Proceeding/Abstract
cam.issuedOnline: 2021-11
pubs.conference-name: 25th Conference on Computational Natural Language Learning (CoNLL 2021)
pubs.conference-start-date: 2021-11-10
pubs.conference-finish-date: 2021-11-11
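
The abstract above names a "standard contrastive learning setup" over raw text for word-in-context (WiC) representations. As a rough illustration only, below is a minimal PyTorch/Transformers sketch of such a setup: two dropout-perturbed encodings of the same target word in the same sentence form a positive pair under an InfoNCE loss, with in-batch negatives. The model name, span-location heuristic, dropout-only augmentation, temperature, and learning rate are all illustrative assumptions, not the paper's exact MirrorWiC recipe (which the record does not fully specify).

# Hedged sketch: contrastive (InfoNCE) fine-tuning of word-in-context
# representations. All names and hyperparameters below are assumptions
# for illustration, not the authors' published configuration.
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

model_name = "bert-base-uncased"   # assumption: any BERT-style PLM
tok = AutoTokenizer.from_pretrained(model_name)
enc = AutoModel.from_pretrained(model_name)
enc.train()                        # keep dropout active: it creates the two "views"
optim = torch.optim.AdamW(enc.parameters(), lr=2e-5)

def wic_embedding(sentences, words):
    """Mean-pool the hidden states of each target word's subtoken span."""
    batch = tok(sentences, return_tensors="pt", padding=True, truncation=True)
    hidden = enc(**batch).last_hidden_state                 # (B, T, H)
    vecs = []
    for i, (sent, word) in enumerate(zip(sentences, words)):
        # assumption: locate the target word via character offsets
        start = sent.lower().index(word.lower())
        first = batch.char_to_token(i, start)
        last = batch.char_to_token(i, start + len(word) - 1)
        vecs.append(hidden[i, first:last + 1].mean(dim=0))
    return torch.stack(vecs)

# toy raw-text batch (the paper samples such sentences from Wikipedia)
sentences = ["He sat on the bank of the river.", "She deposited cash at the bank."]
targets = ["bank", "bank"]

# two stochastic forward passes; dropout alone makes the views differ
z1 = F.normalize(wic_embedding(sentences, targets), dim=-1)
z2 = F.normalize(wic_embedding(sentences, targets), dim=-1)

# InfoNCE: each example's second view is its positive;
# the other in-batch examples act as negatives
temperature = 0.05
logits = z1 @ z2.T / temperature
loss = F.cross_entropy(logits, torch.arange(len(sentences)))

optim.zero_grad()
loss.backward()
optim.step()
print(f"contrastive loss: {loss.item():.4f}")

Running this one step on the toy batch prints a finite loss; in practice such training would iterate over many sampled sentences, which is all the unsupervised signal the abstract says the method needs.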

