Show simple item record

dc.contributor.author  Vulić, Ivan
dc.contributor.author  Glavaš, Goran
dc.contributor.author  Mrkšić, Nikola
dc.contributor.author  Korhonen, Anna-Leena
dc.description.abstract  Word vector specialisation (also known as retrofitting) is a portable, lightweight approach to fine-tuning arbitrary distributional word vector spaces by injecting external knowledge from rich lexical resources such as WordNet. By design, these post-processing methods update only the vectors of words occurring in external lexicons, leaving the representations of all unseen words intact. In this paper, we show that constraint-driven vector space specialisation can be extended to unseen words. We propose a novel post-specialisation method that: a) preserves the useful linguistic knowledge for seen words; while b) propagates this external signal to unseen words in order to improve their vector representations as well. Our post-specialisation approach models an explicit non-linear specialisation function as a deep neural network, trained to predict specialised vectors from their original distributional counterparts. The learned function is then used to specialise vectors of unseen words. This approach, applicable to any post-processing model, yields considerable gains over the initial specialisation models, both in intrinsic word similarity tasks and in two downstream tasks: dialogue state tracking and lexical text simplification. The positive effects persist across three languages, demonstrating the importance of specialising the full vocabulary of distributional word vector spaces.
dc.publisher  Association for Computational Linguistics
dc.rights  Attribution 4.0 International
dc.title  Post-Specialisation: Retrofitting Vectors of Words Unseen in Lexical Resources
dc.type  Conference Object
rioxxterms.type  Conference Paper/Proceeding/Abstract
pubs.funder-project-id  EC H2020 EUROPEAN RESEARCH COUNCIL (ERC) (648909)
pubs.conference-name  Proceedings of the 16th Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2018)
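The abstract describes learning a function that maps the original distributional vectors of seen words to their specialised counterparts, then applying that function to the vectors of unseen words. The sketch below illustrates the idea only; it substitutes a closed-form ridge-regression linear map for the paper's deep non-linear network, and all variable names and toy data are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 50  # toy vector dimensionality

# "Seen" words occur in the lexical resource, so specialisation has
# already produced retrofitted vectors for them; "unseen" words have
# only their original distributional vectors.
seen_original = rng.normal(size=(200, d))
true_map = 0.1 * rng.normal(size=(d, d))
seen_specialised = seen_original @ true_map  # stand-in for retrofitted vectors

unseen_original = rng.normal(size=(30, d))

# Fit a global mapping from original to specialised vectors on seen
# words. (Ridge regression here; the paper instead trains a deep
# feed-forward network on these input-output pairs.)
lam = 1e-2
W = np.linalg.solve(
    seen_original.T @ seen_original + lam * np.eye(d),
    seen_original.T @ seen_specialised,
)

# Propagate the specialisation signal to unseen words.
unseen_specialised = unseen_original @ W
print(unseen_specialised.shape)  # one specialised vector per unseen word
```

Because the toy target is itself linear, the learned map recovers it almost exactly; in the paper's setting the mapping is non-linear, which motivates the deep network.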
