Words are vectors, dependencies are matrices: Learning word embeddings from dependency graphs

Published version
Peer-reviewed

Type

Conference Object

Authors

Czarnowska, P
Emerson, G
Copestake, A

Abstract

Distributional Semantic Models (DSMs) construct vector representations of word meanings based on their contexts. Typically, the contexts of a word are defined as its closest neighbours, but they can also be retrieved from its syntactic dependency relations. In this work, we propose a new dependency-based DSM. The novelty of our model lies in associating an independent meaning representation, a matrix, with each dependency label. This allows the model to capture the specifics of the relations between words and contexts, leading to good performance on both intrinsic and extrinsic evaluation tasks. In addition, our model can inherently represent dependency chains as products of matrices, which provides a straightforward way of handling a word's wider contexts.
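To make the core idea in the abstract concrete, here is a minimal sketch in NumPy, assuming d-dimensional word vectors and one independent d x d matrix per dependency label, so that a dependency chain composes as a product of matrices. All names (EMB, DEP_MATS, context_rep), the toy vocabulary, and the dimensionality are hypothetical illustrations, not taken from the paper.

```python
import numpy as np

d = 4  # toy embedding dimensionality (hypothetical choice)
rng = np.random.default_rng(0)

# Words are vectors: one d-dimensional embedding per word (toy lexicon).
EMB = {w: rng.normal(size=d) for w in ["dog", "barked", "loudly"]}

# Dependencies are matrices: one independent d x d matrix per dependency label.
DEP_MATS = {lab: rng.normal(size=(d, d)) for lab in ["nsubj", "advmod"]}

def context_rep(word: str, labels: list[str]) -> np.ndarray:
    """Represent a word reached through a chain of dependency labels:
    compose the labels' matrices by multiplication, then apply the
    result to the word's vector (a sketch of the paper's core idea)."""
    chain = np.eye(d)
    for lab in labels:
        chain = chain @ DEP_MATS[lab]
    return chain @ EMB[word]

# A direct dependency context: "dog" as the nsubj of a head word.
direct = context_rep("dog", ["nsubj"])

# A further (second-order) context, handled as a two-matrix product.
further = context_rep("loudly", ["nsubj", "advmod"])
print(direct.shape, further.shape)  # (4,) (4,)
```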

Journal Title

IWCS 2019 - Proceedings of the 13th International Conference on Computational Semantics - Long Papers

Conference Name

Proceedings of the 13th International Conference on Computational Semantics - Long Papers

Publisher

Association for Computational Linguistics