Is "Universal Syntax" Universally Useful for Learning Distributed Word Representations?

Accepted version
Peer-reviewed

Abstract

Recent comparative studies have demonstrated the usefulness of dependency-based contexts (DEPS) for learning distributed word representations for similarity tasks. In English, DEPS tend to perform better than the more common, less informed bag-of-words contexts (BOW). In this paper, we present the first cross-linguistic comparison of different context types for three different languages. DEPS are extracted from "universal parses" without any language-specific optimization. Our results suggest that the universal DEPS (UDEPS) are useful for detecting functional similarity (e.g., verb similarity, solving syntactic analogies) among languages, but their advantage over BOW is not as prominent as previously reported on English. We also show that simple "post-parsing" filtering of useful UDEPS contexts leads to consistent improvements across languages.

Journal Title

Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)

Conference Name

Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)

Publisher

Association for Computational Linguistics (ACL)

Rights and licensing

Except where otherwise noted, this item's license is described as http://creativecommons.org/licenses/by/4.0/