Semantic Composition via Probabilistic Model Theory
Editors
Post, M
Publication Date
2017-09-21
Journal Title
IWCS 2017 - 12th International Conference on Computational Semantics
Conference Name
The 12th International Conference on Computational Semantics
Publisher
ACL Anthology
Language
English
Type
Conference Object
This Version
AM (Accepted Manuscript)
Citation
Emerson, G., & Copestake, A. (2017). Semantic Composition via Probabilistic Model Theory. In IWCS 2017 - 12th International Conference on Computational Semantics. https://www.aclweb.org/anthology/W17-6806.pdf
Abstract
Semantic composition remains an open problem for vector space models of semantics. In this paper, we explain how the probabilistic graphical model used in the framework of Functional Distributional Semantics can be interpreted as a probabilistic version of model theory. Building on this, we explain how various semantic phenomena can be recast in terms of conditional probabilities in the graphical model. This connection between formal semantics and machine learning is helpful in both directions: it gives us an explicit mechanism for modelling context-dependent meanings (a challenge for formal semantics), and also gives us well-motivated techniques for composing distributed representations (a challenge for distributional semantics). We present results on two datasets that go beyond word similarity, showing how these semantically-motivated techniques improve on the performance of vector models.
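To illustrate the idea the abstract describes — recasting composition as conditional probabilities in a graphical model with a latent entity variable — here is a minimal toy sketch. It is not the paper's actual model: the entity space, predicate names, and probabilities are invented for illustration, and the real framework uses learned distributed representations rather than a hand-written table.

```python
# Toy sketch (NOT the paper's model): predicates as probabilistic truth
# functions over a latent entity variable, composed via conditioning.
# All entities, predicates, and numbers below are made-up assumptions.

# Prior over a small discrete space of latent entities.
prior = {"animal": 0.5, "machine": 0.3, "plant": 0.2}

# Semantic functions: probability that a predicate is true of each entity.
sem = {
    "runs":  {"animal": 0.8, "machine": 0.6, "plant": 0.0},
    "barks": {"animal": 0.5, "machine": 0.0, "plant": 0.0},
}

def p_true(pred):
    """Marginal probability that `pred` holds of a random entity."""
    return sum(prior[x] * sem[pred][x] for x in prior)

def p_given(pred, given):
    """P(pred true | given true), marginalising over the latent entity."""
    joint = sum(prior[x] * sem[pred][x] * sem[given][x] for x in prior)
    return joint / p_true(given)

print(round(p_given("barks", "runs"), 3))  # context shifts the probability
```

Conditioning on one predicate being true reweights the latent entity distribution before evaluating the next predicate, which is the sense in which context-dependent meaning falls out of the graphical model.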
Keywords
cs.CL
Sponsorship
Schiff Foundation
Identifiers
External link: https://www.aclweb.org/anthology/W17-6806.pdf
This record's URL: https://www.repository.cam.ac.uk/handle/1810/270188
Rights
Licence: http://creativecommons.org/licenses/by/4.0/ (CC BY 4.0)