One Representation per Word - Does it make Sense for Composition?

Thomas Kober, Julie Weeds, John Wilkie, Jeremy Reffin, David Weir

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper, we investigate whether an a priori disambiguation of word senses is strictly necessary, or whether the meaning of a word in context can be disambiguated through composition alone. We evaluate the performance of off-the-shelf single-vector and multi-sense vector models on a benchmark phrase similarity task and a novel task for word-sense discrimination. We find that single-sense vector models perform as well as or better than multi-sense vector models despite arguably less clean elementary representations. Our findings furthermore show that simple composition functions such as pointwise addition are able to recover sense-specific information from a single-sense vector model remarkably well.
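
As a concrete illustration of the composition function under discussion, the following is a minimal sketch in Python (using NumPy) of how pointwise addition can recover sense-specific information from a single ambiguous vector. The vectors and vocabulary are hypothetical placeholders for illustration only, not the models evaluated in the paper.

import numpy as np

# Hypothetical toy vectors: "bank" conflates a river sense and a finance sense.
vec = {
    "bank":    np.array([0.8, 0.6, 0.1]),
    "river":   np.array([1.0, 0.1, 0.0]),
    "money":   np.array([0.1, 1.0, 0.1]),
    "deposit": np.array([0.2, 0.9, 0.2]),
}

def compose(words):
    # Pointwise addition of the word vectors in a phrase.
    return np.sum([vec[w] for w in words], axis=0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Composing the ambiguous "bank" with a financial context word shifts the
# phrase vector toward the finance sense and away from the river sense.
phrase = compose(["bank", "deposit"])
print(cosine(phrase, vec["money"]))  # higher: finance sense dominates
print(cosine(phrase, vec["river"]))  # lower: river sense suppressed

Here the single-sense vector for "bank" mixes both senses, yet addition with one context word is enough to discriminate them, which is the effect the paper measures.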
Original language: English
Title of host publication: Proceedings of the 1st Workshop on Sense, Concept and Entity Representations and their Applications
Publisher: Association for Computational Linguistics
Pages: 79-90
Number of pages: 12
Publication status: Published - 4 Apr 2017
Event: SENSE: The first workshop on Sense, Concept and Entity Representations and their Applications - Valencia, Spain
Duration: 4 Apr 2017 → …
https://sites.google.com/site/senseworkshop2017/home

Workshop

Workshop: SENSE: The first workshop on Sense, Concept and Entity Representations and their Applications
Abbreviated title: SENSE 2017
Country/Territory: Spain
City: Valencia
Period: 4/04/17 → …
Internet address: https://sites.google.com/site/senseworkshop2017/home
