Measuring Semantic Relations between Human Activities

Steven Wilson, Rada Mihalcea

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


The things people do in their daily lives can provide valuable insights into their personality, values, and interests. Unstructured text data on social media platforms are rich in behavioral content, and automated systems can be deployed to learn about human activity on a broad scale if these systems are able to reason about the content of interest. To aid in the evaluation of such systems, we introduce a new phrase-level semantic textual similarity dataset composed of human activity phrases, providing a testbed for automated systems that analyze relationships between phrasal descriptions of people's actions. Our set of 1,000 pairs of activities is annotated by human judges across four relational dimensions: similarity, relatedness, motivational alignment, and perceived actor congruence. We evaluate a set of strong baselines for the task of generating scores that correlate highly with human ratings, and we introduce several new approaches to the phrase-level similarity task in the domain of human activities.
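As a rough illustration of the evaluation setup described above (not the authors' code), one common baseline for phrase-level similarity represents each activity phrase as the average of its word vectors, scores a pair by cosine similarity, and measures agreement with human ratings via Spearman rank correlation. The toy vectors, phrases, and ratings below are entirely hypothetical.

```python
# Hypothetical sketch of an averaged word-vector baseline for
# phrase-level similarity, evaluated with Spearman rank correlation.
import numpy as np

# Toy stand-ins for pretrained word embeddings (made-up values).
vectors = {
    "play":   np.array([0.9, 0.1, 0.0]),
    "guitar": np.array([0.8, 0.2, 0.1]),
    "piano":  np.array([0.7, 0.3, 0.1]),
    "wash":   np.array([0.0, 0.9, 0.2]),
    "dishes": np.array([0.1, 0.8, 0.3]),
}

def phrase_vector(phrase):
    """Represent a phrase as the mean of its word vectors."""
    return np.mean([vectors[w] for w in phrase.split()], axis=0)

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def spearman(a, b):
    """Spearman correlation = Pearson correlation of the ranks."""
    ra = np.argsort(np.argsort(a)).astype(float)
    rb = np.argsort(np.argsort(b)).astype(float)
    return float(np.corrcoef(ra, rb)[0, 1])

pairs = [("play guitar", "play piano"), ("play guitar", "wash dishes")]
human = [3.5, 0.4]  # hypothetical annotator ratings on a 0-4 scale
system = [cosine(phrase_vector(p), phrase_vector(q)) for p, q in pairs]
print(spearman(system, human))
```

On this toy pair set the system ranks "play guitar / play piano" above "play guitar / wash dishes", matching the human ordering, so the Spearman correlation is 1.0; on a real 1,000-pair dataset the correlation would of course be lower and more informative.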
Original language: English
Title of host publication: The Eighth International Joint Conference on Natural Language Processing
Subtitle of host publication: Proceedings of the Conference, Vol. 1 (Long Papers)
Place of publication: Taipei, Taiwan
Publisher: Asian Federation of Natural Language Processing
Number of pages: 10
ISBN (electronic): 978-1-948087-00-1
Publication status: Published - 1 Dec 2017
Event: The 8th International Joint Conference on Natural Language Processing - Taipei, Taiwan, Province of China
Duration: 27 Nov 2017 - 1 Dec 2017


Conference: The 8th International Joint Conference on Natural Language Processing
Abbreviated title: IJCNLP 2017
Country: Taiwan, Province of China

