Blindness to Modality Helps Entailment Graph Mining

Liane Guillou, Sander Bijl de Vroe, Mark Johnson, Mark Steedman

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Understanding linguistic modality is widely seen as important for downstream tasks such as Question Answering and Knowledge Graph Population. Entailment Graph learning might also be expected to benefit from attention to modality. We build Entailment Graphs using a news corpus filtered with a modality parser, and show that stripping modal modifiers from predicates in fact increases performance. This suggests that for some tasks, the pragmatics of modal modification allows modally modified predicates to contribute as evidence of entailment.
Original language: English
Title of host publication: Proceedings of the Second Workshop on Insights from Negative Results in NLP
Editors: João Sedoc, Anna Rogers, Anna Rumshisky, Shabnam Tafreshi
Place of publication: Stroudsburg, PA, United States
Publisher: Association for Computational Linguistics (ACL)
Pages: 110-116
Number of pages: 7
ISBN (Electronic): 978-1-954085-93-0
Publication status: Published - 10 Nov 2021
Event: Workshop on Insights from Negative Results in NLP - Punta Cana, Dominican Republic
Duration: 10 Nov 2021 - 10 Nov 2021
Internet address: https://insights-workshop.github.io/2021/

