Learning Typed Entailment Graphs with Global Soft Constraints

Seyed Mohammad Javad Hosseini, Nathanael Chambers, Siva Reddy, Xavier R. Holt, Shay Cohen, Mark Johnson, Mark Steedman

Research output: Contribution to journal › Article › peer-review

Abstract

This paper presents a new method for learning typed entailment graphs from text. We extract predicate-argument structures from multiple-source news corpora, and compute local distributional similarity scores to learn entailments between predicates with typed arguments (e.g., person contracted disease). Previous work has used transitivity constraints to improve local decisions, but these constraints are intractable on large graphs. We instead propose a scalable method that learns globally consistent similarity scores based on new soft constraints that consider both the structures across typed entailment graphs and inside each graph. Learning takes only a few hours to run over 100K predicates and our results show large improvements over local similarity scores on two entailment datasets. We further show improvements over paraphrases and entailments from the Paraphrase Database, and prior state-of-the-art entailment graphs. We show that the entailment graphs improve performance in a downstream task.
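The abstract's key idea is that local entailment scores should be made globally consistent, e.g. with respect to transitivity (if a entails b and b entails c, then a should entail c). As a minimal toy sketch of what a soft transitivity violation could look like (a hypothetical illustration, not the paper's actual learning objective or score function):

```python
# Hypothetical illustration: a typed entailment graph as weighted directed
# edges between predicates, and a soft transitivity penalty measuring how
# far local scores are from being globally consistent.

def transitivity_penalty(scores):
    """Sum of soft violations max(0, min(s(a,b), s(b,c)) - s(a,c)).

    `scores` maps (premise, hypothesis) predicate pairs to similarity
    scores in [0, 1]; missing pairs are treated as score 0.
    """
    preds = {p for edge in scores for p in edge}
    total = 0.0
    for a in preds:
        for b in preds:
            for c in preds:
                if len({a, b, c}) < 3:
                    continue
                s_ab = scores.get((a, b), 0.0)
                s_bc = scores.get((b, c), 0.0)
                s_ac = scores.get((a, c), 0.0)
                # Soft violation: the chained score exceeds the direct one.
                total += max(0.0, min(s_ab, s_bc) - s_ac)
    return total

# Local scores support "contracted -> caught" and "caught -> has",
# but give "contracted -> has" a low score: a soft violation.
local = {
    ("contracted", "caught"): 0.9,
    ("caught", "has"): 0.8,
    ("contracted", "has"): 0.2,
}
print(round(transitivity_penalty(local), 2))  # -> 0.6
```

A hard transitivity constraint would make this combinatorial and intractable on large graphs; a soft penalty like the one above can instead be traded off against fidelity to the local scores, which is the kind of relaxation the abstract describes.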
Original language: English
Pages (from-to): 703-717
Number of pages: 16
Journal: Transactions of the Association for Computational Linguistics
Volume: 6
Publication status: Published - Dec 2018
