Improving Entity Linking by Modeling Latent Relations between Mentions

Phong Le, Ivan Titov

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Entity linking involves aligning textual mentions of named entities to their corresponding entries in a knowledge base. Entity linking systems often exploit relations between textual mentions in a document (e.g., coreference) to decide if the linking decisions are compatible. Unlike previous approaches, which relied on supervised systems or heuristics to predict these relations, we treat relations as latent variables in our neural entity-linking model. We induce the relations without any supervision while optimizing the entity-linking system in an end-to-end fashion. Our multi-relational model achieves the best reported scores on the standard benchmark (AIDA-CoNLL) and substantially outperforms its relation-agnostic version. Its training also converges much faster, suggesting that the injected structural bias helps to explain regularities in the training data.
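The abstract describes scoring the compatibility of linking decisions for pairs of mentions via latent relations. A minimal sketch of that idea, assuming a weighted sum of per-relation bilinear scores with diagonal relation mappings; the dimensions, embeddings, and relation weights below are illustrative placeholders, not the paper's trained parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
d, K = 8, 3  # embedding dimension and number of latent relations (assumed values)

# Hypothetical entity embeddings for the candidate entities of two mentions.
e_i = rng.normal(size=d)
e_j = rng.normal(size=d)

# One diagonal mapping per latent relation, stored as a vector per relation.
R = rng.normal(size=(K, d))

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

# In the full model the relation weights would be predicted from the mention
# pair and its context; here they are random placeholders normalized to sum to 1.
alpha = softmax(rng.normal(size=K))

# Pairwise compatibility: a relation-weighted sum of bilinear scores
# e_i^T diag(R_k) e_j, one term per latent relation k.
score = sum(alpha[k] * np.sum(e_i * R[k] * e_j) for k in range(K))
print(float(score))
```

In the paper these pairwise scores enter a global objective over all mentions in the document, and the latent relations are induced end-to-end with no relation supervision.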
Original language: English
Title of host publication: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Publisher: Association for Computational Linguistics (ACL)
Number of pages: 10
ISBN (Print): 978-1-948087-32-2
DOIs
Publication status: Published - 31 Jul 2018
Event: 56th Annual Meeting of the Association for Computational Linguistics - Melbourne Convention and Exhibition Centre, Melbourne, Australia
Duration: 15 Jul 2018 - 20 Jul 2018
http://acl2018.org/

Conference

Conference: 56th Annual Meeting of the Association for Computational Linguistics
Abbreviated title: ACL 2018
Country/Territory: Australia
City: Melbourne
Period: 15/07/18 - 20/07/18