TuckER: Tensor Factorization for Knowledge Graph Completion

Ivana Balazevic, Carl Allen, Timothy Hospedales

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

Knowledge graphs are structured representations of real-world facts. However, they typically contain only a small subset of all possible facts. Link prediction is the task of inferring missing facts based on existing ones. We propose TuckER, a relatively straightforward but powerful linear model based on Tucker decomposition of the binary tensor representation of knowledge graph triples. TuckER outperforms previous state-of-the-art models across standard link prediction datasets, acting as a strong baseline for more elaborate models. We show that TuckER is a fully expressive model, derive sufficient bounds on its embedding dimensionalities and demonstrate that several previously introduced linear models can be viewed as special cases of TuckER.
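The abstract describes scoring knowledge graph triples via a Tucker decomposition: a shared core tensor is contracted with the subject, relation, and object embeddings to give a score for each candidate fact. The sketch below illustrates that contraction with NumPy; the dimensions, random initialisation, and function name are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper):
# d_e = entity embedding dim, d_r = relation embedding dim.
d_e, d_r = 4, 3
n_entities, n_relations = 5, 2

E = rng.normal(size=(n_entities, d_e))    # entity embeddings
R = rng.normal(size=(n_relations, d_r))   # relation embeddings
W = rng.normal(size=(d_e, d_r, d_e))      # shared core tensor

def tucker_score(s, r, o):
    """Score triple (s, r, o) by contracting the core tensor W
    with the subject, relation, and object embedding vectors."""
    return float(np.einsum('i,irj,r,j->', E[s], W, R[r], E[o]))
```

For link prediction, one would score every candidate object for a given (subject, relation) pair and rank them; in the trained model a sigmoid turns scores into probabilities of a triple being true, which a single `np.einsum('i,irj,r,jo->o', E[s], W, R[r], E.T)` call can compute for all objects at once.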
Original language: English
Title of host publication: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing
Publisher: Association for Computational Linguistics
Pages: 5184-5193
Number of pages: 11
ISBN (Print): 978-1-950737-90-1
DOIs
Publication status: Published - 4 Nov 2019
Event: 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing - Hong Kong, Hong Kong
Duration: 3 Nov 2019 - 7 Nov 2019
https://www.emnlp-ijcnlp2019.org/

Conference

Conference: 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing
Abbreviated title: EMNLP-IJCNLP 2019
Country/Territory: Hong Kong
City: Hong Kong
Period: 3/11/19 - 7/11/19
