Abstract
Knowledge graphs are structured representations of real-world facts. However, they typically contain only a small subset of all possible facts. Link prediction is the task of inferring missing facts based on existing ones. We propose TuckER, a relatively straightforward yet powerful linear model based on the Tucker decomposition of the binary tensor representation of knowledge graph triples. TuckER outperforms previous state-of-the-art models across standard link prediction datasets, acting as a strong baseline for more elaborate models. We show that TuckER is a fully expressive model, derive sufficient bounds on its embedding dimensionalities, and demonstrate that several previously introduced linear models can be viewed as special cases of TuckER.
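The abstract's core idea, scoring a triple by contracting a learned core tensor with subject-entity, relation, and object-entity embeddings along its three modes, can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the dimensions are arbitrary and the "learned" parameters are random stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: d_e = entity embedding dim, d_r = relation embedding dim.
d_e, d_r = 4, 3

# Random stand-ins for parameters that TuckER would learn by training:
W = rng.standard_normal((d_e, d_r, d_e))  # core tensor
e_s = rng.standard_normal(d_e)            # subject entity embedding
w_r = rng.standard_normal(d_r)            # relation embedding
e_o = rng.standard_normal(d_e)            # object entity embedding

def tucker_score(W, e_s, w_r, e_o):
    """Contract the core tensor with the three embeddings
    (the mode-1, mode-2, and mode-3 products), yielding a scalar score."""
    return np.einsum('ijk,i,j,k->', W, e_s, w_r, e_o)

score = tucker_score(W, e_s, w_r, e_o)
```

A higher score indicates the model considers the triple more likely to be a true fact; in training, the score would be passed through a sigmoid and compared against the binary tensor entry.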
Original language | English
---|---
Title of host publication | Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing
Publisher | Association for Computational Linguistics
Pages | 5184-5193
Number of pages | 11
ISBN (Print) | 978-1-950737-90-1
DOIs |
Publication status | Published - 4 Nov 2019
Event | 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing, Hong Kong, Hong Kong, 3 Nov 2019 → 7 Nov 2019 (https://www.emnlp-ijcnlp2019.org/)
Conference

Conference | 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing
---|---
Abbreviated title | EMNLP-IJCNLP 2019
Country/Territory | Hong Kong
City | Hong Kong
Period | 3/11/19 → 7/11/19
Internet address | https://www.emnlp-ijcnlp2019.org/