Type-aware Embeddings for Multi-Hop Reasoning over Knowledge Graphs

Zhiwei Hu, Víctor Gutiérrez-Basulto, Zhiliang Xiang, Xiaoli Li, Ru Li, Jeff Z Pan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Multi-hop reasoning over real-life knowledge graphs (KGs) is a highly challenging problem because traditional subgraph matching methods cannot cope with noise and missing information. To address this problem, a promising approach has recently been introduced that jointly embeds logical queries and KGs into a low-dimensional space to identify answer entities. However, existing proposals ignore critical semantic knowledge inherently available in KGs, such as type information. To leverage type information, we propose a novel TypE-aware Message Passing (TEMP) model, which enhances the entity and relation representations in queries and simultaneously improves generalization as well as deductive and inductive reasoning. Remarkably, TEMP is a plug-and-play model that can easily be incorporated into existing embedding-based models to improve their performance. Extensive experiments on three real-world datasets demonstrate TEMP's effectiveness.
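The abstract describes TEMP only at a high level. As a rough illustration of the general idea, that type information attached to entities can be aggregated into their embeddings before an existing query-embedding reasoner consumes them, here is a minimal PyTorch sketch. Everything in it (the name TypeAwareEncoder, the attention-based pooling over an entity's types, the fusion layer) is an illustrative assumption, not the paper's actual architecture.

```python
# Illustrative sketch only: the abstract does not specify TEMP's exact
# formulation, so the attention-based type pooling below is an assumption.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TypeAwareEncoder(nn.Module):
    """Enriches base entity embeddings with embeddings of their KG types.

    Hypothetical plug-and-play layer: the type-enhanced entity vectors it
    returns could replace the plain entity vectors consumed by an existing
    embedding-based query reasoner.
    """

    def __init__(self, num_entities: int, num_types: int, dim: int):
        super().__init__()
        self.entity_emb = nn.Embedding(num_entities, dim)
        self.type_emb = nn.Embedding(num_types, dim)
        self.attn = nn.Linear(2 * dim, 1)    # scores each (entity, type) pair
        self.fuse = nn.Linear(2 * dim, dim)  # combines entity and pooled types

    def forward(self, entity_ids: torch.Tensor, type_ids: torch.Tensor,
                type_mask: torch.Tensor) -> torch.Tensor:
        # entity_ids: (B,); type_ids: (B, T), padded; type_mask: (B, T), bool.
        # Assumes every entity has at least one type, so the softmax below
        # always sees at least one unmasked score.
        e = self.entity_emb(entity_ids)               # (B, d)
        t = self.type_emb(type_ids)                   # (B, T, d)
        # Attention over an entity's types: score [entity; type] pairs,
        # mask out padding, then normalize with a softmax.
        pair = torch.cat([e.unsqueeze(1).expand_as(t), t], dim=-1)
        scores = self.attn(pair).squeeze(-1)          # (B, T)
        scores = scores.masked_fill(~type_mask, float("-inf"))
        weights = F.softmax(scores, dim=-1)           # (B, T)
        pooled = (weights.unsqueeze(-1) * t).sum(1)   # (B, d)
        # Fuse the type summary back into the entity representation.
        return self.fuse(torch.cat([e, pooled], dim=-1))
```

The plug-and-play aspect the abstract claims would correspond, in a sketch like this, to the layer returning vectors of the same dimensionality as the base entity embeddings, so it can sit in front of an existing embedding-based query model without changing that model's interface.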
Original language: English
Title of host publication: Proceedings of the 31st International Joint Conference on Artificial Intelligence, IJCAI-ECAI 2022
Editors: Luc De Raedt
Publisher: International Joint Conferences on Artificial Intelligence Organization
Pages: 3078-3084
Number of pages: 7
ISBN (Electronic): 978-1-956792-00-3
DOIs
Publication status: Published - 23 Jul 2022
Event: 31st International Joint Conference on Artificial Intelligence - Vienna, Austria
Duration: 23 Jul 2022 - 29 Jul 2022
https://ijcai-22.org/

Conference

Conference: 31st International Joint Conference on Artificial Intelligence
Abbreviated title: IJCAI-ECAI 2022
Country/Territory: Austria
City: Vienna
Period: 23/07/22 - 29/07/22
Internet address: https://ijcai-22.org/
