Long-Range Synthetic Knowledge Graph Benchmarks for Double-Equivariant Models

Abstract

In the landscape of relational data, Knowledge Graphs (KGs) represent facts as triplets of the form (head entity, relation, tail entity). Recent methods, namely double-equivariant models, address the task of predicting missing triplets in fully-inductive scenarios, where both new entities and novel relations appear at test time. Despite their promise, a consensus on their practical capabilities, particularly in capturing long-range dependencies, remains elusive. This paper investigates the ability of double-equivariant models to capture long-range dependencies in an input KG and to transfer that knowledge to a new test KG whose predictions likewise require distant information. We present multiple synthetic yet semantically sound datasets that require distant information for accurate predictions. Our preliminary empirical results highlight that existing double-equivariant models face significant challenges in effectively incorporating distant information.
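To make the setting concrete, here is a minimal Python sketch of the setup the abstract describes: a KG as a set of (head, relation, tail) triplets, and a fully-inductive split in which the test KG shares no entities or relation types with the training KG. All entity and relation names here are invented for illustration and are not taken from the paper's datasets.

```python
# A KG as a set of (head entity, relation, tail entity) triplets,
# plus the disjointness checks that define the fully-inductive setting.
from typing import Set, Tuple

Triplet = Tuple[str, str, str]  # (head entity, relation, tail entity)

# Hypothetical training KG.
train_kg: Set[Triplet] = {
    ("alice", "works_at", "acme"),
    ("acme", "located_in", "paris"),
    ("paris", "capital_of", "france"),
}

# Hypothetical fully-inductive test KG: entirely new entities AND new relations.
test_kg: Set[Triplet] = {
    ("bob", "studies_at", "mit"),
    ("mit", "based_in", "boston"),
    ("boston", "city_of", "usa"),
}

def entities(kg: Set[Triplet]) -> Set[str]:
    """All entities appearing as head or tail in the KG."""
    return {h for h, _, _ in kg} | {t for _, _, t in kg}

def relations(kg: Set[Triplet]) -> Set[str]:
    """All relation types appearing in the KG."""
    return {r for _, r, _ in kg}

# The fully-inductive setting requires disjoint entity and relation vocabularies.
assert entities(train_kg).isdisjoint(entities(test_kg))
assert relations(train_kg).isdisjoint(relations(test_kg))

# A long-range query such as ("alice", ?, "france") cannot be answered from a
# single local edge: it requires composing evidence along a multi-hop path,
# which is exactly the kind of distant information the benchmarks probe.
```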

Publication
ICLR 2024 Workshop BGPT
Jincheng Zhou
Ph.D. Student

My research interests include graph representation learning and its applications to knowledge graphs, causal inference and causal structure discovery, meta-learning, cognitive architectures, and artificial general intelligence.