Double Equivariance for Inductive Link Prediction for Both New Nodes and New Relation Types

Abstract

The task of inductive link prediction in discrete attributed multigraphs (e.g., knowledge graphs, multilayer networks, heterogeneous networks) generally focuses on test-time predictions involving new nodes only, not both new nodes and new relation types. In this work, we formally define the task of predicting (completely) new nodes and new relation types at test time as doubly inductive link prediction and introduce a theoretical framework for its solution. We begin by defining double permutation-equivariant representations, which are equivariant to permutations of both node identities and edge relation types. We then propose a general blueprint for designing neural architectures that impose a structural representation of relations, enabling inductive generalization from training nodes and relations to arbitrary new test nodes and relations without adaptation, side information, or retraining. We also introduce the concept of distributionally double equivariant positional embeddings designed for the same task. Finally, we empirically demonstrate the capability of the two proposed models on a set of novel real-world benchmarks, showing average relative performance gains of 39.65% over baselines when predicting new relation types.
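As a brief, informal sketch of the core property (notation ours, not necessarily the paper's): suppose a multigraph is encoded as an adjacency tensor $\mathbf{A} \in \{0,1\}^{n \times n \times r}$ over $n$ nodes and $r$ relation types. A representation $\Gamma$ is then double permutation-equivariant if, for every node permutation $\pi \in S_n$ and relation permutation $\tau \in S_r$,

$$\Gamma\big((\pi, \tau) \circ \mathbf{A}\big) = (\pi, \tau) \circ \Gamma(\mathbf{A}), \qquad \text{where } \big((\pi, \tau) \circ \mathbf{A}\big)_{\pi(i),\,\pi(j),\,\tau(k)} = \mathbf{A}_{i,j,k}.$$

Intuitively, relabeling nodes and relation types relabels the representation in exactly the same way, so the model cannot depend on the identity of any particular node or relation and can therefore transfer to entirely unseen ones.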

Publication
NeurIPS 2023 New Frontiers in Graph Learning (GLFrontiers) (Oral)
Jincheng Zhou
Ph.D. Student

My research interests include graph representation learning and its applications to knowledge graphs, causal inference and causal structure discovery, meta-learning, cognitive architectures, and artificial general intelligence.