Enhancing Knowledge Graph Completion with GNN Distillation and Probabilistic Interaction Modeling
By: Lingzhi Wang, Pengcheng Huang, Haotian Li, and more
Potential Business Impact:
Helps computers understand missing facts in data.
Knowledge graphs (KGs) serve as fundamental structures for organizing interconnected data across diverse domains. However, most KGs remain incomplete, limiting their effectiveness in downstream applications. Knowledge graph completion (KGC) aims to address this issue by inferring missing links, but existing methods face critical challenges: deep graph neural networks (GNNs) suffer from over-smoothing, while embedding-based models fail to capture abstract relational features. This study aims to overcome these limitations by proposing a unified framework that integrates GNN distillation and abstract probabilistic interaction modeling (APIM). The GNN distillation approach introduces an iterative message-feature filtering process to mitigate over-smoothing, preserving the discriminative power of node representations. The APIM module complements this by learning structured, abstract interaction patterns through probabilistic signatures and transition matrices, allowing for a richer, more flexible representation of entity and relation interactions. We apply GNN distillation to GNN-based models and APIM to embedding-based KGC models, conducting extensive evaluations on the widely used WN18RR and FB15K-237 datasets. Our results demonstrate significant performance gains over baseline models, showcasing the effectiveness of the proposed techniques. The findings highlight the importance of both controlling information propagation and leveraging structured probabilistic modeling, offering new avenues for advancing knowledge graph completion. Our code is available at https://anonymous.4open.science/r/APIM_and_GNN-Distillation-461C.
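To make the APIM idea from the abstract concrete, here is a minimal sketch of how a triple might be scored with probabilistic signatures and a per-relation transition matrix. This is not the authors' implementation; the mode count, shapes, names, and the dot-product scoring rule are illustrative assumptions only.

```python
# Illustrative sketch of abstract probabilistic interaction modeling (APIM):
# each entity carries a probabilistic "signature" (a distribution over K
# abstract interaction modes), and each relation carries a row-stochastic
# transition matrix mapping head-side modes to tail-side modes. A candidate
# triple (h, r, t) is scored by pushing the head signature through the
# relation's transition matrix and measuring agreement with the tail signature.
# All parameter names and shapes here are assumptions for illustration.

import numpy as np

rng = np.random.default_rng(0)

K = 8              # number of abstract interaction modes (assumed)
num_entities = 5
num_relations = 3

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# Probabilistic signatures: one distribution over modes per entity.
entity_sig = softmax(rng.normal(size=(num_entities, K)))
# One row-stochastic transition matrix per relation (mode -> mode).
relation_T = softmax(rng.normal(size=(num_relations, K, K)), axis=-1)

def apim_score(h, r, t):
    """Score a triple: propagate the head signature through the relation's
    transition matrix and take its agreement with the tail signature."""
    propagated = entity_sig[h] @ relation_T[r]   # shape (K,)
    return float(propagated @ entity_sig[t])     # dot-product agreement

# Example: score a hypothetical candidate triple (head=0, relation=1, tail=3).
print(apim_score(0, 1, 3))
```

In a trained model, the signatures and transition matrices would be learned parameters and the agreement score would feed into the KGC ranking loss; the sketch above only shows the shape of the interaction.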
Similar Papers
Knowledge Graph Enhanced Generative Multi-modal Models for Class-Incremental Learning
CV and Pattern Recognition
Keeps computer vision smart on old and new tasks.
Unlocking Advanced Graph Machine Learning Insights through Knowledge Completion on Neo4j Graph Database
Databases
Finds hidden connections in data for better computer learning.
Enhancing Transformer with GNN Structural Knowledge via Distillation: A Novel Approach
Machine Learning (CS)
Teaches computers to understand complex connections better.