Knowledge Graph Completion by Intermediate Variables Regularization
By: Changyi Xiao, Yixin Cao
Potential Business Impact:
Helps computers fill in missing facts in knowledge bases more accurately, with fewer mistakes.
Knowledge graph completion (KGC) can be framed as a third-order binary tensor completion task. Tensor decomposition-based (TDB) models have demonstrated strong performance in KGC. In this paper, we summarize existing TDB models and derive a general form for them, which serves as a foundation for further study of TDB models. Despite their expressiveness, TDB models are prone to overfitting. Existing regularization methods merely minimize the norms of the embeddings, which leads to suboptimal performance. We therefore propose a novel regularization method for TDB models that addresses this limitation. The regularization applies to most TDB models and remains tractable to compute. Our method minimizes the norms of the intermediate variables involved in the different ways of computing the predicted tensor. To support our regularization method, we provide a theoretical analysis proving that it promotes a low trace norm of the predicted tensor, which reduces overfitting. Finally, we conduct experiments to verify the effectiveness of our regularization technique and the reliability of our theoretical analysis. The code is available at https://github.com/changyi7231/IVR.
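To make the idea concrete, below is a minimal PyTorch sketch of what such a regularizer might look like for a DistMult/CP-style scorer: instead of penalizing only the embedding norms, it also penalizes the pairwise products of embeddings, i.e., the "intermediate variables" that appear when the predicted tensor is contracted in different orders. This is an illustrative assumption, not the authors' implementation; the class name `IVRegularizedDistMult`, the squared-L2 penalty, and the weight `lam` are all hypothetical, and the paper's general form covers more TDB models than DistMult. See the linked repository for the actual method.

```python
# Hedged sketch: intermediate-variables-style regularization for a
# DistMult/CP-style KGC model. Illustrative only; the paper's actual
# regularizer and its general TDB form may differ in detail.
import torch
import torch.nn as nn

class IVRegularizedDistMult(nn.Module):
    def __init__(self, n_entities: int, n_relations: int, dim: int, lam: float = 0.01):
        super().__init__()
        self.entity = nn.Embedding(n_entities, dim)    # entity embeddings
        self.relation = nn.Embedding(n_relations, dim) # relation embeddings
        self.lam = lam  # regularization weight (hypothetical name)

    def forward(self, heads, relations, tails):
        h = self.entity(heads)        # (batch, dim)
        r = self.relation(relations)  # (batch, dim)
        t = self.entity(tails)        # (batch, dim)

        # DistMult score: sum_k h_k * r_k * t_k
        score = (h * r * t).sum(dim=-1)

        # Intermediate variables: the partial products that arise from the
        # different orders of computing the score. Penalizing their norms,
        # rather than only ||h||, ||r||, ||t||, is the idea sketched here.
        reg = (
            (h * r).norm(p=2, dim=-1).pow(2).mean()
            + (r * t).norm(p=2, dim=-1).pow(2).mean()
            + (h * t).norm(p=2, dim=-1).pow(2).mean()
        )
        return score, self.lam * reg
```

In training, the returned regularization term would simply be added to the task loss (e.g., cross-entropy over candidate tails), with `lam` tuned on validation data.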
Similar Papers
Knowledge Graph Completion with Mixed Geometry Tensor Factorization
Machine Learning (CS)
Makes computers understand facts better.
DrKGC: Dynamic Subgraph Retrieval-Augmented LLMs for Knowledge Graph Completion across General and Biomedical Domains
Artificial Intelligence
Helps computers understand facts by seeing connections.
Unlocking Advanced Graph Machine Learning Insights through Knowledge Completion on Neo4j Graph Database
Databases
Finds hidden connections in data for better computer learning.