Knowledge Graph Completion by Intermediate Variables Regularization

Published: June 3, 2025 | arXiv ID: 2506.02749v1

By: Changyi Xiao, Yixin Cao

Potential Business Impact:

Improves the accuracy of knowledge graph completion models by reducing overfitting, yielding more reliable automated fact prediction.

Business Areas:
A/B Testing; Data and Analytics

Knowledge graph completion (KGC) can be framed as a 3-order binary tensor completion task. Tensor decomposition-based (TDB) models have demonstrated strong performance in KGC. In this paper, we provide a summary of existing TDB models and derive a general form for them, serving as a foundation for further exploration of TDB models. Despite the expressiveness of TDB models, they are prone to overfitting. Existing regularization methods merely minimize the norms of embeddings to regularize the model, leading to suboptimal performance. Therefore, we propose a novel regularization method for TDB models that addresses this limitation. The regularization is applicable to most TDB models and ensures tractable computation. Our method minimizes the norms of intermediate variables involved in the different ways of computing the predicted tensor. To support our regularization method, we provide a theoretical analysis that proves its effect in promoting low trace norm of the predicted tensor to reduce overfitting. Finally, we conduct experiments to verify the effectiveness of our regularization technique as well as the reliability of our theoretical analysis. The code is available at https://github.com/changyi7231/IVR.
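To make the idea concrete, here is a minimal NumPy sketch of the setting the abstract describes. It uses a CP/DistMult-style tensor decomposition (an illustrative choice, not necessarily the paper's exact model): a triple (subject, relation, object) is scored by a three-way product of embeddings, and the regularizer penalizes the norms of the pairwise intermediate products that arise in the different contraction orders, rather than only the norms of the embeddings themselves. The squared-norm penalty and the factor shapes are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ent, n_rel, rank = 5, 3, 4

# Illustrative CP-style factor matrices (rows are embeddings).
A = rng.normal(size=(n_ent, rank))   # subject embeddings
R = rng.normal(size=(n_rel, rank))   # relation embeddings
B = rng.normal(size=(n_ent, rank))   # object embeddings

def score(i, j, k):
    """Predicted tensor entry: sum_r A[i,r] * R[j,r] * B[k,r]."""
    return float(np.sum(A[i] * R[j] * B[k]))

def embedding_penalty(i, j, k):
    """Baseline regularizer: norms of the embeddings alone."""
    return (np.sum(A[i] ** 2) + np.sum(R[j] ** 2) + np.sum(B[k] ** 2))

def ivr_penalty(i, j, k):
    """Sketch of intermediate-variables regularization: penalize the
    partial (elementwise) products from each way of contracting the
    three factors, not just the individual embeddings."""
    sr = A[i] * R[j]   # subject-relation intermediate
    ro = R[j] * B[k]   # relation-object intermediate
    so = A[i] * B[k]   # subject-object intermediate
    return (np.sum(sr ** 2) + np.sum(ro ** 2) + np.sum(so ** 2))
```

In a training loop, `ivr_penalty` would be added (weighted) to the ranking loss in place of the plain embedding-norm penalty; the paper's analysis argues that penalizing these intermediates promotes a low trace norm of the full predicted tensor.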

Country of Origin
🇨🇳 China

Repos / Data Links
https://github.com/changyi7231/IVR
Page Count
22 pages

Category
Computer Science:
Machine Learning (CS)