Lightweight Inference-Time Personalization for Frozen Knowledge Graph Embeddings
By: Ozan Oguztuzun, Cerag Oguztuzun
Foundation models for knowledge graphs (KGs) achieve strong cohort-level performance in link prediction, yet fail to capture individual user preferences, revealing a key disconnect between general relational reasoning and personalized ranking. We propose GatedBias, a lightweight inference-time personalization framework that adapts frozen KG embeddings to individual user contexts without retraining or compromising global accuracy. Our approach introduces structure-gated adaptation: profile-specific features combine with graph-derived binary gates to produce interpretable, per-entity biases, requiring only ${\sim}300$ trainable parameters. We evaluate GatedBias on two benchmark datasets (Amazon-Book and Last-FM), demonstrating statistically significant improvements in alignment metrics while preserving cohort performance. Counterfactual perturbation experiments validate causal responsiveness: entities benefiting from specific preference signals show 6--30$\times$ greater rank improvements when those signals are boosted. These results show that personalized adaptation of foundation models can be both parameter-efficient and causally verifiable, bridging general knowledge representations with individual user needs.
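To make the structure-gated adaptation concrete, the following is a minimal sketch, not the authors' implementation: it assumes a dot-product base scorer over frozen embeddings, and all names, shapes, the feature count, and the gate construction are illustrative stand-ins (the paper's actual parameterization, with its ${\sim}300$ trainable parameters, may differ).

```python
import numpy as np

rng = np.random.default_rng(0)
n_entities, dim, n_features = 1000, 64, 16  # illustrative sizes, not from the paper

# Frozen KG embeddings (pretrained, never updated at personalization time).
entity_emb = rng.normal(size=(n_entities, dim))
user_emb = rng.normal(size=dim)

# Graph-derived binary gates: 1 if an entity is structurally eligible for a feature.
gates = rng.integers(0, 2, size=(n_entities, n_features)).astype(float)

# Per-user profile features (preference signals), fixed at inference time.
profile = rng.normal(size=n_features)

# The only trainable parameters in this sketch: one weight per feature plus a bias.
w = np.zeros(n_features)
b = 0.0

def personalized_scores(w, b):
    base = entity_emb @ user_emb          # frozen cohort-level score
    bias = gates @ (w * profile) + b      # structure-gated, per-entity bias
    return base + bias

ranking = np.argsort(-personalized_scores(w, b))  # personalized entity ranking
```

Under this sketch, boosting one component of `profile` and re-ranking mirrors the counterfactual perturbation described above: only entities whose gate for that feature is 1 can change rank, which is what keeps the per-entity bias interpretable and causally checkable.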
Similar Papers
Avoiding Over-Personalization with Rule-Guided Knowledge Graph Adaptation for LLM Recommendations
Information Retrieval
Shows you more interesting things online.
Vectorized Context-Aware Embeddings for GAT-Based Collaborative Filtering
Information Retrieval
Helps apps suggest movies for new users.
KGIF: Optimizing Relation-Aware Recommendations with Knowledge Graph Information Fusion
Machine Learning (CS)
Shows why it suggests things you might like.