Towards Practical Large-scale Dynamical Heterogeneous Graph Embedding: Cold-start Resilient Recommendation
By: Mabiao Long, Jiaxi Liu, Yufeng Li, and more
Deploying dynamic heterogeneous graph embeddings in production faces key challenges in scalability, data freshness, and cold-start handling. This paper introduces a practical two-stage solution that balances deep graph representation with low-latency incremental updates. Our framework combines HetSGFormer, a scalable graph transformer for static learning, with Incremental Locally Linear Embedding (ILLE), a lightweight, CPU-based algorithm for real-time updates. HetSGFormer captures global structure with linear scalability, while ILLE provides rapid, targeted updates that incorporate new data without costly full retraining. This dual approach is cold-start resilient, leveraging graph structure to produce meaningful embeddings from sparse data. On billion-scale graphs, A/B tests show HetSGFormer achieved up to a 6.11% lift in Advertiser Value over previous methods, while the ILLE module added a further 3.22% lift and improved embedding refresh timeliness by 83.2%. Our work provides a validated framework for deploying dynamic graph learning in production environments.
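The abstract does not spell out ILLE's update rule, but the classical incremental Locally Linear Embedding idea it builds on is simple: embed a new node by finding weights that best reconstruct its features from its neighbors, then apply those same weights to the neighbors' existing embeddings. A minimal sketch under that assumption (the function name `ille_embed_new_node` and the `reg` parameter are illustrative, not from the paper):

```python
import numpy as np

def ille_embed_new_node(x_new, nbr_feats, nbr_embs, reg=1e-3):
    """Hypothetical sketch of an incremental LLE-style placement.

    x_new:     (d,)   feature vector of the new node
    nbr_feats: (k, d) features of its k existing neighbors
    nbr_embs:  (k, m) current embeddings of those neighbors
    Returns an (m,) embedding for the new node, with no retraining.
    """
    k = len(nbr_feats)
    # Local Gram matrix of neighbor offsets from the new point.
    Z = nbr_feats - x_new            # (k, d)
    G = Z @ Z.T                      # (k, k)
    # Regularize for stability when k > d or neighbors are collinear.
    tr = np.trace(G)
    G += reg * (tr if tr > 0 else 1.0) * np.eye(k)
    # Solve G w = 1, then normalize so the weights sum to one
    # (the standard LLE reconstruction constraint).
    w = np.linalg.solve(G, np.ones(k))
    w /= w.sum()
    # Transfer the same weighted combination to embedding space.
    return w @ nbr_embs
```

Because the update touches only one node's k neighbors, it is a small dense linear solve that runs cheaply on CPU, which is consistent with the low-latency, retraining-free role the abstract assigns to ILLE.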
Similar Papers
Language Embedding Meets Dynamic Graph: A New Exploration for Neural Architecture Representation Learning
Machine Learning (CS)
Helps computers choose the best parts for speed.
Towards Efficient LLM-aware Heterogeneous Graph Learning
Computation and Language
Makes AI understand complex connections faster.
A Model-agnostic Strategy to Mitigate Embedding Degradation in Personalized Federated Recommendation
Information Retrieval
Keeps your movie picks private and better.