RecMind: LLM-Enhanced Graph Neural Networks for Personalized Consumer Recommendations
By: Chang Xue, Youwei Lu, Chen Yang, and more
Potential Business Impact:
Suggests better things you might like.
Personalization is a core capability across consumer technologies (streaming, shopping, wearables, and voice), yet it remains challenged by sparse interactions, fast content churn, and heterogeneous textual signals. We present RecMind, an LLM-enhanced graph recommender that treats the language model as a preference prior rather than a monolithic ranker. A frozen LLM equipped with lightweight adapters produces text-conditioned user/item embeddings from titles, attributes, and reviews; a LightGCN backbone learns collaborative embeddings from the user-item graph. We align the two views with a symmetric contrastive objective and fuse them via intra-layer gating, allowing language to dominate in cold-start and long-tail regimes and graph structure to stabilize rankings elsewhere. On Yelp and Amazon-Electronics, RecMind attains the best results on all eight reported metrics, with relative improvements of up to +4.53% (Recall@40) and +4.01% (NDCG@40) over strong baselines. Ablations confirm both the necessity of cross-view alignment and the advantage of gating over late fusion and LLM-only variants.
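The abstract names two concrete mechanisms: a symmetric contrastive objective that aligns the text-conditioned and collaborative views, and an intra-layer gate that mixes them. The paper's code is not reproduced here, but the sketch below illustrates what those two pieces typically look like in PyTorch. The class and function names (GatedFusion, symmetric_contrastive_loss), the gate parameterization, and the temperature of 0.2 are illustrative assumptions, not details taken from RecMind.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedFusion(nn.Module):
    # Intra-layer gating (assumed form): a learned sigmoid gate mixes the
    # text-conditioned (LLM-adapter) embedding with the collaborative
    # (LightGCN) embedding per dimension.
    def __init__(self, dim: int):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, e_text: torch.Tensor, e_graph: torch.Tensor) -> torch.Tensor:
        g = torch.sigmoid(self.gate(torch.cat([e_text, e_graph], dim=-1)))
        # g near 1 lets language dominate (cold/long-tail items);
        # g near 0 leans on graph structure for well-observed items.
        return g * e_text + (1.0 - g) * e_graph

def symmetric_contrastive_loss(e_text, e_graph, temperature=0.2):
    # Symmetric InfoNCE-style alignment: each text embedding should retrieve
    # its own graph embedding among the batch, and vice versa.
    z_t = F.normalize(e_text, dim=-1)
    z_g = F.normalize(e_graph, dim=-1)
    logits = z_t @ z_g.t() / temperature          # (B, B) similarity matrix
    targets = torch.arange(z_t.size(0), device=z_t.device)
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

# Toy usage: random tensors stand in for the adapter and LightGCN outputs.
if __name__ == "__main__":
    B, D = 32, 64
    e_text, e_graph = torch.randn(B, D), torch.randn(B, D)
    fuse = GatedFusion(D)
    fused = fuse(e_text, e_graph)                 # fused embedding used for ranking
    loss = symmetric_contrastive_loss(e_text, e_graph)
    print(fused.shape, loss.item())

In this reading, the contrastive loss is what keeps the two embedding spaces comparable, and the per-dimension gate is what lets the model shift weight toward the language view exactly where collaborative signal is sparse.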
Similar Papers
End-to-End Personalization: Unifying Recommender Systems with Large Language Models
Information Retrieval
Suggests movies you'll love, explains why.
Research on Personalized Financial Product Recommendation by Integrating Large Language Models and Graph Neural Networks
Information Retrieval
Suggests best money products for you.
MR.Rec: Synergizing Memory and Reasoning for Personalized Recommendation Assistant with LLMs
Information Retrieval
Helps websites guess what you want to buy.