RecBase: Generative Foundation Model Pretraining for Zero-Shot Recommendation

Published: September 3, 2025 | arXiv ID: 2509.03131v1

By: Sashuai Zhou, Weinan Gan, Qijiong Liu, and more

Potential Business Impact:

Enables a single pretrained model to recommend items across different apps and domains without task-specific retraining.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Recent advances in LLM-based recommendation have shown promise, yet their cross-domain generalization is hindered by a fundamental mismatch between language-centric pretraining and the recommendation task. Existing methods, relying on language-level knowledge, fail to capture dynamic, item-level user interests across domains. To bridge this gap, we propose RecBase, a domain-agnostic foundation model pretrained with a recommendation-oriented objective. RecBase leverages a large-scale, heterogeneous, cross-domain corpus with unified textual representations and feature mappings to enhance cross-domain generalization. To further align item semantics across domains, we introduce a unified item tokenizer that encodes items into hierarchical concept identifiers, enabling structured representation and efficient vocabulary sharing. The model is trained using an autoregressive objective to capture complex item-level sequential patterns. On eight real-world datasets, our 1.5B-parameter model matches or surpasses the performance of LLM baselines up to 7B parameters in zero-shot and cross-domain recommendation tasks.
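The abstract's "hierarchical concept identifiers" can be pictured as a residual-quantization scheme: each level picks the nearest codeword for the item embedding's remaining residual, yielding a coarse-to-fine code that all domains share. The following is a minimal sketch, not the paper's actual tokenizer — the embedding size, number of levels, codebook size, random codebooks, and ID-offsetting scheme are all illustrative assumptions:

```python
import numpy as np

def residual_quantize(vec, codebooks):
    """Map an item embedding to hierarchical concept IDs:
    at each level, pick the nearest codeword, then quantize the residual."""
    ids = []
    residual = vec.astype(float)
    for cb in codebooks:                      # cb: (K, d) codewords for one level
        dists = np.linalg.norm(cb - residual, axis=1)
        k = int(np.argmin(dists))             # nearest codeword at this level
        ids.append(k)
        residual = residual - cb[k]           # pass the residual downward
    return ids

rng = np.random.default_rng(0)
d, levels, K = 8, 3, 16                       # toy sizes (hypothetical)
codebooks = [rng.normal(size=(K, d)) for _ in range(levels)]

# Flatten a user's interaction history into one token sequence for
# autoregressive next-token training; offsetting by level keeps the
# per-level IDs distinct inside a single shared vocabulary.
history = [rng.normal(size=d) for _ in range(4)]
tokens = [lvl * K + cid
          for item in history
          for lvl, cid in enumerate(residual_quantize(item, codebooks))]
print(len(tokens))  # 4 items x 3 levels = 12 tokens
```

Because the codes are shared across domains, an autoregressive model trained on such sequences can score next-item candidates in a domain it never saw, which is the zero-shot setting the paper evaluates.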

Country of Origin
🇨🇳 China

Repos / Data Links

Page Count
13 pages

Category
Computer Science:
Information Retrieval