EncodeRec: An Embedding Backbone for Recommendation Systems

Published: January 15, 2026 | arXiv ID: 2601.10837v1

By: Guy Hadad, Neomi Rabaev, Bracha Shapira

Potential Business Impact:

Improves the relevance of online shopping recommendations by adapting language-model embeddings of item descriptions to the recommendation task, without the cost of fine-tuning the language model itself.

Business Areas:
Semantic Search, Internet Services

Recent recommender systems increasingly leverage embeddings from large pre-trained language models (PLMs). However, such embeddings exhibit two key limitations: (1) PLMs are not explicitly optimized to produce structured and discriminative embedding spaces, and (2) their representations remain overly generic, often failing to capture the domain-specific semantics crucial for recommendation tasks. We present EncodeRec, an approach designed to align textual representations with recommendation objectives while learning compact, informative embeddings directly from item descriptions. EncodeRec keeps the language model parameters frozen during recommender system training, making it computationally efficient without sacrificing semantic fidelity. Experiments across core recommendation benchmarks demonstrate its effectiveness both as a backbone for sequential recommendation models and for semantic ID tokenization, showing substantial gains over PLM-based and embedding model baselines. These results underscore the pivotal role of embedding adaptation in bridging the gap between general-purpose language models and practical recommender systems.
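The abstract describes a frozen-backbone setup: PLM embeddings of item descriptions stay fixed while a compact, recommendation-specific representation is trained on top. The paper's own code is not shown here; the following is a minimal NumPy sketch of that general idea, with all names, dimensions, and the mean-pooling scoring head being illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for frozen PLM embeddings of item descriptions:
# shape (num_items, plm_dim); never updated during recommender training.
num_items, plm_dim, rec_dim = 100, 64, 16
plm_embeddings = rng.normal(size=(num_items, plm_dim))

# Trainable adapter: projects generic PLM embeddings into a compact,
# recommendation-specific space. Only this matrix would receive gradients.
W = rng.normal(scale=0.1, size=(plm_dim, rec_dim))

def item_embed(item_ids):
    """Compact item embeddings: frozen PLM output times trainable projection."""
    return plm_embeddings[item_ids] @ W

def score_next_item(history_ids):
    """Score every catalog item as the next item, using a dot product
    with the mean-pooled embedding of the interaction history
    (a deliberately simple sequential-recommendation head)."""
    user_vec = item_embed(history_ids).mean(axis=0)
    return item_embed(np.arange(num_items)) @ user_vec

scores = score_next_item(np.array([3, 17, 42]))
top5 = np.argsort(scores)[::-1][:5]  # highest-scoring candidate items
```

Because the PLM embeddings are frozen, only the small adapter is optimized, which is the source of the computational efficiency the abstract claims.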

Country of Origin
🇮🇱 Israel

Page Count
4 pages

Category
Computer Science:
Computation and Language