
Leveraging Historical and Current Interests for Continual Sequential Recommendation

Published: June 9, 2025 | arXiv ID: 2506.07466v1

By: Gyuseok Lee, Hyunsik Yoo, Junyoung Hwang, and more

Potential Business Impact:

Keeps online shopping recommendations accurate as user interests shift over time, without costly full retraining.

Business Areas:
Semantic Search, Internet Services

Sequential recommendation models based on the Transformer architecture excel at capturing long-range dependencies in user behavior via self-attention. However, naively updating them on continuously arriving, non-stationary data streams incurs prohibitive computational costs or leads to catastrophic forgetting. To address this, we propose the Continual Sequential Transformer for Recommendation (CSTRec), which leverages well-preserved historical user interests while capturing current interests. At its core is Continual Sequential Attention (CSA), a linear attention mechanism that retains past knowledge without direct access to old data. CSA integrates two key components: (1) Cauchy-Schwarz Normalization, which stabilizes training under uneven interaction frequencies, and (2) Collaborative Interest Enrichment, which mitigates forgetting through shared, learnable interest pools. We further introduce a technique that facilitates learning for cold-start users by transferring historical knowledge from behaviorally similar existing users. Extensive experiments on three real-world datasets show that CSTRec outperforms state-of-the-art baselines in both knowledge retention and acquisition.
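
The abstract only names CSA's ingredients, so the sketch below is a minimal, hypothetical PyTorch illustration of the general shape such a layer could take: linear attention (a fixed-size key-value summary instead of a full attention matrix) combined with a shared, learnable interest pool. The class name, feature map, pooling step, and all hyperparameters are assumptions for illustration, not the paper's actual CSTRec implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LinearAttentionSketch(nn.Module):
    """Illustrative linear-attention layer with a shared, learnable interest pool.

    A hypothetical sketch of the general ideas the abstract names
    (linear attention plus shared interest pools); it is NOT the paper's
    CSA formulation. The feature map, pooling step, and all names here
    are assumptions made for illustration.
    """

    def __init__(self, dim: int, num_interests: int = 32):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        # Shared learnable "interest pool": parameters persist across data
        # streams, so past knowledge can be carried forward without
        # replaying old interactions.
        self.interest_pool = nn.Parameter(torch.randn(num_interests, dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim) sequence of item embeddings
        q = F.elu(self.q_proj(x)) + 1.0  # positive feature map, standard for linear attention
        k = F.elu(self.k_proj(x)) + 1.0
        v = self.v_proj(x)
        # Linear attention: summarize keys/values into a fixed (dim x dim)
        # matrix instead of forming the full seq_len x seq_len score matrix.
        kv = torch.einsum("bnd,bne->bde", k, v)               # (batch, dim, dim)
        z = 1.0 / (torch.einsum("bnd,bd->bn", q, k.sum(dim=1)) + 1e-6)
        out = torch.einsum("bnd,bde,bn->bne", q, kv, z)       # (batch, seq_len, dim)
        # One plausible reading of "Collaborative Interest Enrichment":
        # let each position attend over the shared pool and mix it back in.
        pool_scores = out @ self.interest_pool.t() / (out.size(-1) ** 0.5)
        return out + F.softmax(pool_scores, dim=-1) @ self.interest_pool


# Toy usage: 2 users, 50 interactions each, 64-dim embeddings.
layer = LinearAttentionSketch(dim=64)
print(layer(torch.randn(2, 50, 64)).shape)  # torch.Size([2, 50, 64])
```

Because linear attention compresses past keys and values into a fixed-size summary, the layer's state does not grow with the stream, which is one reason such mechanisms suit continual updating. The paper's Cauchy-Schwarz Normalization and cold-start knowledge transfer are not reproduced here.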

Country of Origin
🇺🇸 🇰🇷 United States, Republic of Korea

Page Count
11 pages

Category
Computer Science: Information Retrieval