Enhancing Recommender Systems Using Textual Embeddings from Pre-trained Language Models

Published: March 24, 2025 | arXiv ID: 2504.08746v1

By: Ngoc Luyen Le, Marie-Hélène Abel

Potential Business Impact:

Makes recommendations, such as movie suggestions, better reflect what a user actually likes.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Recent advancements in pre-trained language models (PLMs) such as BERT and RoBERTa have revolutionized natural language processing, enabling a deeper understanding of human-like language. In this paper, we explore enhancing recommender systems using textual embeddings from PLMs to address the limitations of traditional recommender systems that rely solely on explicit features from users, items, and user-item interactions. By transforming structured data into natural language representations, we generate high-dimensional embeddings that capture deeper semantic relationships between users, items, and contexts. Our experiments demonstrate that this approach significantly improves recommendation accuracy and relevance, resulting in more personalized and context-aware recommendations. The findings underscore the potential of PLMs to enhance the effectiveness of recommender systems.
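To make the described pipeline concrete, here is a minimal sketch of the general idea: structured user and item records are verbalized into natural language, embedded with a PLM, and items are ranked by embedding similarity. The sentence-transformers library, the all-MiniLM-L6-v2 checkpoint, and the serialization templates below are stand-in assumptions, not the paper's actual choices.

```python
# Sketch: verbalize structured records, embed with a PLM, rank by similarity.
# Assumptions (not from the paper): sentence-transformers and all-MiniLM-L6-v2
# stand in for whichever pre-trained language model the authors used.
from sentence_transformers import SentenceTransformer

def verbalize_user(user: dict) -> str:
    # Turn structured user attributes into a natural-language description.
    return (f"A {user['age']}-year-old {user['occupation']} who enjoys "
            f"{', '.join(user['liked_genres'])} movies.")

def verbalize_item(item: dict) -> str:
    # Turn structured item attributes into a natural-language description.
    return f"{item['title']} ({item['year']}), a {item['genre']} film: {item['plot']}"

user = {"age": 29, "occupation": "teacher", "liked_genres": ["sci-fi", "thriller"]}
items = [
    {"title": "Arrival", "year": 2016, "genre": "sci-fi",
     "plot": "A linguist deciphers an alien language."},
    {"title": "The Notebook", "year": 2004, "genre": "romance",
     "plot": "A couple's love story spans decades."},
]

model = SentenceTransformer("all-MiniLM-L6-v2")
user_emb = model.encode(verbalize_user(user), normalize_embeddings=True)
item_embs = model.encode([verbalize_item(i) for i in items],
                         normalize_embeddings=True)

# Cosine similarity reduces to a dot product on normalized embeddings.
scores = item_embs @ user_emb
for item, score in sorted(zip(items, scores), key=lambda x: -x[1]):
    print(f"{item['title']}: {score:.3f}")
```

Because the verbalized text can include contextual attributes (time, device, mood, and so on), the same embedding step yields the context-aware representations the abstract refers to, without hand-engineering interaction features.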

Page Count
10 pages

Category
Computer Science:
Information Retrieval