Learning in Context: Personalizing Educational Content with Large Language Models to Enhance Student Learning

Published: September 18, 2025 | arXiv ID: 2509.15068v1

By: Joy Jia Yin Lim, Daniel Zhang-Li, Jifan Yu, and more

Potential Business Impact:

Tailors lesson content to each student's background and interests, improving engagement and learning outcomes.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Standardized, one-size-fits-all educational content often fails to connect with students' individual backgrounds and interests, leading to disengagement and a perceived lack of relevance. To address this challenge, we introduce PAGE, a novel framework that leverages large language models (LLMs) to automatically personalize educational materials by adapting them to each student's unique context, such as their major and personal interests. To validate our approach, we deployed PAGE in a semester-long intelligent tutoring system and conducted a user study to evaluate its impact in an authentic educational setting. Our findings show that students who received personalized content demonstrated significantly improved learning outcomes and reported higher levels of engagement, perceived relevance, and trust compared to those who used standardized materials. This work demonstrates the practical value of LLM-powered personalization and offers key design implications for creating more effective, engaging, and trustworthy educational experiences.
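The paper does not include implementation details in this summary, but the core idea of LLM-driven personalization can be illustrated with a minimal sketch. The snippet below is not the authors' PAGE framework; the names `StudentProfile`, `build_prompt`, and `personalize_passage` are hypothetical, and the LLM call is left as a generic callable so the sketch stays model-agnostic and runnable.

```python
# Hypothetical sketch: adapting a lesson passage to a student's context with an LLM.
# This is NOT the PAGE implementation; all names and prompts are illustrative.

from dataclasses import dataclass
from typing import Callable


@dataclass
class StudentProfile:
    major: str              # e.g. "mechanical engineering"
    interests: list[str]    # e.g. ["basketball", "indie games"]


def build_prompt(passage: str, profile: StudentProfile) -> str:
    """Ask the model to swap generic examples/analogies for ones matching the
    student's major and interests, while keeping concepts and objectives intact."""
    interests = ", ".join(profile.interests)
    return (
        f"Rewrite the following course passage for a student majoring in "
        f"{profile.major} whose interests include {interests}. "
        "Keep every concept, definition, and learning objective unchanged; "
        "only replace generic examples and analogies with ones drawn from "
        "the student's major and interests.\n\n"
        f"Passage:\n{passage}"
    )


def personalize_passage(
    passage: str,
    profile: StudentProfile,
    generate: Callable[[str], str],
) -> str:
    """`generate` is any text-generation callable (an LLM behind an API, a local
    model, or a stub in tests), so no specific provider SDK is assumed."""
    return generate(build_prompt(passage, profile))


if __name__ == "__main__":
    profile = StudentProfile(major="economics", interests=["soccer"])
    # Stub generator so the sketch runs without any LLM credentials.
    stub = lambda prompt: f"[model output for a prompt of {len(prompt)} characters]"
    print(personalize_passage("Supply and demand determine prices.", profile, stub))
```

In practice, the personalized output would also need a check that the adapted passage still covers the original learning objectives before it is shown to the student.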

Country of Origin
🇨🇳 China

Page Count
19 pages

Category
Computer Science: Human-Computer Interaction