Learning in Context: Personalizing Educational Content with Large Language Models to Enhance Student Learning
By: Joy Jia Yin Lim, Daniel Zhang-Li, Jifan Yu, and more
Potential Business Impact:
Makes learning more engaging by tailoring lessons to each student.
Standardized, one-size-fits-all educational content often fails to connect with students' individual backgrounds and interests, leading to disengagement and a perceived lack of relevance. To address this challenge, we introduce PAGE, a novel framework that leverages large language models (LLMs) to automatically personalize educational materials by adapting them to each student's unique context, such as their major and personal interests. To validate our approach, we deployed PAGE in a semester-long intelligent tutoring system and conducted a user study to evaluate its impact in an authentic educational setting. Our findings show that students who received personalized content demonstrated significantly improved learning outcomes and reported higher levels of engagement, perceived relevance, and trust compared to those who used standardized materials. This work demonstrates the practical value of LLM-powered personalization and offers key design implications for creating more effective, engaging, and trustworthy educational experiences.
Similar Papers
Cultivating Helpful, Personalized, and Creative AI Tutors: A Framework for Pedagogical Alignment using Reinforcement Learning
Machine Learning (CS)
Teaches AI to be a better, personalized tutor.
LLM-Driven Personalized Answer Generation and Evaluation
Computers and Society
Helps online students get answers just for them.
Personalized and Constructive Feedback for Computer Science Students Using the Large Language Model (LLM)
Computers and Society
Gives students personalized feedback to learn better.