Investigating AI in Peer Support via Multi-Module System-Driven Embodied Conversational Agents
By: Ruoyu Wen, Xiaoli Wu, Kunal Gupta, and more
Potential Business Impact:
Helps AI understand feelings to provide better mental-health support.
Young people's mental well-being is a global concern, and peer support plays a key role in their daily emotional regulation. Conversational agents are increasingly viewed as promising tools for delivering accessible, personalised peer support, particularly where professional counselling is scarce. However, existing systems often suffer from rigid input formats, scripted responses, and limited emotional sensitivity. The emergence of large language models introduces new possibilities for generating flexible, context-aware, and empathetic responses. To explore how individuals with psychological training perceive such systems in peer support contexts, we developed an LLM-based multi-module system that drives embodied conversational agents informed by Cognitive Behavioral Therapy (CBT). In a user study (N=10), we qualitatively examined participants' perceptions, focusing on trust, response quality, workflow integration, and design opportunities for future mental well-being support systems.
Similar Papers
Decoding Student Minds: Leveraging Conversational Agents for Psychological and Learning Analysis
Computation and Language
Helps students learn better by understanding feelings.
"Is This Really a Human Peer Supporter?": Misalignments Between Peer Supporters and Experts in LLM-Supported Interactions
Human-Computer Interaction
AI helps train better mental health helpers.
Customizing Emotional Support: How Do Individuals Construct and Interact With LLM-Powered Chatbots
Human-Computer Interaction
Builds AI friends for emotional support.