Concert Interaction Translation: Augmenting VR Live Concert Experience using Chat-Driven Artificial Collective Reactions
By: Sebin Lee, Yeonho Cho, Jungjin Lee
Potential Business Impact:
Makes VR concerts feel more alive with real fan reactions.
Computer-mediated concerts can be enjoyed on a variety of devices, from desktop and mobile to VR headsets, often supporting multiple device types simultaneously. However, because VR devices remain relatively inaccessible, only small audiences tend to gather in VR venues, diminishing the unique social experience they offer. To address this gap and enrich VR concert experiences, we present a novel approach that leverages interaction data from non-VR users, specifically chat messages from audiences watching the same content on a live-streaming platform. Based on an analysis of audience reactions at offline concerts, we designed and prototyped a concert interaction translation system that extracts levels of engagement and emotion from chat messages and translates them into collective movements, cheers, and singalongs performed by virtual audience avatars in a VR venue. Our user study (n=48) demonstrates that our system, which combines movement and audio reactions, significantly enhances the sense of immersion and co-presence compared with the previous method.
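The translation pipeline described above can be illustrated with a minimal sketch. The paper does not disclose its extraction method, so the keyword lexicon, the engagement threshold, and the reaction labels below are all hypothetical placeholders, not the authors' implementation; they only show the shape of mapping a window of chat messages to one collective avatar reaction.

```python
from collections import Counter
from typing import Optional

# Hypothetical keyword lexicon (illustrative only; not from the paper).
EMOTION_KEYWORDS = {
    "excited": {"omg", "wow", "hype", "amazing"},
    "singing": {"sing", "lyrics", "chorus"},
    "cheering": {"cheer", "clap", "bravo", "encore"},
}

# Hypothetical mapping from a dominant emotion to a collective reaction.
REACTION_MAP = {
    "excited": "jump",
    "singing": "singalong",
    "cheering": "cheer",
}


def classify_chat(message: str) -> Optional[str]:
    """Return the first emotion label whose keywords appear in the message."""
    tokens = message.lower().split()
    for label, keywords in EMOTION_KEYWORDS.items():
        if any(token in keywords for token in tokens):
            return label
    return None


def translate_window(messages: list, threshold: float = 0.3) -> str:
    """Aggregate one time window of chat into a collective avatar reaction.

    Engagement level = fraction of messages matching any emotion keyword;
    a reaction fires only when engagement exceeds the threshold, otherwise
    the virtual audience stays idle.
    """
    labels = [label for m in messages if (label := classify_chat(m))]
    engagement = len(labels) / max(len(messages), 1)
    if engagement < threshold:
        return "idle"
    dominant, _ = Counter(labels).most_common(1)[0]
    return REACTION_MAP[dominant]
```

For example, a window such as `["omg this is amazing", "wow", "hello"]` would cross the engagement threshold with "excited" as the dominant emotion and trigger the "jump" reaction, while a window of unmatched messages leaves the avatars idle.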
Similar Papers
Audience Amplified: Virtual Audiences in Asynchronously Performed AR Theater
Human-Computer Interaction
Makes solo AR shows feel like a live concert.
ChatAR: Conversation Support using Large Language Model and Augmented Reality
Human-Computer Interaction
Helps you talk better by showing hidden info.
Practicing a Second Language Without Fear: Mixed Reality Agents for Interactive Group Conversation
Human-Computer Interaction
Lets you practice talking in groups safely.