Tell-XR: Conversational End-User Development of XR Automations
By: Alessandro Carcangiu, Marco Manca, Jacopo Mereu, and more
Potential Business Impact:
Lets anyone build interactive virtual worlds just by talking.
The availability of extended reality (XR) devices has widened their adoption, yet authoring interactive experiences remains complex for non-programmers. We introduce Tell-XR, an intelligent agent leveraging large language models (LLMs) to guide end-users in defining interactions in XR settings through automations described as Event-Condition-Action (ECA) rules. Through a formative study, we identified the key conversation stages for defining and refining automations, which informed the design of the system architecture. An evaluation study in two scenarios (a VR museum and an AR smart home) demonstrates the effectiveness of Tell-XR across different XR interaction settings.
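To make the ECA rule concept concrete, here is a minimal Python sketch of how such an automation could be represented. The class names, fields, and the smart-home example are hypothetical illustrations of the general Event-Condition-Action pattern, not Tell-XR's actual schema or API.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ECARule:
    """Illustrative Event-Condition-Action rule (hypothetical, not Tell-XR's schema)."""
    event: str                          # trigger name, e.g. "user_enters_room"
    condition: Callable[[dict], bool]   # predicate over the current XR state
    action: Callable[[dict], None]      # effect applied when the condition holds

    def fire(self, event: str, state: dict) -> bool:
        """Run the action if the event matches and the condition holds."""
        if event == self.event and self.condition(state):
            self.action(state)
            return True
        return False

# Example inspired by the AR smart-home scenario: when the user enters
# the room, if the lights are off, turn them on.
rule = ECARule(
    event="user_enters_room",
    condition=lambda s: not s["lights_on"],
    action=lambda s: s.update(lights_on=True),
)

state = {"lights_on": False}
fired = rule.fire("user_enters_room", state)
print(fired, state["lights_on"])  # True True
```

In Tell-XR, the conversational agent would elicit the event, condition, and action parts from the user's natural-language description rather than having them written by hand.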
Similar Papers
Evaluating Social Acceptance of eXtended Reality (XR) Agent Technology: A User Study (Extended Version)
Human-Computer Interaction
Teaches journalists safely using virtual people.
Exploring User Acceptance and Concerns toward LLM-powered Conversational Agents in Immersive Extended Reality
Human-Computer Interaction
Makes virtual worlds safer for talking.
XR Blocks: Accelerating Human-centered AI + XR Innovation
Human-Computer Interaction
Builds AI virtual worlds faster.