Human-Machine Ritual: Synergic Performance through Real-Time Motion Recognition
By: Zhuodi Cai, Ziyu Xu, Juan Pampin
Potential Business Impact:
Lets dancers control music with their movements.
We introduce a lightweight, real-time motion recognition system that enables synergic human-machine performance through wearable IMU sensor data, MiniRocket time-series classification, and responsive multimedia control. By mapping dancer-specific movement to sound through somatic memory and association, we propose an alternative approach to human-machine collaboration, one that preserves the expressive depth of the performing body while leveraging machine learning for attentive observation and responsiveness. We demonstrate that this human-centered design reliably supports high-accuracy classification at low latency (<50 ms), offering a replicable framework for integrating dance-literate machines into creative, educational, and live performance contexts.
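The pipeline the abstract describes (windowed IMU data, MiniRocket-style features, a lightweight classifier, a latency budget) can be sketched in miniature. The sketch below is not the authors' implementation: the window size, kernel count, gesture names, and the nearest-centroid classifier are illustrative assumptions. It borrows MiniRocket's core ideas, fixed length-9 convolution kernels with weights from {-1, 2} applied at varying dilations, pooled with the proportion-of-positive-values (PPV) statistic, and runs on synthetic single-axis sensor windows.

```python
# Hypothetical sketch of a MiniRocket-style real-time gesture classifier.
# Window length, kernel count, and gesture classes are assumptions for
# illustration, not values from the paper.
import math
import random
import time

random.seed(0)
WIN = 128        # samples per sensor window (assumed)
N_KERNELS = 64   # MiniRocket proper uses 84 fixed kernels x many dilations

def make_kernel():
    # MiniRocket kernels have length 9 with weights from {-1, 2}
    # (six -1s, three 2s); dilation stretches the kernel over the window.
    w = [-1.0] * 6 + [2.0] * 3
    random.shuffle(w)
    dilation = random.choice([1, 2, 4, 8])
    return w, dilation

KERNELS = [make_kernel() for _ in range(N_KERNELS)]

def ppv_features(x):
    # For each kernel, convolve over the window and record the proportion
    # of positive outputs (PPV) -- MiniRocket's single pooling statistic.
    feats = []
    for w, d in KERNELS:
        span = (len(w) - 1) * d
        pos, total = 0, 0
        for start in range(len(x) - span):
            s = sum(wj * x[start + j * d] for j, wj in enumerate(w))
            pos += s > 0
            total += 1
        feats.append(pos / total)
    return feats

def gesture(kind):
    # Synthetic 1-axis IMU stream: a slow sway vs. sharp periodic impulses.
    if kind == "sway":
        return [math.sin(2 * math.pi * t / 32) for t in range(WIN)]
    return [1.0 if t % 16 == 0 else random.gauss(0, 0.1) for t in range(WIN)]

# "Train" a nearest-centroid classifier on a few example windows per class.
centroids = {}
for kind in ("sway", "stomp"):
    rows = [ppv_features(gesture(kind)) for _ in range(5)]
    centroids[kind] = [sum(col) / len(rows) for col in zip(*rows)]

def classify(x):
    f = ppv_features(x)
    return min(centroids,
               key=lambda k: sum((a - b) ** 2
                                 for a, b in zip(f, centroids[k])))

t0 = time.perf_counter()
pred = classify(gesture("sway"))
latency_ms = (time.perf_counter() - t0) * 1000
print(pred, f"{latency_ms:.1f} ms")
```

In a live setting each incoming sensor window would be featurized and classified the same way, with the prediction driving sound or multimedia cues; MiniRocket's fixed kernels and cheap PPV pooling are what make the sub-50 ms budget plausible on modest hardware.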
Similar Papers
Towards Immersive Human-X Interaction: A Real-Time Framework for Physically Plausible Motion Synthesis
CV and Pattern Recognition
Makes robots move and react like real people.
Biomechanically consistent real-time action recognition for human-robot interaction
Robotics
Helps robots understand what people are doing.
Coordinated Motion Planning of a Wearable Multi-Limb System for Enhanced Human-Robot Interaction
Robotics
Robotic arms help people move without tiring their muscles.