Inferring Operator Emotions from a Motion-Controlled Robotic Arm
By: Xinyu Qi, Zeyu Deng, Shaun Alexander Macdonald and more
Potential Business Impact:
Robot movements show how the operator feels.
A remote robot operator's affective state can significantly influence the resulting robot's motions, leading to unexpected consequences even when the user follows protocol and performs only permitted tasks. The recognition of an operator's affective state in remote robot control scenarios is, however, underexplored. Current emotion recognition methods rely on reading the user's vital signs or body language, but the devices and user participation these measures require would constrain remote robot control. We demonstrate that the functional movements of a remote-controlled robotic avatar, which was not designed for emotional expression, can be used to infer the emotional state of the human operator via a machine-learning system. Specifically, our system achieved 83.3% accuracy in recognizing the user's emotional state as expressed through robot movements driven by the operator's hand motions. We discuss the implications of this system for prominent current and future remote robot operation and affective robotics contexts.
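The abstract does not describe the feature set or model used, but a minimal sketch of this kind of pipeline might extract kinematic statistics from the robot's end-effector trajectories and train an off-the-shelf classifier. The function names, feature choices, and RandomForest model below are illustrative assumptions, not the authors' published method.

```python
# Hypothetical sketch: infer an operator's emotional state from robot-arm
# end-effector trajectories. Feature choices and the classifier are
# illustrative assumptions, not the pipeline reported in the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def kinematic_features(trajectory, dt=0.01):
    """Summarise one (T, 3) end-effector trajectory with simple
    velocity/acceleration/jerk magnitude statistics."""
    vel = np.diff(trajectory, axis=0) / dt
    acc = np.diff(vel, axis=0) / dt
    jerk = np.diff(acc, axis=0) / dt
    feats = []
    for signal in (vel, acc, jerk):
        mag = np.linalg.norm(signal, axis=1)
        feats.extend([mag.mean(), mag.std(), mag.max()])
    return np.array(feats)

def evaluate(trajectories, labels):
    """trajectories: list of (T, 3) arrays; labels: one emotion class each.
    Returns mean 5-fold cross-validated accuracy."""
    X = np.stack([kinematic_features(t) for t in trajectories])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    return cross_val_score(clf, X, labels, cv=5).mean()
```

Any comparable classifier over motion-derived features would fit the same role; the point of the sketch is only that emotion inference can operate on the robot's functional motion data alone, without physiological sensors on the operator.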
Similar Papers
Generation of Real-time Robotic Emotional Expressions Learning from Human Demonstration in Mixed Reality
Robotics
Robots show feelings like humans do.
Awakening Facial Emotional Expressions in Human-Robot
Robotics
Robots learn to make human-like faces.
EmoACT: a Framework to Embed Emotions into Artificial Agents Based on Affect Control Theory
Robotics
Robots show feelings to act more human.