Real-Time Detection of Robot Failures Using Gaze Dynamics in Collaborative Tasks
By: Ramtin Tabatabaei, Vassilis Kostakos, Wafa Johal
Potential Business Impact:
Watches your eyes to spot robot mistakes.
Detecting robot failures during collaborative tasks is crucial for maintaining trust in human-robot interaction. This study investigates user gaze behaviour as an indicator of robot failures, using machine learning models to distinguish non-failure episodes from two types of failures: executional and decisional. Eye-tracking data were collected from 26 participants collaborating with a robot on Tangram puzzle-solving tasks. Gaze metrics, such as average gaze shift rates and the probability of gazing at specific areas of interest, were used to train machine learning classifiers, including Random Forest, AdaBoost, XGBoost, SVM, and CatBoost. Real-time failure detection was evaluated by segmenting gaze data into windows of 3, 5, and 10 seconds. Random Forest achieved 90% accuracy for detecting executional failures and 80% for decisional failures using the first 5 seconds of failure data. These findings highlight the potential of gaze dynamics for real-time error detection in human-robot collaboration.
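The abstract describes a windowed classification setup: aggregate gaze metrics computed over short time windows feed a classifier that separates non-failure windows from executional and decisional failures. Below is a minimal sketch of that setup in Python with scikit-learn, assuming a per-window feature table; the file name `gaze_windows_5s.csv`, the feature column names, and the `participant_id` grouping are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch of windowed gaze-feature classification (not the paper's exact pipeline).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, cross_val_score

WINDOW_SECONDS = 5  # the paper also evaluates 3 s and 10 s windows

# Hypothetical per-window feature table: one row per gaze window, with
# aggregate gaze metrics and a label in {"non_failure", "executional", "decisional"}.
df = pd.read_csv("gaze_windows_5s.csv")  # assumed file name and layout

feature_cols = [
    "gaze_shift_rate",   # average gaze shifts per second within the window
    "p_gaze_robot",      # probability of gazing at the robot AOI
    "p_gaze_workspace",  # probability of gazing at the task/workspace AOI
    "p_gaze_elsewhere",  # probability of gazing outside the defined AOIs
]
X = df[feature_cols].values
y = df["label"].values
groups = df["participant_id"].values  # keep each participant's windows in one fold

clf = RandomForestClassifier(n_estimators=200, random_state=0)

# Participant-wise cross-validation avoids leaking one person's gaze style
# into both the training and test splits.
cv = GroupKFold(n_splits=5)
scores = cross_val_score(clf, X, y, cv=cv, groups=groups, scoring="accuracy")
print(f"Mean accuracy over folds: {scores.mean():.2f}")
```

For a streaming deployment, the same classifier could be applied to each incoming window as it closes, trading detection latency (3 s vs. 10 s windows) against the amount of gaze evidence available per prediction.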
Similar Papers
Adapting Robot's Explanation for Failures Based on Observed Human Behavior in Human-Robot Collaboration
Robotics
Robots learn to explain mistakes better to people.
Using Physiological Measures, Gaze, and Facial Expressions to Model Human Trust in a Robot Partner
Robotics
Helps robots know when people trust them.
Impact of Gaze-Based Interaction and Augmentation on Human-Robot Collaboration in Critical Tasks
Robotics
Helps robots find people faster using eye movements.