MIHRaGe: A Mixed-Reality Interface for Human-Robot Interaction via Gaze-Oriented Control
By: Rafael R. Baptista, Nina R. Gerszberg, Ricardo V. Godoy, and others
Potential Business Impact:
Lets people control robots with their eyes.
Individuals with upper limb mobility impairments often require assistive technologies to perform activities of daily living. While gaze tracking has emerged as a promising method for robotic assistance, existing solutions lack sufficient feedback mechanisms, leading to uncertainty in user intent recognition and reduced adaptability. This paper presents the MIHRaGe interface, an integrated system that combines gaze tracking, robotic assistance, and mixed reality to create an immersive environment for controlling a robot using only eye movements. The system was evaluated through an experimental protocol with four participants, assessing gaze accuracy, robotic positioning precision, and overall success on a pick-and-place task. Results showed an average gaze fixation error of 1.46 cm, with individual variation ranging from 1.28 cm to 2.14 cm. The robotic arm demonstrated an average positioning error of ±1.53 cm, with discrepancies attributed to interface resolution and calibration constraints. In the pick-and-place task, the system achieved an 80% success rate, highlighting its potential for improving accessibility in human-robot interaction by providing visual feedback to the user.
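As a rough sketch of what a "gaze fixation error" metric like the one reported above could look like, the snippet below computes the mean Euclidean distance between gaze fixation points and their intended targets. The paper's actual measurement pipeline is not described here, so the function and the coordinate values are illustrative assumptions only.

```python
import math

def fixation_error_cm(gaze_points, targets):
    """Mean Euclidean distance (cm) between gaze fixations and their targets.

    gaze_points, targets: equal-length lists of (x, y) coordinates in cm.
    This is a hypothetical helper, not the paper's actual metric code.
    """
    errors = [math.dist(g, t) for g, t in zip(gaze_points, targets)]
    return sum(errors) / len(errors)

# Illustrative (made-up) fixations around a single target at the origin.
gaze = [(1.0, 1.0), (0.0, 2.0), (1.5, 0.0)]
targets = [(0.0, 0.0)] * 3
print(round(fixation_error_cm(gaze, targets), 2))  # mean error in cm
```

A per-participant average of this quantity would give figures comparable to the 1.28 cm to 2.14 cm range reported in the abstract.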
Similar Papers
RaycastGrasp: Eye-Gaze Interaction with Wearable Devices for Robotic Manipulation
Robotics
Lets robots grab things by looking at them.
HAGI++: Head-Assisted Gaze Imputation and Generation
Human-Computer Interaction
Fixes broken eye-tracking data using head movement.
MIRAGE: Multimodal Intention Recognition and Admittance-Guided Enhancement in VR-based Multi-object Teleoperation
Robotics
Helps robots grab things better in virtual worlds.