Vision-Controlled Orthotic Hand Exoskeleton
By: Connor Blais, Md Abdul Baset Sarker, Masudul H. Imtiaz
Potential Business Impact:
Helps people with weak hands grip and move objects.
This paper presents the design and implementation of an AI vision-controlled orthotic hand exoskeleton to enhance rehabilitation and assistive functionality for individuals with hand mobility impairments. The system leverages a Google Coral Dev Board Micro with an Edge TPU to enable real-time object detection using a customized MobileNet_V2 model trained on a six-class dataset. The exoskeleton autonomously detects objects, estimates proximity, and triggers pneumatic actuation for grasp-and-release tasks, eliminating the user-specific calibration required by traditional EMG-based systems. The design prioritizes compactness, featuring an internal 1300 mAh battery that provides an 8-hour runtime. Experimental results demonstrate a 51 ms inference time, a significant improvement over prior iterations, though challenges persist in model robustness under varying lighting conditions and object orientations. While the most recent YOLO model (YOLOv11) showed potential with 15.4 FPS performance, quantization issues hindered its deployment on the Edge TPU. The prototype underscores the viability of vision-controlled exoskeletons for real-world assistive applications, balancing portability, efficiency, and real-time responsiveness, while highlighting future directions for model optimization and hardware miniaturization.
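To make the described pipeline concrete, below is a minimal sketch of the detect-then-actuate loop using the pycoral Python API. The paper's prototype runs on the Coral Dev Board Micro (programmed with the coralmicro C++ SDK), so this is an illustrative host-side analogue rather than the authors' firmware; the model filename, class labels, proximity threshold, and `trigger_grasp()` are assumptions for illustration, and the bounding-box-area proximity heuristic is one common monocular approach, not necessarily the paper's exact method.

```python
# Sketch: object detection -> proximity estimate -> pneumatic grasp trigger.
# Assumes a pycoral-compatible host with an Edge TPU; names marked "assumed"
# are placeholders, not the authors' implementation.
from pycoral.utils.edgetpu import make_interpreter
from pycoral.adapters import common, detect
from PIL import Image

GRASP_CLASSES = {0, 1, 2, 3, 4, 5}   # six-class dataset (label IDs assumed)
PROXIMITY_AREA = 0.25                # bbox covers >= 25% of frame (threshold assumed)

def trigger_grasp():
    """Hypothetical stand-in for driving the pneumatic valve (e.g., via GPIO)."""
    print("Pneumatic grasp triggered")

interpreter = make_interpreter("mobilenet_v2_edgetpu.tflite")  # filename assumed
interpreter.allocate_tensors()
width, height = common.input_size(interpreter)

# One camera frame; in the real system this would run continuously.
frame = Image.open("frame.jpg").convert("RGB").resize((width, height))
common.set_input(interpreter, frame)
interpreter.invoke()

for obj in detect.get_objects(interpreter, score_threshold=0.5):
    bbox = obj.bbox  # coordinates in model input space
    area_frac = (bbox.xmax - bbox.xmin) * (bbox.ymax - bbox.ymin) / (width * height)
    # Monocular proximity heuristic: a large bounding box implies a near object.
    if obj.id in GRASP_CLASSES and area_frac >= PROXIMITY_AREA:
        trigger_grasp()
        break
```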
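The YOLOv11 deployment difficulty noted in the abstract relates to the Edge TPU compiler's requirement for full 8-bit integer quantization. The sketch below shows a generic TensorFlow Lite post-training quantization recipe of the kind such a workflow would use; the saved-model path, input shape, and calibration generator are placeholders, and this is not the authors' exact export pipeline.

```python
# Generic full-integer post-training quantization for Edge TPU deployment.
# Detection heads with ops the Edge TPU compiler does not support can still
# fall back to the CPU, which is the kind of issue the abstract alludes to.
import numpy as np
import tensorflow as tf

def representative_data_gen():
    # Yield calibration samples shaped like the model input (320x320 RGB assumed).
    for _ in range(100):
        yield [np.random.rand(1, 320, 320, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model("yolo_saved_model")  # path assumed
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

with open("yolo_int8.tflite", "wb") as f:
    f.write(converter.convert())
# The quantized model would then be compiled with: edgetpu_compiler yolo_int8.tflite
```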
Similar Papers
Point Cloud-based Grasping for Soft Hand Exoskeleton
Robotics
Helps robots grasp objects better using sight.
A Vision-Enabled Prosthetic Hand for Children with Upper Limb Disabilities
Robotics
Helps kids with missing hands grasp things better.
Imitation Learning for Adaptive Control of a Virtual Soft Exoglove
Robotics
Helps hands with weak muscles move better.