Hand Gesture Recognition for Collaborative Robots Using Lightweight Deep Learning in Real-Time Robotic Systems
By: Muhtadin, I Wayan Agus Darmawan, Muhammad Hilmi Rusydiansyah, and more
Potential Business Impact:
Control robots with your hand gestures.
Direct, natural interaction is essential for intuitive human-robot collaboration, as it eliminates the need for additional devices such as joysticks, tablets, or wearable sensors. In this paper, we present a lightweight deep learning-based hand gesture recognition system that enables humans to control collaborative robots naturally and efficiently. The model recognizes eight distinct hand gestures with only 1,103 parameters and a compact size of 22 KB, achieving an accuracy of 93.5%. To further optimize the model for real-world deployment on edge devices, we applied quantization and pruning using TensorFlow Lite, reducing the final model size to just 7 KB. The system was implemented and tested on a Universal Robot UR5 collaborative robot within a real-time robotic framework based on ROS2. The results demonstrate that even extremely lightweight models can deliver accurate and responsive hand gesture-based control for collaborative robots, opening new possibilities for natural human-robot interaction in constrained environments.
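The size reduction from 22 KB to 7 KB comes largely from quantization, which maps each 32-bit float weight to an 8-bit integer code. The sketch below illustrates the standard per-tensor affine quantization scheme that TensorFlow Lite applies during post-training quantization; the weight values are made up for illustration and are not from the paper's model.

```python
# Minimal sketch of 8-bit affine (asymmetric, per-tensor) quantization,
# the scheme used by TensorFlow Lite post-training quantization.
# Illustrative only: weight values below are invented, not from the paper.

def quantize(weights, num_bits=8):
    """Map float weights onto the signed integer range [qmin, qmax]."""
    qmin, qmax = -(2 ** (num_bits - 1)), 2 ** (num_bits - 1) - 1
    w_min, w_max = min(weights), max(weights)
    scale = (w_max - w_min) / (qmax - qmin)
    zero_point = round(qmin - w_min / scale)
    q = [max(qmin, min(qmax, round(w / scale) + zero_point)) for w in weights]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate float weights from the integer codes."""
    return [(qi - zero_point) * scale for qi in q]

weights = [-0.42, -0.10, 0.0, 0.27, 0.61]
q, scale, zp = quantize(weights)
recovered = dequantize(q, scale, zp)
# Rounding error per weight is bounded by roughly half the scale step.
max_err = max(abs(w - r) for w, r in zip(weights, recovered))
# Each float32 weight (4 bytes) becomes one int8 code (1 byte): ~4x smaller,
# consistent in spirit with the paper's 22 KB -> 7 KB reduction.
```

In practice this is done by the `tf.lite.TFLiteConverter` with optimizations enabled rather than by hand; the point of the sketch is only to show why 8-bit codes shrink the stored weights by about 4x while changing each weight by less than one quantization step.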
Similar Papers
Tactile Gesture Recognition with Built-in Joint Sensors for Industrial Robots
Robotics
Robots learn to understand your hand movements.
Accessible Gesture-Driven Augmented Reality Interaction System
Human-Computer Interaction
Lets people with limited hand mobility control games with gestures.
Real-Time Sign Language Gestures to Speech Transcription using Deep Learning
CV and Pattern Recognition
Translates sign language into speech instantly.