RobustDexGrasp: Robust Dexterous Grasping of General Objects
By: Hui Zhang, Zijian Wu, Linyi Huang, and more
Potential Business Impact:
Robots learn to grab anything, even when pushed.
The ability to robustly grasp a variety of objects is essential for dexterous robots. In this paper, we present a framework for zero-shot dynamic dexterous grasping from single-view visual inputs, designed to be resilient to various disturbances. Our approach uses a hand-centric object shape representation based on dynamic distance vectors between finger joints and object surfaces. This representation captures the local shape around potential contact regions rather than detailed global object geometry, thereby improving generalization to shape variations and uncertainties. To address perception limitations, we combine a privileged teacher policy with a mixed curriculum learning approach, allowing the student policy to distill grasping capabilities effectively and to explore adaptations to disturbances. Trained in simulation, our method achieves success rates of 97.0% across 247,786 simulated objects and 94.6% across 512 real objects, demonstrating remarkable generalization. Quantitative and qualitative results validate the robustness of our policy against various disturbances.
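The hand-centric representation above can be illustrated with a minimal sketch: for each finger joint, the feature is the vector to the nearest point of the (partially observed) object point cloud. The function and variable names are illustrative assumptions, not the paper's actual code:

```python
# Hypothetical sketch of a hand-centric shape representation: for each
# finger joint, compute the distance vector to its nearest point on the
# object surface, as observed from a single-view point cloud.
import numpy as np

def hand_centric_features(joint_positions: np.ndarray,
                          object_points: np.ndarray) -> np.ndarray:
    """joint_positions: (J, 3) finger-joint locations.
    object_points: (N, 3) single-view object point cloud.
    Returns (J, 3): per-joint vector to the nearest surface point."""
    # Pairwise difference vectors from each joint to each point: (J, N, 3)
    diffs = object_points[None, :, :] - joint_positions[:, None, :]
    dists = np.linalg.norm(diffs, axis=-1)   # (J, N) Euclidean distances
    nearest = np.argmin(dists, axis=1)       # index of closest point per joint
    return diffs[np.arange(len(joint_positions)), nearest]

# Toy example: two joints, three observed surface points
joints = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0]])
surface = np.array([[0.05, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
vecs = hand_centric_features(joints, surface)
print(vecs)  # each row points from a joint toward its closest surface point
```

Because the features are relative to the hand rather than a global object frame, the same local geometry around a contact region yields the same features across differently shaped objects, which is what drives the generalization claim.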
Similar Papers
DexGrasp Anything: Towards Universal Robotic Dexterous Grasping with Physics Awareness
CV and Pattern Recognition
Robots can now grab any object they touch.
D3Grasp: Diverse and Deformable Dexterous Grasping for General Objects
Robotics
Robots can now grab soft things better.
ZeroDexGrasp: Zero-Shot Task-Oriented Dexterous Grasp Synthesis with Prompt-Based Multi-Stage Semantic Reasoning
Robotics
Robots learn to grab things for any job.