DexGrasp Anything: Towards Universal Robotic Dexterous Grasping with Physics Awareness
By: Yiming Zhong, Qi Jiang, Jingyi Yu, and more
Potential Business Impact:
Robot hands can reliably grasp many different kinds of objects.
A dexterous hand capable of grasping any object is essential for developing general-purpose embodied intelligent robots. However, because of the high degrees of freedom of dexterous hands and the vast diversity of objects, robustly generating high-quality, usable grasping poses is a significant challenge. In this paper, we introduce DexGrasp Anything, a method that effectively integrates physical constraints into both the training and sampling phases of a diffusion-based generative model, achieving state-of-the-art performance across nearly all open datasets. Additionally, we present a new dexterous grasping dataset containing over 3.4 million diverse grasping poses for more than 15k different objects, demonstrating its potential to advance universal dexterous grasping. Our code and dataset will be publicly released soon.
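The abstract states that physical constraints enter both the training and sampling phases of the diffusion model, but does not spell out the formulation. As a rough illustration only, below is a minimal PyTorch sketch of one common way to realize the sampling-phase part: steering each reverse-diffusion step with the gradient of a differentiable physics cost, in the spirit of classifier guidance. Everything here (`physics_penalty`, the toy penetration term, `denoise_model`, the schedule handling) is a hypothetical stand-in, not the authors' released method.

```python
import torch

def physics_penalty(pose: torch.Tensor, obj_points: torch.Tensor) -> torch.Tensor:
    """Toy differentiable physics cost (assumption, not the paper's term):
    penalize a 'fingertip' point that comes within 5 cm of sampled object
    surface points, as a crude penetration proxy."""
    fingertip = pose[:, :3].unsqueeze(1)                          # (B, 1, 3)
    pts = obj_points.unsqueeze(0).expand(pose.shape[0], -1, -1)   # (B, N, 3)
    dists = torch.cdist(fingertip, pts)                           # (B, 1, N)
    return torch.relu(0.05 - dists).sum(dim=(1, 2))               # (B,)

def guided_reverse_step(x_t, t, denoise_model, alphas_cumprod, obj_points,
                        guidance_scale=1.0):
    """One DDPM-style reverse step with a physics-guidance term added to the
    predicted noise (classifier-guidance style sketch)."""
    a_t = alphas_cumprod[t]
    with torch.no_grad():
        eps = denoise_model(x_t, t)                               # predicted noise
    # Gradient of the physics cost with respect to the noisy grasp sample.
    x_req = x_t.detach().requires_grad_(True)
    cost = physics_penalty(x_req, obj_points).sum()
    grad = torch.autograd.grad(cost, x_req)[0]
    # Push the noise estimate away from physically invalid regions.
    eps_guided = eps + guidance_scale * torch.sqrt(1.0 - a_t) * grad
    # Simplified denoised estimate; a full sampler would also add posterior
    # noise for t > 0 rather than returning x0 directly.
    return (x_t - torch.sqrt(1.0 - a_t) * eps_guided) / torch.sqrt(a_t)
```

For the training-phase counterpart, the same kind of penalty can be evaluated on the model's denoised pose prediction and added to the standard diffusion loss as a regularizer; how DexGrasp Anything actually formulates either term is defined in the paper, not here.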
Similar Papers
RobustDexGrasp: Robust Dexterous Grasping of General Objects
Robotics
Robots learn to grab anything, even when pushed.
ZeroDexGrasp: Zero-Shot Task-Oriented Dexterous Grasp Synthesis with Prompt-Based Multi-Stage Semantic Reasoning
Robotics
Robots learn to grab things for any job.
G-DexGrasp: Generalizable Dexterous Grasping Synthesis Via Part-Aware Prior Retrieval and Prior-Assisted Generation
Computer Vision and Pattern Recognition
Robots learn to grab new things they've never seen.