UniFucGrasp: Human-Hand-Inspired Unified Functional Grasp Annotation Strategy and Dataset for Diverse Dexterous Hands
By: Haoran Lin, Wenrui Chen, Xianchi Chen, and more
Potential Business Impact:
Teaches robots to grab things like humans do.
Dexterous grasp datasets are vital for embodied intelligence, but most emphasize grasp stability and ignore the functional grasps needed for tasks like opening bottle caps or holding cup handles. Most also depend on the bulky, costly, and hard-to-control high-DOF Shadow Hand. Inspired by the human hand's underactuated mechanism, we establish UniFucGrasp, a universal functional grasp annotation strategy and dataset for multiple dexterous hand types. Grounded in biomimicry, it maps natural human motions onto diverse hand structures and uses geometry-based force closure to ensure functional, stable, human-like grasps. This method supports low-cost, efficient collection of diverse, high-quality functional grasps. On this basis, we build the first multi-hand functional grasp dataset and provide a synthesis model to validate its effectiveness. Experiments on the UFG dataset, in IsaacSim, and on complex robotic tasks show that our method improves functional manipulation accuracy and grasp stability, generalizes efficiently across diverse robotic hands, and overcomes the annotation-cost and generalization challenges in dexterous grasping. The project page is at https://haochen611.github.io/UFG.
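The abstract cites geometry-based force closure as the stability criterion. As a rough illustration only (not the authors' implementation, and all function names below are hypothetical), a standard geometric force-closure test linearizes each contact's friction cone into primitive 6-D wrenches and checks that they positively span the wrench space, i.e. that the origin admits a strictly positive combination:

```python
# Hedged sketch of a geometry-based force-closure test. Assumptions:
# point contacts with Coulomb friction, friction cones linearized into
# `num_edges` primitive wrenches, torques taken about the object origin.
import numpy as np
from scipy.optimize import linprog

def friction_cone_wrenches(p, n, mu=0.5, num_edges=8):
    """Primitive 6-D wrenches (force, torque) for one contact.
    p: contact point (3,); n: inward surface normal (3,)."""
    n = n / np.linalg.norm(n)
    a = np.array([1.0, 0.0, 0.0])
    if abs(n @ a) > 0.9:          # pick a helper axis not parallel to n
        a = np.array([0.0, 1.0, 0.0])
    t1 = np.cross(n, a); t1 /= np.linalg.norm(t1)
    t2 = np.cross(n, t1)          # t1, t2 span the tangent plane
    wrenches = []
    for k in range(num_edges):
        ang = 2.0 * np.pi * k / num_edges
        f = n + mu * (np.cos(ang) * t1 + np.sin(ang) * t2)  # cone edge
        wrenches.append(np.concatenate([f, np.cross(p, f)]))
    return np.array(wrenches)     # shape (num_edges, 6)

def is_force_closure(points, normals, mu=0.5):
    """Force closure iff the primitive wrenches span R^6 and the origin
    is a strictly positive combination of them."""
    W = np.vstack([friction_cone_wrenches(p, n, mu)
                   for p, n in zip(points, normals)])      # (m, 6)
    if np.linalg.matrix_rank(W) < 6:
        return False              # cannot resist some wrench direction
    m = W.shape[0]
    eps = 1e-6                    # enforce strictly positive weights
    A_eq = np.vstack([W.T, np.ones((1, m))])
    b_eq = np.concatenate([np.zeros(6), [1.0]])
    res = linprog(c=np.zeros(m), A_eq=A_eq, b_eq=b_eq,
                  bounds=[(eps, None)] * m, method="highs")
    return res.status == 0        # feasible -> force closure
```

For example, four tetrahedral contacts on a unit sphere pass this test, while two antipodal point contacts fail it (they cannot resist torque about the line joining them), which matches the classical hard-finger result.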
Similar Papers
- ScaleADFG: Affordance-based Dexterous Functional Grasping via Scalable Dataset (Robotics): Robots can now grab objects of any size.
- Web2Grasp: Learning Functional Grasps from Web Images of Hand-Object Interactions (CV and Pattern Recognition): Teaches robots to grab things like people.
- Universal Dexterous Functional Grasping via Demonstration-Editing Reinforcement Learning (Robotics): Robots learn to grab any object for any task.