Bring Your Own Grasp Generator: Leveraging Robot Grasp Generation for Prosthetic Grasping
By: Giuseppe Stracquadanio, Federico Vasile, Elisa Maiettini, and more
Potential Business Impact:
Lets prosthetic hands grasp objects faster and more easily.
One of the most important research challenges in upper-limb prosthetics is enhancing user-prosthesis communication so that operating the device closely resembles the experience of a natural limb. As prosthetic devices become more complex, users often struggle to control the additional degrees of freedom (DoFs). In this context, leveraging shared-autonomy principles can significantly improve the usability of these systems. In this paper, we present a novel eye-in-hand prosthetic grasping system that follows these principles. Our system initiates the approach-to-grasp action based on the user's command and automatically configures the DoFs of a prosthetic hand. First, it reconstructs the 3D geometry of the target object without the need for a depth camera. Then, it tracks the hand motion during the approach-to-grasp action and finally selects a candidate grasp configuration according to the user's intentions. We deploy our system on the Hannes prosthetic hand and test it on able-bodied subjects and amputees to validate its effectiveness. We compare it with a multi-DoF prosthetic control baseline and find that our method enables faster grasps while simplifying the user experience. Code and demo videos are available online at https://hsp-iit.github.io/byogg/.
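The abstract outlines a pipeline of four stages: a user command triggers the action, monocular images are lifted to 3D geometry, candidate grasps are generated, and the best candidate is tracked and selected during the approach. As a rough illustration only, here is a minimal Python sketch of how such a pipeline could be wired together; every name in it (`Grasp`, `GraspGenerator`, `backproject`, `run_grasp`, the camera intrinsics) is a hypothetical placeholder, not the paper's actual API, and the real implementation is available at the link above.

```python
# Hypothetical sketch of a shared-autonomy, eye-in-hand grasping pipeline.
# Names and interfaces are illustrative assumptions, not the paper's code.
from dataclasses import dataclass
from typing import Callable, Protocol, Sequence
import numpy as np


@dataclass
class Grasp:
    pose: np.ndarray      # 4x4 grasp pose expressed in the current camera frame
    preshape: np.ndarray  # target DoF configuration for the prosthetic hand
    score: float          # generator confidence for this candidate


class GraspGenerator(Protocol):
    """'Bring your own': any model mapping a point cloud to scored grasps fits."""
    def generate(self, cloud: np.ndarray) -> Sequence[Grasp]: ...


def backproject(depth: np.ndarray,
                fx: float = 600.0, fy: float = 600.0,
                cx: float = 320.0, cy: float = 240.0) -> np.ndarray:
    """Lift an HxW monocular depth map to an Nx3 point cloud (pinhole model).

    Intrinsics are placeholder values; a real system would use calibration.
    """
    v, u = np.indices(depth.shape)
    z = depth.ravel()
    x = (u.ravel() - cx) * z / fx
    y = (v.ravel() - cy) * z / fy
    return np.stack([x, y, z], axis=1)


def run_grasp(frames: Sequence[np.ndarray],
              estimate_depth: Callable[[np.ndarray], np.ndarray],
              track_motion: Callable[[np.ndarray], np.ndarray],
              generator: GraspGenerator) -> Grasp:
    """Runs once the user's command has triggered the approach-to-grasp action."""
    depth = estimate_depth(frames[0])         # 1. depth from RGB only, no depth camera
    cloud = backproject(depth)                # 2. 3D geometry of the target object
    grasps = list(generator.generate(cloud))  # 3. candidate grasps from any generator
    for frame in frames[1:]:                  # 4. track hand motion during the approach
        # Assume track_motion returns the incremental 4x4 camera motion
        # since the previous frame; re-express grasps in the current frame.
        T = track_motion(frame)
        for g in grasps:
            g.pose = np.linalg.inv(T) @ g.pose
    return max(grasps, key=lambda g: g.score)  # 5. select the best-scoring candidate
```

The `Protocol` interface is the point of the sketch: under this reading of the title, any robot grasp generator that turns a point cloud into scored grasp candidates could be swapped in without changing the rest of the pipeline.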
Similar Papers
Grasp-HGN: Grasping the Unexpected
Robotics
Helps robotic hands grasp new things.
Vision-Guided Grasp Planning for Prosthetic Hands in Unstructured Environments
Robotics
Lets prosthetic hands grab things like real hands.
Towards Biosignals-Free Autonomous Prosthetic Hand Control via Imitation Learning
Robotics
Prosthetic hand grabs objects by itself.