Object Recognition and Force Estimation with the GelSight Baby Fin Ray
By: Sandra Q. Liu, Yuxiang Ma, Edward H. Adelson
Potential Business Impact:
Robotic fingers feel and sort nuts by touch.
Recent advances in soft robotic hands and tactile sensing have enabled both to perform an increasing number of complex tasks with the aid of machine learning. In our previous work, we presented the GelSight Baby Fin Ray, which integrates a camera into a soft, compliant Fin Ray structure. Camera-based tactile sensing gives the GelSight Baby Fin Ray the ability to capture rich contact information such as forces, object geometries, and textures. Our previous work also showed that the GelSight Baby Fin Ray can dig through clutter and classify in-shell nuts. To further examine the potential of the GelSight Baby Fin Ray, we leverage machine learning to distinguish in-shell nut textures and to perform force and position estimation. We perform ablation studies with popular neural network architectures, including ResNet50, GoogLeNet, and 3- and 5-layer convolutional neural networks (CNNs). We conclude that machine learning is a promising technique for extracting useful information from high-resolution tactile images and for empowering soft robots to better understand and interact with their environments.
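The abstract describes ablation studies over ResNet50, GoogLeNet, and small 3- and 5-layer CNNs that map tactile images to nut classes or to force and position estimates. The paper's code and exact architectures are not given here, so the following is only a minimal sketch, assuming PyTorch, of what such a small 3-layer CNN could look like; the input resolution, channel widths, class count, and output dimensions are illustrative assumptions rather than the authors' settings.

```python
# Hypothetical sketch (not the authors' released code): a small 3-layer CNN
# that maps a GelSight tactile image either to nut-class logits or to a
# force/position regression vector. All sizes below are assumptions.
import torch
import torch.nn as nn


class TactileCNN(nn.Module):
    def __init__(self, out_dim: int):
        super().__init__()
        # Three conv blocks followed by global average pooling and a linear head.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.features(x).flatten(1)  # (batch, 64)
        return self.head(z)


if __name__ == "__main__":
    # Batch of assumed 240x320 RGB tactile images.
    imgs = torch.randn(8, 3, 240, 320)
    classifier = TactileCNN(out_dim=4)   # e.g. 4 in-shell nut classes (assumed)
    regressor = TactileCNN(out_dim=3)    # e.g. normal force + 2D contact position (assumed)
    print(classifier(imgs).shape, regressor(imgs).shape)  # (8, 4) and (8, 3)
```

For the larger backbones in the ablation, a pretrained feature extractor such as `torchvision.models.resnet50` or `torchvision.models.googlenet` could be swapped in for `self.features`, with the final fully connected layer replaced by a head of the desired output dimension.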
Similar Papers
AllTact Fin Ray: A Compliant Robot Gripper with Omni-Directional Tactile Sensing
Robotics
Lets robots feel and grab things better.
TacFinRay: Soft Tactile Fin-Ray Finger with Indirect Tactile Sensing for Robust Grasping
Robotics
Robot fingers feel where and how hard they touch.
Multi-Objective Neural Network Assisted Design Optimization of Soft Fin-Ray Grippers for Enhanced Grasping Performance
Robotics
Makes robot hands grip gently or strongly.