Self-supervised perception for tactile skin covered dexterous hands
By: Akash Sharma, Carolina Higuera, Chaithanya Krishna Bodduluri, and more
Potential Business Impact:
Robots can feel with their whole hand, improving dexterity.
We present Sparsh-skin, a pre-trained encoder for magnetic skin sensors distributed across the fingertips, phalanges, and palm of a dexterous robot hand. Magnetic tactile skins offer a flexible form factor for hand-wide coverage with fast response times, in contrast to vision-based tactile sensors, which are restricted to the fingertips and limited by bandwidth. Full-hand tactile perception is crucial for robot dexterity; however, the lack of general-purpose models, together with the challenges of interpreting raw magnetic flux and of calibration, has limited the adoption of these sensors. Given a history of kinematic and tactile sensing across a hand, Sparsh-skin outputs a latent tactile embedding that can be used in any downstream task. The encoder is self-supervised via self-distillation on a variety of unlabeled hand-object interactions collected with an Allegro hand sensorized with Xela uSkin. In experiments across several benchmark tasks, from state estimation to policy learning, we find that pre-trained Sparsh-skin representations are both sample efficient in learning downstream tasks and improve task performance by over 41% compared to prior work and over 56% compared to end-to-end learning.
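To make the interface concrete, below is a minimal, hypothetical PyTorch sketch of the two ideas the abstract describes: an encoder that maps a short history of magnetic-skin readings plus hand kinematics to a latent embedding, trained by DINO-style self-distillation (a student matching an EMA teacher on augmented views of the same interaction window). Every name, shape, and architectural choice here (TactileEncoder, the taxel count, the transformer layout, the noise augmentation, the temperatures) is an illustrative assumption, not the released Sparsh-skin model.

```python
import torch
import torch.nn as nn

class TactileEncoder(nn.Module):
    """Hypothetical stand-in for a Sparsh-skin-style encoder."""
    def __init__(self, n_taxels=368, taxel_dim=3, n_joints=16,
                 d_model=256, n_layers=4, emb_dim=128):
        super().__init__()
        # Per-timestep input: flattened 3-axis magnetometer readings + joint angles.
        in_dim = n_taxels * taxel_dim + n_joints
        self.proj = nn.Linear(in_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.temporal = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, emb_dim)

    def forward(self, tactile, joints):
        # tactile: (B, T, n_taxels, 3) magnetic flux; joints: (B, T, n_joints).
        x = torch.cat([tactile.flatten(2), joints], dim=-1)
        h = self.temporal(self.proj(x))      # (B, T, d_model)
        return self.head(h.mean(dim=1))      # pooled latent embedding (B, emb_dim)

def ema_update(teacher, student, m=0.996):
    # Teacher weights follow the student via an exponential moving average.
    for pt, ps in zip(teacher.parameters(), student.parameters()):
        pt.data.mul_(m).add_(ps.data, alpha=1 - m)

student, teacher = TactileEncoder(), TactileEncoder()
teacher.load_state_dict(student.state_dict())
for p in teacher.parameters():
    p.requires_grad_(False)                  # teacher is never trained directly
opt = torch.optim.AdamW(student.parameters(), lr=1e-4)

# Dummy batch standing in for unlabeled hand-object interaction clips:
# 8 clips, 32 timesteps, an assumed 368 taxels, 16 Allegro joint angles.
tactile = torch.randn(8, 32, 368, 3)
joints = torch.randn(8, 32, 16)

def augment(x):
    # Sensor noise as a weak augmentation; the real recipe is unspecified here.
    return x + 0.01 * torch.randn_like(x)

# One self-distillation step (centering and multi-crop omitted for brevity).
opt.zero_grad()
s_out = student(augment(tactile), joints)
with torch.no_grad():
    t_out = teacher(augment(tactile), joints)
# Cross-entropy between a sharpened teacher distribution and the student's.
loss = -(t_out.div(0.04).softmax(-1) * s_out.div(0.1).log_softmax(-1)).sum(-1).mean()
loss.backward()
opt.step()
ema_update(teacher, student)
```

Because the loss never touches labels, this kind of objective can be run on arbitrary logs of hand-object interaction, which is what makes the pre-trained representations reusable across the downstream tasks the abstract lists.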
Similar Papers
Tactile Beyond Pixels: Multisensory Touch Representations for Robot Manipulation
Robotics
Robots feel and understand objects better.
DexSkin: High-Coverage Conformable Robotic Skin for Learning Contact-Rich Manipulation
Robotics
Gives robots human-like skin so they grasp things better.
SARL: Spatially-Aware Self-Supervised Representation Learning for Visuo-Tactile Perception
CV and Pattern Recognition
Helps robots learn to grasp objects better.