Toward Artificial Palpation: Representation Learning of Touch on Soft Bodies
By: Zohar Rimon, Elisei Shafer, Tal Tepper, and more
Potential Business Impact:
Robots could use touch to image the inside of soft bodies, a step toward automated medical palpation.
Palpation, the use of touch in medical examination, is almost exclusively performed by humans. We investigate a proof of concept for an artificial palpation method based on self-supervised learning. Our key idea is that an encoder-decoder framework can learn a representation from a sequence of tactile measurements that contains all the relevant information about the palpated object. We conjecture that such a representation can be used for downstream tasks such as tactile imaging and change detection. With enough training data, it should capture intricate patterns in the tactile measurements that go beyond a simple map of forces -- the current state of the art. To validate our approach, we both develop a simulation environment and collect a real-world dataset of soft objects and corresponding ground truth images obtained by magnetic resonance imaging (MRI). We collect palpation sequences using a robot equipped with a tactile sensor, and train a model that predicts sensory readings at different positions on the object. We investigate the representation learned in this process, and demonstrate its use in imaging and change detection.
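The abstract's core mechanism, an encoder that compresses a sequence of touch measurements into a single object representation, and a decoder that predicts the tactile reading at unseen positions from that representation, can be sketched as follows. This is a minimal illustration, not the authors' architecture: the dimensions, the measurement format (x, y, force), the permutation-invariant mean-pooling encoder, and all function names are assumptions, and the weights are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_params(sizes, rng):
    # Random weights for a small MLP; in the paper's setting these
    # would be trained end-to-end with a self-supervised objective
    # (predicting held-out sensor readings).
    return [(rng.normal(0, 0.1, (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp(params, x):
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

# Hypothetical dimensions: each measurement is (x, y, force), the
# latent representation z has 16 dims, a tactile reading is a scalar.
enc = mlp_params([3, 32, 16], rng)      # per-measurement encoder
dec = mlp_params([16 + 2, 32, 1], rng)  # (z, query position) -> reading

def encode(measurements):
    # Permutation-invariant set encoding: embed each (x, y, force)
    # tuple, then mean-pool into one object representation z.
    return mlp(enc, measurements).mean(axis=0)

def decode(z, query_positions):
    # Predict the tactile reading at unseen positions from z alone;
    # evaluating this on a dense grid yields a tactile "image".
    zs = np.tile(z, (len(query_positions), 1))
    return mlp(dec, np.concatenate([zs, query_positions], axis=1))

# A palpation sequence of 50 touches on one object.
seq = rng.uniform(0, 1, (50, 3))
z = encode(seq)                              # learned representation
pred = decode(z, rng.uniform(0, 1, (5, 2)))  # readings at 5 new spots
print(z.shape, pred.shape)
```

Downstream tasks then operate on `z` directly: querying a dense grid of positions gives tactile imaging, and comparing representations of the same object across sessions gives change detection.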
Similar Papers
Embodied Tactile Perception of Soft Objects Properties
Robotics
Robots learn to feel and understand objects better.
SARL: Spatially-Aware Self-Supervised Representation Learning for Visuo-Tactile Perception
CV and Pattern Recognition
Helps robots learn to grasp objects better.
Improving Robotic Manipulation: Techniques for Object Pose Estimation, Accommodating Positional Uncertainty, and Disassembly Tasks from Examples
Robotics
Robots feel objects to grab them better.