The impact of tactile sensor configurations on grasp learning efficiency -- a comparative evaluation in simulation
By: Eszter Birtalan, Miklós Koller
Tactile sensors are increasingly adopted in robotics to provide direct information about contact surfaces, including contact events, slip events, and even texture identification. This information is especially important for robotic hand designs, including prosthetics, as it can greatly improve grasp stability. Most presently published robotic hand designs, however, implement tactile sensors in vastly different densities and layouts, often covering the majority of the available hand surface. We used simulations to evaluate 6 tactile sensor configurations with different densities and layouts, based on their impact on reinforcement learning performance. Our two-setup system yields robust results that do not depend on a particular physics simulator, robotic hand model, or machine learning algorithm. Our results show both setup-specific and generalized effects across the 6 sensorized simulations, and we identify one configuration that consistently yields the best performance in both setups. These results could inform future research on robotic hand design, including prostheses.
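To make the comparative protocol concrete, the sketch below shows one way such an evaluation could be organized: each tactile sensor configuration is trained in each simulation setup and scored by a learning-efficiency metric, then averaged across seeds. All names, labels, and numbers are illustrative placeholders under assumed conventions, not the authors' actual code.

```python
# Hypothetical sketch of the comparative evaluation protocol: every sensor
# configuration is evaluated in every setup, across several random seeds.
import random
from statistics import mean

# Illustrative configuration labels; the paper evaluates 6 layouts/densities.
SENSOR_CONFIGS = ["fingertips", "fingertips+palm", "full_phalanges",
                  "sparse_grid", "dense_grid", "baseline"]
# Two independent setups (e.g. different simulator, hand model, RL algorithm).
SETUPS = ["setup_A", "setup_B"]


def train_policy(config: str, setup: str, seed: int) -> float:
    """Placeholder for one RL training run; returns a learning-efficiency score.

    In a real experiment this would launch the physics simulator with the given
    tactile layout, train a grasping policy, and report e.g. the area under the
    success-rate curve. Here it returns a deterministic random stand-in value.
    """
    rng = random.Random((config, setup, seed).__hash__())
    return rng.uniform(0.0, 1.0)


def evaluate(n_seeds: int = 5) -> dict[str, float]:
    """Average each configuration's score over both setups and all seeds."""
    scores = {}
    for config in SENSOR_CONFIGS:
        runs = [train_policy(config, setup, seed)
                for setup in SETUPS for seed in range(n_seeds)]
        scores[config] = mean(runs)
    return scores


if __name__ == "__main__":
    # Rank configurations by their cross-setup average score.
    for config, score in sorted(evaluate().items(), key=lambda kv: -kv[1]):
        print(f"{config:20s} {score:.3f}")
```

Averaging over setups and seeds is what gives the simulator- and algorithm-independent comparison the abstract refers to; any single-setup ranking would be confounded by the particular simulator, hand model, or RL algorithm used.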