TACT: Humanoid Whole-body Contact Manipulation through Deep Imitation Learning with Tactile Modality
By: Masaki Murooka, Takahiro Hoshi, Kensuke Fukumitsu and more
Potential Business Impact:
Robot learns to grab things by feeling them.
Manipulation with whole-body contact by humanoid robots offers distinct advantages, including enhanced stability and reduced load. However, it also poses challenges, such as the increased computational cost of motion generation and the difficulty of measuring broad-area contact. We have therefore developed a control system that allows a humanoid robot equipped with tactile sensors on its upper body to learn a whole-body manipulation policy through imitation learning from human teleoperation data. This policy, named tactile-modality extended ACT (TACT), takes multiple sensor modalities as input, including joint position, vision, and tactile measurements. Furthermore, by integrating this policy with retargeting and locomotion control based on a biped model, we demonstrate that the life-size humanoid robot RHP7 Kaleido can achieve whole-body contact manipulation while maintaining balance and walking. Through detailed experimental verification, we show that feeding both vision and tactile modalities into the policy improves the robustness of manipulation involving broad and delicate contact.
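To illustrate the multimodal input structure the abstract describes, here is a minimal sketch of a policy that embeds joint-position, vision, and tactile observations separately, fuses them, and outputs a chunk of future actions in the style of ACT. All dimensions, weights, and the averaging fusion are hypothetical placeholders; the actual TACT policy is a trained transformer-based model, not this toy function.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions chosen for illustration; the paper does not
# specify these values.
JOINT_DIM, VISION_DIM, TACTILE_DIM = 32, 512, 128
EMBED, ACTION_DIM, CHUNK = 64, 19, 8

def linear(x, w, b):
    """Affine projection of a feature vector."""
    return x @ w + b

# Randomly initialized projections stand in for trained encoders.
W_j, b_j = rng.standard_normal((JOINT_DIM, EMBED)) * 0.01, np.zeros(EMBED)
W_v, b_v = rng.standard_normal((VISION_DIM, EMBED)) * 0.01, np.zeros(EMBED)
W_t, b_t = rng.standard_normal((TACTILE_DIM, EMBED)) * 0.01, np.zeros(EMBED)
W_out, b_out = (rng.standard_normal((EMBED, CHUNK * ACTION_DIM)) * 0.01,
                np.zeros(CHUNK * ACTION_DIM))

def tact_policy(joint, vision, tactile):
    """Embed each modality, fuse, and predict a chunk of future actions."""
    tokens = np.stack([
        linear(joint, W_j, b_j),      # proprioception token
        linear(vision, W_v, b_v),     # vision-feature token
        linear(tactile, W_t, b_t),    # tactile-array token
    ])
    # Toy fusion by averaging; ACT-style policies use transformer attention.
    fused = np.tanh(tokens).mean(axis=0)
    return linear(fused, W_out, b_out).reshape(CHUNK, ACTION_DIM)

actions = tact_policy(rng.standard_normal(JOINT_DIM),
                      rng.standard_normal(VISION_DIM),
                      rng.standard_normal(TACTILE_DIM))
print(actions.shape)  # (8, 19)
```

Predicting a chunk of actions rather than a single step mirrors the action-chunking idea in ACT, which the paper extends with the tactile modality.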
Similar Papers
A Humanoid Visual-Tactile-Action Dataset for Contact-Rich Manipulation
Robotics
Robots learn to touch and grab soft things.
On the Importance of Tactile Sensing for Imitation Learning: A Case Study on Robotic Match Lighting
Robotics
Robots learn to do tricky jobs by feeling.
Vi-TacMan: Articulated Object Manipulation via Vision and Touch
Robotics
Robots use eyes and touch to grab anything.