FreeTacMan: Robot-free Visuo-Tactile Data Collection System for Contact-rich Manipulation
By: Longyan Wu, Checheng Yu, Jieji Ren, and more
Potential Business Impact:
Lets robots learn to grab things by feeling them.
Enabling robots to perform contact-rich manipulation remains a pivotal challenge in robot learning, one substantially hindered by the data collection gap: collection is inefficient and sensor setups are limited. While prior work has explored handheld paradigms, their rod-based mechanical structures remain rigid and unintuitive, providing limited tactile feedback and posing challenges for human operators. Motivated by the dexterity and force feedback of human motion, we propose FreeTacMan, a human-centric, robot-free data collection system for accurate and efficient robot manipulation. Concretely, we design a wearable data collection device with dual visuo-tactile grippers that can be worn on human fingers for intuitive and natural control. A high-precision optical tracking system captures end-effector poses while synchronizing visual and tactile feedback. FreeTacMan improves data collection performance over prior work and, with the help of visuo-tactile information, enables effective policy learning for contact-rich manipulation tasks. We will release the work to facilitate reproducibility and accelerate research in visuo-tactile manipulation.
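The abstract describes synchronizing three data streams (end-effector poses from optical tracking, visual frames, and tactile frames) that typically arrive at different rates. The sketch below illustrates one common way such multi-rate streams can be aligned: nearest-timestamp matching against the highest-rate stream. This is a minimal illustration of the general technique, not FreeTacMan's actual pipeline; the stream rates and function names are assumptions for the example.

```python
import bisect

def nearest(timestamps, t):
    """Return the index of the timestamp in `timestamps` (sorted ascending)
    closest to query time t."""
    i = bisect.bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # Pick whichever neighbor is closer to t.
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

def synchronize(pose_ts, visual_ts, tactile_ts):
    """For each pose sample (assumed to be the highest-rate stream), pair it
    with the nearest visual and tactile frames by timestamp.

    Returns a list of (pose_idx, visual_idx, tactile_idx) triples."""
    return [
        (i, nearest(visual_ts, t), nearest(tactile_ts, t))
        for i, t in enumerate(pose_ts)
    ]

# Hypothetical rates: pose at 100 Hz, camera at 30 Hz, tactile at 50 Hz
# (timestamps in seconds).
pose_ts = [k / 100 for k in range(10)]
visual_ts = [k / 30 for k in range(4)]
tactile_ts = [k / 50 for k in range(6)]
pairs = synchronize(pose_ts, visual_ts, tactile_ts)
```

Nearest-timestamp matching is the simplest alignment strategy; real systems often add hardware triggering or interpolation of the pose stream, but the index-pairing structure stays the same.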
Similar Papers
ViTaMIn: Learning Contact-Rich Tasks Through Robot-Free Visuo-Tactile Manipulation Interface
Robotics
Teaches robots to grab things by feeling them.
Vi-TacMan: Articulated Object Manipulation via Vision and Touch
Robotics
Robots use eyes and touch to grab anything.
ViTaMIn-B: A Reliable and Efficient Visuo-Tactile Bimanual Manipulation Interface
Robotics
Helps robots learn to do tricky tasks with their hands.