FreeTacMan: Robot-free Visuo-Tactile Data Collection System for Contact-rich Manipulation

Published: June 2, 2025 | arXiv ID: 2506.01941v1

By: Longyan Wu, Checheng Yu, Jieji Ren, and more

Potential Business Impact:

Lets robots learn to grasp and manipulate objects by feeling them.

Business Areas:
Robotics Hardware, Science and Engineering, Software

Enabling robots with contact-rich manipulation remains a pivotal challenge in robot learning, one substantially hindered by the data collection gap: inefficient collection and limited sensor setups. While prior work has explored handheld paradigms, their rod-based mechanical structures remain rigid and unintuitive, providing limited tactile feedback and posing challenges for human operators. Motivated by the dexterity and force feedback of human motion, we propose FreeTacMan, a human-centric and robot-free data collection system for accurate and efficient robot manipulation. Concretely, we design a wearable data collection device with dual visuo-tactile grippers, which can be worn on human fingers for intuitive and natural control. A high-precision optical tracking system captures end-effector poses while synchronizing visual and tactile feedback. FreeTacMan improves data collection performance over prior systems on multiple metrics, and the visuo-tactile information it records enables effective policy learning for contact-rich manipulation tasks. We will release the work to facilitate reproducibility and accelerate research in visuo-tactile manipulation.
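The abstract describes synchronizing optically tracked end-effector poses with visual and tactile streams. The paper does not specify how this synchronization is done; one common approach is nearest-timestamp matching across independently clocked streams. The sketch below is a hypothetical illustration of that idea (the `Sample`, `nearest`, and `synchronize` names are illustrative, not from the paper):

```python
from bisect import bisect_left
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Sample:
    t: float      # timestamp in seconds
    data: object  # payload: camera image, tactile map, or 6-DoF pose

def nearest(samples: List[Sample], t: float) -> Sample:
    """Return the sample whose timestamp is closest to t.

    Assumes `samples` is sorted by timestamp, as sensor streams
    normally are.
    """
    ts = [s.t for s in samples]
    i = bisect_left(ts, t)
    if i == 0:
        return samples[0]
    if i == len(samples):
        return samples[-1]
    before, after = samples[i - 1], samples[i]
    return after if after.t - t < t - before.t else before

def synchronize(poses: List[Sample],
                images: List[Sample],
                tactile: List[Sample]) -> List[Tuple[Sample, Sample, Sample]]:
    """Attach the nearest camera and tactile frames to each tracked pose,
    using the pose stream (typically the highest-rate one) as the anchor."""
    return [(p, nearest(images, p.t), nearest(tactile, p.t)) for p in poses]
```

In practice a real pipeline would also account for per-sensor latency offsets and drop poses whose nearest frames are too stale, but nearest-timestamp matching is the core of such alignment.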

Page Count
18 pages

Category
Computer Science:
Robotics