TacMan-Turbo: Proactive Tactile Control for Robust and Efficient Articulated Object Manipulation
By: Zihang Zhao, Zhenghao Qi, Yuyang Li, and more
Potential Business Impact:
Robots learn to move objects smoothly and quickly.
Adept manipulation of articulated objects is essential for robots to operate successfully in human environments. Such manipulation requires both effectiveness -- reliable operation despite uncertain object structures -- and efficiency -- swift execution with minimal redundant steps and smooth actions. Existing approaches struggle to achieve both objectives simultaneously: methods relying on predefined kinematic models lack effectiveness when encountering structural variations, while tactile-informed approaches achieve robust manipulation without kinematic priors but compromise efficiency through reactive, step-by-step exploration-compensation cycles. This paper introduces TacMan-Turbo, a novel proactive tactile control framework for articulated object manipulation that resolves this fundamental trade-off. Unlike previous approaches that treat tactile contact deviations merely as error signals requiring compensation, our method interprets these deviations as rich sources of local kinematic information. This new perspective enables our controller to predict optimal future interactions and make proactive adjustments, significantly enhancing manipulation efficiency. In comprehensive evaluations across 200 diverse simulated articulated objects and real-world experiments, our approach maintains a 100% success rate while significantly outperforming the previous tactile-informed method in time efficiency, action efficiency, and trajectory smoothness (all p-values < 0.0001). These results demonstrate that the long-standing trade-off between effectiveness and efficiency in articulated object manipulation can be successfully resolved without relying on prior kinematic knowledge.
Similar Papers
Vi-TacMan: Articulated Object Manipulation via Vision and Touch
Robotics
Robots use eyes and touch to grab anything.
FreeTacMan: Robot-free Visuo-Tactile Data Collection System for Contact-rich Manipulation
Robotics
Lets robots learn to grab things by feeling them.