High-Precision Transformer-Based Visual Servoing for Humanoid Robots in Aligning Tiny Objects

Published: March 6, 2025 | arXiv ID: 2503.04862v2

By: Jialong Xue, Wei Gao, Yu Wang, and more

Potential Business Impact:

Helps robots precisely place tiny tool parts.

Business Areas:
Robotics Hardware, Science and Engineering, Software

High-precision tiny object alignment remains a common and critical challenge for humanoid robots in real-world settings. To address this problem, this paper proposes a vision-based framework for precisely estimating and controlling the relative position between a handheld tool and a target object for humanoid robots, e.g., a screwdriver tip and a screw head slot. By fusing images from the robot's head and torso cameras with its head joint angles, the proposed Transformer-based visual servoing method effectively corrects the handheld tool's positional errors, especially at close range. Experiments on M4-M8 screws demonstrate an average convergence error of 0.8-1.3 mm and a success rate of 93%-100%. Through comparative analysis, the results validate that this capability for high-precision tiny object alignment is enabled by the Distance Estimation Transformer architecture and the Multi-Perception-Head mechanism proposed in this paper.
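To make the described fusion concrete, below is a minimal sketch of what such a pipeline could look like: two camera streams and the head joint angles are embedded as tokens, passed through a Transformer encoder, and several regression heads (a stand-in for the paper's Multi-Perception-Head mechanism) predict the tool-to-target offset. All module names, dimensions, token layout, and the number of heads are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch (not the authors' code): fuse head/torso camera images with
# head joint angles via a Transformer and predict the tool-target offset
# through multiple regression heads whose outputs are averaged.
import torch
import torch.nn as nn


class DistanceEstimationTransformerSketch(nn.Module):
    def __init__(self, feat_dim=256, n_joints=2, n_perception_heads=3):
        super().__init__()
        # Simple CNN encoders standing in for whatever image backbones the paper uses.
        def make_image_encoder():
            return nn.Sequential(
                nn.Conv2d(3, 32, kernel_size=7, stride=4), nn.ReLU(),
                nn.AdaptiveAvgPool2d(4), nn.Flatten(),
                nn.Linear(32 * 16, feat_dim))

        self.head_cam_enc = make_image_encoder()
        self.torso_cam_enc = make_image_encoder()
        # Head joint angles (e.g., pan/tilt) embedded as one extra token.
        self.joint_embed = nn.Linear(n_joints, feat_dim)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=feat_dim, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=4)
        # "Multi-Perception-Head" stand-in: several heads, predictions averaged.
        self.perception_heads = nn.ModuleList(
            [nn.Linear(feat_dim, 3) for _ in range(n_perception_heads)])

    def forward(self, head_img, torso_img, joint_angles):
        # Build a 3-token sequence: head camera, torso camera, joint angles.
        tokens = torch.stack([
            self.head_cam_enc(head_img),
            self.torso_cam_enc(torso_img),
            self.joint_embed(joint_angles),
        ], dim=1)                        # (B, 3, feat_dim)
        fused = self.encoder(tokens)     # (B, 3, feat_dim)
        # Each head regresses a 3D tool-to-target offset from the first token.
        preds = torch.stack(
            [head(fused[:, 0]) for head in self.perception_heads], dim=0)
        return preds.mean(dim=0)         # (B, 3) relative position estimate


# Usage in a servo loop (also illustrative): estimate the offset, command a
# small end-effector correction, and repeat until the offset norm converges.
model = DistanceEstimationTransformerSketch()
offset = model(torch.randn(1, 3, 224, 224),
               torch.randn(1, 3, 224, 224),
               torch.randn(1, 2))
```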

Country of Origin
🇨🇳 China

Page Count
7 pages

Category
Computer Science:
Computer Vision and Pattern Recognition