A Novel Feedforward Youla Parameterization Method for Avoiding Local Minima in Stereo Image Based Visual Servoing Control
By: Rongfei Li, Francis Assadian
Potential Business Impact:
Helps robots see and move precisely.
In robot navigation and manipulation, accurately determining the camera's pose relative to the environment is crucial for effective task execution. In this paper, we systematically prove that this problem corresponds to the Perspective-3-Point (P3P) formulation, where exactly three known 3D points and their corresponding 2D image projections are used to estimate the pose of a stereo camera. In image-based visual servoing (IBVS) control, the system becomes overdetermined, as the 6 degrees of freedom (DoF) of the stereo camera must align with 9 observed 2D features in the scene. When more constraints are imposed than available DoFs, global stability cannot be guaranteed, as the camera may become trapped in a local minimum far from the desired configuration during servoing. To address this issue, we propose a novel control strategy for accurately positioning a calibrated stereo camera. Our approach integrates a feedforward controller with a Youla parameterization-based feedback controller, ensuring robust servoing performance. Through simulations, we demonstrate that our method effectively avoids local minima and enables the camera to reach the desired pose accurately and efficiently.
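To make the overdetermined-IBVS point concrete, below is a minimal, generic sketch of the classical point-feature visual servoing law v = -λ L⁺ (s - s*), not the paper's feedforward/Youla scheme. The interaction-matrix form and the normalized image coordinates, depths, and gain used here are standard textbook assumptions (Chaumette-style IBVS), chosen only to show how stacking more scalar features than the 6 camera DoFs yields a tall Jacobian whose least-squares update can stall in a local minimum.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Classical 2x6 interaction (image Jacobian) matrix for one normalized
    image point (x, y) at depth Z, mapping the camera twist to image motion."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, gain=0.5):
    """One step of the classical IBVS law v = -gain * L^+ (s - s*).

    With more scalar features than the 6 camera DoFs, L is tall and the
    pseudo-inverse gives only a least-squares update; the iteration can
    stop at a local minimum where the feature error is nonzero but lies
    in the left null space of L.
    """
    L_rows, s, s_star = [], [], []
    for (x, y), (xd, yd), Z in zip(features, desired, depths):
        L_rows.append(interaction_matrix(x, y, Z))
        s.extend([x, y])
        s_star.extend([xd, yd])
    L = np.vstack(L_rows)                        # (2k x 6) stacked Jacobian
    error = np.array(s) - np.array(s_star)
    return -gain * np.linalg.pinv(L) @ error     # 6-vector camera twist

# Toy usage (hypothetical values): three points observed off their targets.
feats   = [(0.10, 0.05), (-0.08, 0.12), (0.02, -0.07)]
desired = [(0.00, 0.00), (-0.10, 0.10), (0.05, -0.05)]
depths  = [1.0, 1.2, 0.9]
print(ibvs_velocity(feats, desired, depths))
```

This feedback-only law is the baseline whose local minima the paper's feedforward plus Youla-parameterized feedback design is meant to avoid.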
Similar Papers
Innovative Adaptive Imaged Based Visual Servoing Control of 6 DoFs Industrial Robot Manipulators
Robotics
Robots can find and grab things they can't see.
Geometric Visual Servo Via Optimal Transport
Robotics
Helps robots move objects more smoothly and accurately.
Efficient Self-Supervised Neuro-Analytic Visual Servoing for Real-time Quadrotor Control
CV and Pattern Recognition
Drone flies itself using only its camera.