Correlation-Aware Dual-View Pose and Velocity Estimation for Dynamic Robotic Manipulation
By: Mahboubeh Zarei, Robin Chhabra, Farrokh Janabi-Sharifi
Potential Business Impact:
Helps robots see and move objects precisely.
Accurate pose and velocity estimation is essential for effective spatial task planning in robotic manipulators. While centralized sensor fusion has traditionally been used to improve pose estimation accuracy, this paper presents a novel decentralized fusion approach that estimates both pose and velocity. We use dual-view measurements from an eye-in-hand camera mounted on the manipulator and a fixed eye-to-hand camera to track a target object whose motion is modeled as a random walk (stochastic acceleration model). The robot runs two independent adaptive extended Kalman filters, developed as part of this work, formulated on a matrix Lie group. These filters predict poses and velocities on the manifold $\mathbb{SE}(3) \times \mathbb{R}^3 \times \mathbb{R}^3$ and update the state on $\mathbb{SE}(3)$. The final fused state, comprising the target's fused pose and velocities, is obtained using a correlation-aware fusion rule on Lie groups. The proposed method is evaluated on a UFactory xArm 850 equipped with Intel RealSense cameras, tracking a moving target. Experimental results validate the effectiveness and robustness of the proposed decentralized dual-view estimation framework, showing consistent improvements over state-of-the-art methods.
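The abstract's key ingredient is fusing two filter outputs whose cross-correlation is unknown (the two cameras observe the same target, so their errors are not independent). The paper's fusion rule operates on Lie groups; as a minimal sketch of the underlying idea, the snippet below shows the classic linear-space analogue, covariance intersection, which stays consistent for any unknown correlation by mixing the two information matrices with a weight chosen to minimize the fused covariance trace. The function name and the example numbers are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2):
    """Correlation-robust fusion of two estimates (illustrative sketch).

    Covariance intersection: P_f^{-1} = w*P1^{-1} + (1-w)*P2^{-1},
    with w chosen to minimize trace(P_f). The result is a consistent
    fused estimate for ANY (unknown) cross-correlation between inputs.
    """
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)

    def fused_cov(w):
        return np.linalg.inv(w * I1 + (1.0 - w) * I2)

    # Coarse 1-D grid search for the trace-minimizing weight.
    ws = np.linspace(0.0, 1.0, 101)
    w = min(ws, key=lambda w: np.trace(fused_cov(w)))

    Pf = fused_cov(w)
    xf = Pf @ (w * I1 @ x1 + (1.0 - w) * I2 @ x2)
    return xf, Pf, w

# Two hypothetical local (tangent-space) estimates of the same target:
# each camera is accurate along a different axis.
x1, P1 = np.array([0.0, 0.0]), np.diag([1.0, 4.0])
x2, P2 = np.array([1.0, 1.0]), np.diag([4.0, 1.0])
xf, Pf, w = covariance_intersection(x1, P1, x2, P2)
# The fused mean leans toward each sensor along its accurate axis,
# and trace(Pf) is smaller than either input's trace.
```

In the paper's manifold setting, the same weighting would be applied in a local tangent space of $\mathbb{SE}(3)$ (errors expressed via the log map) rather than in a global vector space; this sketch only conveys the correlation-aware weighting itself.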
Similar Papers
A Geometric Approach For Pose and Velocity Estimation Using IMU and Inertial/Body-Frame Measurements
Systems and Control
Tracks a moving object's exact position and speed.
Uncertainty-Driven Radar-Inertial Fusion for Instantaneous 3D Ego-Velocity Estimation
Robotics
Helps self-driving cars know how fast they're going.
AI-Enhanced Kinematic Modeling of Flexible Manipulators Using Multi-IMU Sensor Fusion
Robotics
Helps robots know where their bendy arms are.