Vision-Aided Relative State Estimation for Approach and Landing on a Moving Platform with Inertial Measurements
By: Tarek Bouazza, Alessandro Melis, Soulaimane Berkane, and more
This paper addresses the problem of estimating the relative position, orientation, and velocity between a UAV and a planar platform undergoing arbitrary 3D motion during approach and landing. The estimation relies on measurements from Inertial Measurement Units (IMUs) mounted on both systems, exchanged over a suitable communication channel, together with visual information from an onboard monocular camera, from which the bearing (line-of-sight direction) to the platform's center and the normal vector of its planar surface are extracted. We propose a cascaded observer in which a complementary filter on SO(3) reconstructs the relative attitude and feeds a linear Riccati observer that estimates the relative position and velocity. Convergence of both observers is established under persistence-of-excitation conditions, and the cascade is shown to be almost globally asymptotically and locally exponentially stable. We further extend the design to the case where the platform's rotation is restricted to its normal axis and show that the platform's measured linear acceleration can be exploited to recover the rotation angle about that axis, which is otherwise unobservable. A sufficient condition ensuring local exponential convergence in this setting is provided. The performance of the proposed observers is validated through extensive simulations.
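As a rough illustration of the cascaded structure described in the abstract, the sketch below pairs a generic Mahony-style complementary filter on SO(3) with a deterministic Riccati observer for relative position and velocity driven by a bearing measurement. It is a minimal sketch under assumed frame conventions, measurement pairings, and tuning parameters (kP, Q, V), not the authors' exact observer; the function names attitude_filter_step and riccati_observer_step are hypothetical.

```python
# Hedged sketch (not the paper's exact design): a generic complementary
# filter on SO(3) cascaded with a Riccati observer for relative
# position/velocity using a bearing (line-of-sight) measurement.
import numpy as np

def skew(v):
    """Map a 3-vector to its 3x3 skew-symmetric (cross-product) matrix."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def expm_so3(w):
    """Rodrigues formula: exponential map from so(3) to SO(3)."""
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3)
    K = skew(w / th)
    return np.eye(3) + np.sin(th) * K + (1.0 - np.cos(th)) * (K @ K)

def attitude_filter_step(R_hat, omega_rel, pairs, kP, dt):
    """
    One Euler step of a Mahony-style complementary filter on SO(3).
    omega_rel : relative angular velocity expressed in the UAV frame
                (obtained from the two IMU gyro readings).
    pairs     : list of (r_i, b_i) with r_i a reference direction in the
                platform frame and b_i its measurement in the UAV frame
                (e.g. the platform normal and the line-of-sight direction).
    """
    sigma = np.zeros(3)
    for r_i, b_i in pairs:
        sigma += np.cross(b_i, R_hat.T @ r_i)   # attitude innovation term
    return R_hat @ expm_so3((omega_rel + kP * sigma) * dt)

def riccati_observer_step(x_hat, P, a_rel, bearing, dt,
                          V=1e-2 * np.eye(6), Q=10.0 * np.eye(3)):
    """
    One Euler step of a linear Riccati (Kalman-like) observer for the
    relative state x = [p; v], driven by the relative acceleration a_rel
    (reconstructed from both IMUs and the estimated attitude) and a unit
    bearing measurement d = p / |p|.  The projector Pi_d = I - d d^T
    annihilates p, so the "measured" output is identically zero and the
    innovation reduces to -C x_hat.
    """
    A = np.block([[np.zeros((3, 3)), np.eye(3)],
                  [np.zeros((3, 3)), np.zeros((3, 3))]])
    Pi_d = np.eye(3) - np.outer(bearing, bearing)
    C = np.hstack([Pi_d, np.zeros((3, 3))])
    u = np.concatenate([np.zeros(3), a_rel])

    K = P @ C.T @ Q
    x_hat = x_hat + dt * (A @ x_hat + u - K @ (C @ x_hat))
    P = P + dt * (A @ P + P @ A.T - P @ C.T @ Q @ C @ P + V)
    return x_hat, P
```

In a cascade of this kind, the attitude estimate from the first filter would be used to express both IMUs' accelerometer readings in a common frame, producing the relative acceleration a_rel that drives the second observer; persistence of excitation of the bearing direction is what keeps the Riccati gain well conditioned.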