Vision-Guided Optic Flow Navigation for Small Lunar Missions
By: Sean Cowan, Pietro Fanti, Leon B. S. Williams, and more
Potential Business Impact:
Helps moon robots land safely using a camera and a simple laser rangefinder.
Private lunar missions face the challenge of robust autonomous navigation while operating under stringent constraints on mass, power, and computational resources. This work proposes a motion-field inversion framework that uses optical flow and rangefinder-based depth estimation as a lightweight, CPU-based solution for egomotion estimation during lunar descent. We extend classical optical flow formulations by integrating them with depth modeling strategies tailored to the geometry of lunar/planetary approach, descent, and landing: specifically, planar and spherical terrain approximations parameterized by a laser rangefinder. Motion-field inversion is performed in a least-squares framework, using sparse optical flow features extracted with the pyramidal Lucas-Kanade algorithm. We verify our approach on synthetically generated lunar images over the challenging terrain of the lunar south pole, using CPU budgets compatible with small lunar landers. The results demonstrate accurate velocity estimation from approach to landing, with sub-10% error over complex terrain and errors on the order of 1% over more typical terrain, as well as runtimes suitable for real-time applications. This framework shows promise for enabling robust, lightweight on-board navigation for small lunar missions.
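To make the pipeline in the abstract concrete, here is a minimal, illustrative sketch (not the authors' implementation) of the same idea: track sparse features with pyramidal Lucas-Kanade, assign each feature a depth from a rangefinder-parameterized terrain model, and invert the motion-field equations by least squares. The function name `estimate_velocity`, the camera matrix `K`, the fronto-parallel planar depth model, and the assumption that rotation has already been compensated (e.g., from an IMU) are simplifications introduced here for illustration; the paper's planar and spherical terrain models would replace the constant-depth step.

```python
import cv2
import numpy as np


def estimate_velocity(prev_img, next_img, K, range_m, dt):
    """Sketch: estimate camera translational velocity (m/s) between two
    8-bit grayscale frames, assuming zero rotation and a fronto-parallel
    planar terrain at the rangefinder range `range_m` (hypothetical setup)."""
    # 1. Detect sparse corners and track them with pyramidal Lucas-Kanade.
    p0 = cv2.goodFeaturesToTrack(prev_img, maxCorners=200,
                                 qualityLevel=0.01, minDistance=10)
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_img, next_img, p0, None,
                                             winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1
    good0 = p0[ok].reshape(-1, 2)
    good1 = p1[ok].reshape(-1, 2)
    if len(good0) < 3:
        raise RuntimeError("too few tracked features for inversion")

    # 2. Convert pixel positions and flow to normalized image coordinates.
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    x = (good0[:, 0] - cx) / fx
    y = (good0[:, 1] - cy) / fy
    u = (good1[:, 0] - good0[:, 0]) / (fx * dt)  # normalized flow rate
    v = (good1[:, 1] - good0[:, 1]) / (fy * dt)

    # 3. Planar depth model: every feature at the rangefinder range.
    Z = np.full_like(x, range_m)

    # 4. Pure-translation motion-field model (one common sign convention):
    #    u = (x*Vz - Vx)/Z,  v = (y*Vz - Vy)/Z  -> linear in (Vx, Vy, Vz).
    n = len(x)
    A = np.zeros((2 * n, 3))
    A[:n, 0] = -1.0 / Z
    A[:n, 2] = x / Z
    A[n:, 1] = -1.0 / Z
    A[n:, 2] = y / Z
    b = np.concatenate([u, v])

    # 5. Least-squares inversion of the motion field.
    velocity, *_ = np.linalg.lstsq(A, b, rcond=None)
    return velocity  # (Vx, Vy, Vz) in m/s
```

Adding rotation terms or the spherical depth model only changes how the rows of the linear system are built; the least-squares structure of the inversion stays the same.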
Similar Papers
Vertical Planetary Landing on Sloped Terrain Using Optical Flow Divergence Estimates
Robotics
Lets small spacecraft land safely on bumpy ground.
Planar Velocity Estimation for Fast-Moving Mobile Robots Using Event-Based Optical Flow
Robotics
Helps fast-moving robots know their speed even on slippery ground.
Motion Aware ViT-based Framework for Monocular 6-DoF Spacecraft Pose Estimation
CV and Pattern Recognition
Helps robots know where they are in space.