CAO-RONet: A Robust 4D Radar Odometry with Exploring More Information from Low-Quality Points
By: Zhiheng Li, Yubo Cui, Ningyuan Huang, and more
Potential Business Impact:
Helps cars see better in fog and rain.
Recently, 4D millimetre-wave radar has exhibited more stable perception than LiDAR and cameras under adverse conditions (e.g., rain and fog). However, low-quality radar points hinder its application, especially in odometry, which requires dense and accurate matching. To fully explore the potential of 4D radar, we introduce a learning-based odometry framework that enables robust ego-motion estimation from limited and uncertain geometric information. First, for sparse radar points, we propose local completion to supplement missing structures and provide a denser guideline for aligning two frames. Then, a context-aware association with a hierarchical structure flexibly matches points of different scales aided by feature similarity, and improves local matching consistency through correlation balancing. Finally, we present a window-based optimizer that uses historical priors to establish coupled state estimation and correct inter-frame matching errors. The superiority of our algorithm is confirmed on the View-of-Delft dataset, achieving around a 50% performance improvement over previous approaches and delivering accuracy on par with LiDAR odometry. Our code will be made available.
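The abstract's second step, matching points between frames by feature similarity, can be illustrated with a toy sketch. The snippet below is not the paper's method (which uses a learned, hierarchical matcher with correlation balancing); it is a minimal, hypothetical stand-in that greedily pairs each current-frame point descriptor with the most cosine-similar previous-frame descriptor. All function and variable names are illustrative assumptions.

```python
import math


def cosine(a, b):
    """Cosine similarity between two feature vectors (0.0 if either is zero)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0


def associate(feats_prev, feats_cur):
    """Greedy feature-similarity association between two radar frames.

    For each point descriptor in the current frame, pick the
    previous-frame descriptor with the highest cosine similarity.
    Returns a list of (cur_index, prev_index, similarity) tuples.
    """
    matches = []
    for i, fc in enumerate(feats_cur):
        sims = [cosine(fc, fp) for fp in feats_prev]
        j = max(range(len(sims)), key=sims.__getitem__)
        matches.append((i, j, sims[j]))
    return matches


# Two previous-frame descriptors and two noisy current-frame descriptors:
matches = associate([[1.0, 0.0], [0.0, 1.0]], [[0.9, 0.1], [0.1, 0.9]])
print([(i, j) for i, j, _ in matches])  # → [(0, 0), (1, 1)]
```

In practice, sparse and noisy radar points make such greedy one-to-one matching unreliable, which is the motivation for the paper's local completion (densifying the points first) and correlation balancing (enforcing locally consistent matches).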
Similar Papers
Super4DR: 4D Radar-centric Self-supervised Odometry and Gaussian-based Map Optimization
Robotics
Lets cars see clearly in fog and rain.
DRO: Doppler-Aware Direct Radar Odometry
Robotics
Lets robots see through walls and bad weather.
Equi-RO: A 4D mmWave Radar Odometry via Equivariant Networks
Robotics
Helps robots see and move in bad weather.