Super4DR: 4D Radar-centric Self-supervised Odometry and Gaussian-based Map Optimization
By: Zhiheng Li, Weihua Wang, Qiang Shen, and others
Potential Business Impact:
Lets cars see clearly in fog and rain.
Conventional SLAM systems using visual or LiDAR data often struggle in poor lighting and severe weather. Although 4D radar is suited to such environments, its sparse and noisy point clouds hinder accurate odometry estimation, while the resulting radar maps suffer from obscure and incomplete structures. We therefore propose Super4DR, a 4D radar-centric framework for learning-based odometry estimation and Gaussian-based map optimization. First, we design a cluster-aware odometry network that incorporates object-level cues from clustered radar points for inter-frame matching, together with a hierarchical self-supervision mechanism that suppresses outliers through spatio-temporal consistency, knowledge transfer, and feature contrast. Second, we propose using 3D Gaussians as an intermediate map representation, coupled with a radar-specific growth strategy, selective separation, and multi-view regularization, to recover blurry and undetected map regions guided by image texture. Experiments show that Super4DR achieves a 67% performance gain over prior self-supervised methods, nearly matches supervised odometry, and narrows the map-quality gap with LiDAR while enabling multi-modal image rendering.
Similar Papers
4DRadar-GS: Self-Supervised Dynamic Driving Scene Reconstruction with 4D Radar
CV and Pattern Recognition
Helps self-driving cars see moving objects better.
Rad-GS: Radar-Vision Integration for 3D Gaussian Splatting SLAM in Outdoor Environments
CV and Pattern Recognition
Maps large outdoor areas accurately with radar.
CAO-RONet: A Robust 4D Radar Odometry with Exploring More Information from Low-Quality Points
Robotics
Helps cars see better in fog and rain.