RaFD: Flow-Guided Radar Detection for Robust Autonomous Driving
By: Shuocheng Yang, Zikun Xu, Jiahao Wang, and more
Potential Business Impact:
Helps self-driving cars see better with radar.
Radar has shown strong potential for robust perception in autonomous driving; however, raw radar images are frequently degraded by noise and "ghost" artifacts, making object detection based solely on semantic features highly challenging. To address this limitation, we introduce RaFD, a radar-based object detection framework that estimates inter-frame bird's-eye-view (BEV) flow and leverages the resulting geometric cues to enhance detection accuracy. Specifically, we design a supervised flow estimation auxiliary task that is jointly trained with the detection network. The estimated flow is further utilized to guide feature propagation from the previous frame to the current one. Our flow-guided, radar-only detector achieves state-of-the-art performance on the RADIATE dataset, underscoring the importance of incorporating geometric information to effectively interpret radar signals, which are inherently ambiguous in semantics.
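To make the flow-guided propagation step concrete: the idea is that a predicted per-cell BEV flow field tells the detector where each current-frame cell's content was located in the previous frame, so previous-frame features can be warped forward and fused with the current frame. RaFD's actual propagation module is not reproduced here; the following is a minimal NumPy sketch of generic flow-guided feature warping with nearest-neighbour sampling, where the function name and the (dy, dx) flow convention are illustrative assumptions.

```python
import numpy as np

def warp_bev_features(prev_feat, flow):
    """Warp previous-frame BEV features to the current frame.

    prev_feat: (C, H, W) BEV feature map from the previous frame.
    flow:      (2, H, W) per-cell displacement (dy, dx) pointing from each
               current-frame cell back to its source location last frame.
    Returns a (C, H, W) warped feature map (nearest-neighbour sampling;
    real systems typically use bilinear sampling instead).
    This is an illustrative sketch, not RaFD's actual module.
    """
    C, H, W = prev_feat.shape
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    # Follow the flow back to the source cell, clamped to the map border.
    src_y = np.clip(np.round(ys + flow[0]).astype(int), 0, H - 1)
    src_x = np.clip(np.round(xs + flow[1]).astype(int), 0, W - 1)
    return prev_feat[:, src_y, src_x]

# Sanity check: zero flow leaves the feature map unchanged.
feat = np.arange(12, dtype=float).reshape(1, 3, 4)
warped = warp_bev_features(feat, np.zeros((2, 3, 4)))
assert np.allclose(warped, feat)
```

In a detector, the warped map would then be fused (e.g. concatenated or summed) with the current frame's features before the detection head, so that geometric evidence from the previous scan supplements the semantically ambiguous current radar image.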
Similar Papers
RaLiFlow: Scene Flow Estimation with 4D Radar and LiDAR Point Clouds
CV and Pattern Recognition
Helps cars see moving objects better in bad weather.
Revisiting Radar Camera Alignment by Contrastive Learning for 3D Object Detection
CV and Pattern Recognition
Helps self-driving cars see better with radar and cameras.
DriveFlow: Rectified Flow Adaptation for Robust 3D Object Detection in Autonomous Driving
CV and Pattern Recognition
Makes self-driving cars see better in new places.