Robust sensor fusion against on-vehicle sensor staleness
By: Meng Fan, Yifan Zuo, Patrick Blaes, and more
Potential Business Impact:
Helps self-driving cars perceive reliably when sensor data arrives out of sync.
Sensor fusion is crucial for a performant and robust perception system in autonomous vehicles, but sensor staleness, where data from different sensors arrives with varying delays, poses significant challenges. Temporal misalignment between sensor modalities leads to inconsistent object state estimates, severely degrading the quality of trajectory predictions that are critical for safety. We present a novel, model-agnostic approach that addresses this problem via (1) a per-point timestamp offset feature (for both LiDAR and radar, relative to camera) that enables fine-grained temporal awareness in sensor fusion, and (2) a data augmentation strategy that simulates realistic sensor staleness patterns observed in deployed vehicles. Our method is integrated into a perspective-view detection model that consumes sensor data from multiple LiDARs, radars, and cameras. We demonstrate that while a conventional model regresses significantly when one sensor modality is stale, our approach maintains consistently strong performance across both synchronized and stale conditions.
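The paper does not include code; the following is a minimal NumPy sketch of the two ideas described in the abstract: appending a per-point timestamp offset (relative to the camera frame time) as an extra feature channel, and randomly delaying a sweep during training to mimic staleness. The function names, the 200 ms delay cap, and the 30% staleness probability are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def add_timestamp_offset_feature(points, point_timestamps, camera_timestamp):
    """Append a per-point timestamp offset (seconds, relative to the camera
    frame time) as an extra feature channel. `points` is an (N, C) array of
    LiDAR or radar point features."""
    offsets = (point_timestamps - camera_timestamp).reshape(-1, 1)
    return np.concatenate([points, offsets], axis=1)

def simulate_staleness(point_timestamps, max_delay_s=0.2, stale_prob=0.3, rng=None):
    """Training-time augmentation: with probability `stale_prob`, shift an
    entire sensor sweep back in time by a random delay to mimic a stale feed.
    The delay cap and probability here are assumed values."""
    rng = np.random.default_rng() if rng is None else rng
    if rng.random() < stale_prob:
        delay = rng.uniform(0.0, max_delay_s)
        return point_timestamps - delay
    return point_timestamps

# Example: a 4-feature LiDAR sweep of 5 points stamped slightly before the camera frame.
rng = np.random.default_rng(0)
lidar_points = rng.normal(size=(5, 4))
lidar_times = np.full(5, 10.00)   # all points stamped at t = 10.00 s
camera_time = 10.05               # camera frame arrives 50 ms later

stale_times = simulate_staleness(lidar_times, rng=rng)
fused_input = add_timestamp_offset_feature(lidar_points, stale_times, camera_time)
print(fused_input.shape)  # (5, 5): original features plus the offset channel
```

The offset channel lets the fusion model learn how much to trust and how far to extrapolate each point, while the augmentation exposes it to the staleness patterns it will encounter on deployed vehicles.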
Similar Papers
Impact of Temporal Delay on Radar-Inertial Odometry
Robotics
Studies how sensor timing delays affect radar-inertial odometry.
A Sensor-Aware Phenomenological Framework for Lidar Degradation Simulation and SLAM Robustness Evaluation
Robotics
Simulates lidar degradation to stress-test SLAM.
SAMFusion: Sensor-Adaptive Multimodal Fusion for 3D Object Detection in Adverse Weather
CV and Pattern Recognition
Helps self-driving cars see in fog and snow.