
Robust sensor fusion against on-vehicle sensor staleness

Published: June 6, 2025 | arXiv ID: 2506.05780v1

By: Meng Fan, Yifan Zuo, Patrick Blaes and more

Potential Business Impact:

Helps self-driving cars perceive their surroundings reliably even when data from some sensors arrives late.

Business Areas:
Smart Cities, Real Estate

Sensor fusion is crucial for a performant and robust perception system in autonomous vehicles, but sensor staleness, where data from different sensors arrives with varying delays, poses significant challenges. Temporal misalignment between sensor modalities leads to inconsistent object state estimates, severely degrading the quality of trajectory predictions that are critical for safety. We present a novel, model-agnostic approach that addresses this problem via (1) a per-point timestamp offset feature (for both LiDAR and radar, relative to camera) that enables fine-grained temporal awareness in sensor fusion, and (2) a data augmentation strategy that simulates realistic sensor staleness patterns observed in deployed vehicles. Our method is integrated into a perspective-view detection model that consumes sensor data from multiple LiDARs, radars, and cameras. We demonstrate that while a conventional model regresses significantly when one sensor modality is stale, our approach maintains consistently strong performance across both synchronized and stale conditions.
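To make idea (1) concrete, here is a minimal sketch of a per-point timestamp-offset feature, assuming LiDAR or radar points carry per-point capture times and the camera frame time serves as the reference. The function name, array layout, and units are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def add_staleness_offset_feature(points, point_timestamps, camera_timestamp):
    """Append a per-point timestamp-offset channel (hypothetical helper).

    points:           (N, C) array of LiDAR/radar point features
    point_timestamps: (N,) per-point capture times in seconds
    camera_timestamp: scalar capture time of the reference camera frame

    The offset (point time minus camera time) tells the fusion model how
    stale each point is relative to the camera, which is the fine-grained
    temporal awareness the paper describes.
    """
    offsets = (point_timestamps - camera_timestamp).astype(np.float32)
    return np.concatenate([points, offsets[:, None]], axis=1)

# Usage: lidar_feats = add_staleness_offset_feature(lidar_points, lidar_ts, cam_ts)
```

Idea (2) could then be exercised at training time. The sketch below, again under assumed names and data structures (a per-modality batch dict with a `"timestamps"` array), simulates staleness by randomly aging one modality; the paper's actual augmentation replays realistic delay patterns observed on deployed vehicles, so this is only a schematic stand-in.

```python
import numpy as np

def simulate_sensor_staleness(batch, rng, max_delay_s=0.2, p_stale=0.5):
    """Hypothetical training-time augmentation: with probability p_stale,
    pick one point-cloud modality and age it by a random delay.

    batch is assumed to look like
      {"lidar": {"timestamps": ...}, "radar": {...}, "camera_timestamp": t}
    """
    if rng.random() < p_stale:
        modality = rng.choice(["lidar", "radar"])
        delay = rng.uniform(0.0, max_delay_s)
        # Shift the modality's timestamps into the past; the offset feature
        # computed downstream then grows, as it would under a real delay.
        batch[modality]["timestamps"] = batch[modality]["timestamps"] - delay
    return batch

# Usage: batch = simulate_sensor_staleness(batch, np.random.default_rng(0))
```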
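Because the model sees stale inputs during training together with an explicit offset channel, it can learn to discount or re-time late points rather than treating all modalities as synchronized, which is what lets it hold up under both synchronized and stale conditions.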

Page Count
4 pages

Category
Computer Science: Computer Vision and Pattern Recognition