LiDAS: Lighting-driven Dynamic Active Sensing for Nighttime Perception
By: Simon de Moreau, Andrei Bursuc, Hafid El-Idrissi, and more
Potential Business Impact:
Makes car lights help cars see better at night.
Nighttime environments pose significant challenges for camera-based perception, as existing methods passively rely on the scene lighting. We introduce Lighting-driven Dynamic Active Sensing (LiDAS), a closed-loop active illumination system that combines off-the-shelf visual perception models with high-definition headlights. Rather than uniformly brightening the scene, LiDAS dynamically predicts an optimal illumination field that maximizes downstream perception performance, i.e., dimming empty areas to reallocate light onto object regions. LiDAS enables zero-shot nighttime generalization of daytime-trained models through adaptive illumination control. Trained on synthetic data and deployed zero-shot in real-world closed-loop driving scenarios, LiDAS achieves +18.7% mAP50 and +5.0% mIoU over standard low-beam at equal power. It maintains performance while reducing energy use by 40%. LiDAS complements domain-generalization methods, further strengthening robustness without retraining. By turning readily available headlights into active vision actuators, LiDAS offers a cost-effective solution to robust nighttime perception.
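The core idea of reallocating a fixed light budget from empty areas to detected object regions can be sketched as a simple redistribution rule over the headlight's beam grid. This is a minimal illustration, not the paper's learned policy: the function name, the grid shape, the minimum-floor parameter, and the proportional split are all assumptions made for the example.

```python
import numpy as np

def predict_illumination_field(object_mask, total_power, floor=0.2):
    """Hypothetical sketch of LiDAS-style light reallocation.

    Given a binary mask marking likely object cells in the headlight's
    beam grid, redistribute a fixed power budget toward those cells
    while keeping a minimum floor of light everywhere else.
    """
    # Start from a uniform field that spends the whole budget.
    n_cells = object_mask.size
    field = np.full(object_mask.shape, total_power / n_cells)

    weights = object_mask.astype(float)
    if weights.sum() == 0:  # no detections: keep the uniform beam
        return field

    # Reserve a fraction of the uniform level everywhere, then pour
    # the remaining budget into object cells proportionally.
    floor_field = floor * field
    budget = total_power - floor_field.sum()
    return floor_field + budget * weights / weights.sum()

# In a closed loop, the mask would come from a daytime-trained detector
# run on the current camera frame, and the field would drive the
# high-definition headlight's pixel intensities for the next frame.
mask = np.zeros((8, 16), dtype=bool)
mask[3:5, 6:10] = True  # a detected object region
field = predict_illumination_field(mask, total_power=100.0)
```

The total emitted power stays constant (matching the paper's equal-power comparison), while object cells receive far more light than empty ones.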
Similar Papers
Super LiDAR Reflectance for Robotic Perception
Robotics
Makes cheap sensors see like expensive ones.
DriveLiDAR4D: Sequential and Controllable LiDAR Scene Generation for Autonomous Driving
CV and Pattern Recognition
Creates realistic driving scenes for self-driving cars.
Navigating in the Dark: A Multimodal Framework and Dataset for Nighttime Traffic Sign Recognition
CV and Pattern Recognition
Helps cars see signs at night better.