Learning to Remove Lens Flare in Event Camera
By: Haiqian Han, Lingdong Kong, Jianing Li, and more
Event cameras have the potential to revolutionize vision systems with their high temporal resolution and dynamic range, yet they remain susceptible to lens flare, a fundamental optical artifact that causes severe degradation. In event streams, flare manifests as a complex spatio-temporal distortion that has been largely overlooked. We present E-Deflare, the first systematic framework for removing lens flare from event camera data. We first establish the theoretical foundation by deriving a physics-grounded forward model of the non-linear suppression mechanism. This insight enables the creation of the E-Deflare Benchmark, a comprehensive resource featuring a large-scale simulated training set, E-Flare-2.7K, and the first paired real-world test set, E-Flare-R, captured by our novel optical system. Building on this benchmark, we design E-DeflareNet, which achieves state-of-the-art restoration performance. Extensive experiments validate our approach and demonstrate clear benefits for downstream tasks. Code and datasets are publicly available.
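To make the "non-linear suppression mechanism" concrete, here is a minimal toy sketch (not the paper's actual forward model) of how a bright flare component can suppress events. It assumes the standard event-camera generation rule, in which an event fires where the change in log intensity exceeds a contrast threshold, and models the sensor's non-linearity as simple saturation: where flare pushes pixels to saturation, genuine intensity changes no longer produce events. All function names, the threshold value, and the saturation model are illustrative assumptions.

```python
import numpy as np

def simulate_events(log_I_prev, log_I_curr, threshold=0.2):
    """Standard event-generation rule: an event fires where the change in
    log intensity exceeds the contrast threshold; polarity is its sign."""
    diff = log_I_curr - log_I_prev
    fired = np.abs(diff) >= threshold
    return np.sign(diff) * fired

def apply_flare(I_scene, I_flare, I_sat=1.0):
    """Toy forward model (illustrative assumption): flare adds irradiance,
    then the sensor saturates. Saturated pixels stop changing over time,
    which non-linearly suppresses event generation in flare regions."""
    return np.minimum(I_scene + I_flare, I_sat)

eps = 1e-6  # avoid log(0)
I0 = np.full(8, 0.3)                  # scene at time t0
I1 = np.full(8, 0.5)                  # genuine brightening everywhere
flare = np.zeros(8)
flare[:4] = 0.9                       # strong flare on the left half

clean = simulate_events(np.log(I0 + eps), np.log(I1 + eps))
flared = simulate_events(np.log(apply_flare(I0, flare) + eps),
                         np.log(apply_flare(I1, flare) + eps))
# clean fires positive events at every pixel; under flare, the saturated
# left half emits no events while the right half is unaffected.
```

The point of the sketch is the asymmetry it produces: the degradation is not additive noise but a loss of real events, which is why restoration requires learning to recover suppressed signal rather than merely filtering spurious events.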