DriverGaze360: Omnidirectional Driver Attention with Object-Level Guidance
By: Shreedhar Govil, Didier Stricker, Jason Rambach
Potential Business Impact:
Helps cars track where drivers are looking in every direction around the vehicle.
Predicting driver attention is a critical problem for developing explainable autonomous driving systems and for understanding driver behavior in mixed human-autonomous vehicle traffic scenarios. Although significant progress has been made through large-scale driver attention datasets and deep learning architectures, existing works are constrained by a narrow frontal field of view and limited driving diversity. Consequently, they fail to capture the full spatial context of driving environments, especially during lane changes, turns, and interactions involving peripheral objects such as pedestrians or cyclists. In this paper, we introduce DriverGaze360, a large-scale 360$^\circ$ field-of-view driver attention dataset containing $\sim$1 million gaze-labeled frames collected from 19 human drivers, enabling comprehensive omnidirectional modeling of driver gaze behavior. Moreover, our panoramic attention prediction approach, DriverGaze360-Net, jointly learns attention maps and attended objects by employing an auxiliary semantic segmentation head, which improves spatial awareness and attention prediction across wide panoramic inputs. Extensive experiments demonstrate that DriverGaze360-Net achieves state-of-the-art attention prediction performance on multiple metrics on panoramic driving images. The dataset and method are available at https://av.dfki.de/drivergaze360.
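To make the joint-learning idea concrete, below is a minimal PyTorch sketch of the two-head design the abstract describes: a shared encoder feeds both an attention-map head and an auxiliary semantic segmentation head, trained with a weighted sum of a saliency loss and a pixel-wise cross-entropy loss. The class name DriverGaze360NetSketch, the encoder layout, the specific loss terms, and the weight aux_weight are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): shared encoder, attention head,
# and auxiliary segmentation head with a weighted multi-task loss.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DriverGaze360NetSketch(nn.Module):
    def __init__(self, num_seg_classes: int = 19, aux_weight: float = 0.4):
        super().__init__()
        self.aux_weight = aux_weight  # assumed auxiliary-loss weight
        # Shared convolutional encoder over the panoramic input.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # Head 1: single-channel driver-attention (saliency) map.
        self.attention_head = nn.Conv2d(128, 1, kernel_size=1)
        # Head 2 (auxiliary): per-pixel semantic logits over attended objects.
        self.seg_head = nn.Conv2d(128, num_seg_classes, kernel_size=1)

    def forward(self, x):
        feats = self.encoder(x)
        attn = self.attention_head(feats)  # B x 1 x h x w logits
        seg = self.seg_head(feats)         # B x C x h x w logits
        # Upsample both heads back to the input resolution.
        attn = F.interpolate(attn, size=x.shape[-2:], mode="bilinear",
                             align_corners=False)
        seg = F.interpolate(seg, size=x.shape[-2:], mode="bilinear",
                            align_corners=False)
        return attn, seg

    def loss(self, attn_logits, seg_logits, gaze_map, seg_labels):
        # KL divergence between predicted and ground-truth gaze distributions
        # (a common saliency loss), plus cross-entropy for segmentation.
        pred = F.log_softmax(attn_logits.flatten(1), dim=1)
        target = gaze_map.flatten(1)
        target = target / target.sum(dim=1, keepdim=True).clamp(min=1e-8)
        attn_loss = F.kl_div(pred, target, reduction="batchmean")
        seg_loss = F.cross_entropy(seg_logits, seg_labels)
        return attn_loss + self.aux_weight * seg_loss
```

In this kind of multi-task setup, the segmentation head acts purely as a training signal: supervising object labels forces the shared encoder to learn object-aware features, which in turn sharpens the attention predictions; the auxiliary head can be dropped at inference time.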
Similar Papers
VISTA: Vision-Language Imitation of Situational Thinking and Attention for Human-Like Driver Focus in Dynamic Environments
CV and Pattern Recognition
Predicts where drivers look using words.
EgoCampus: Egocentric Pedestrian Eye Gaze Model and Dataset
CV and Pattern Recognition
Helps computers guess where people look while walking.
Eyes on Target: Gaze-Aware Object Detection in Egocentric Video
CV and Pattern Recognition
Helps computers see what people are looking at.