Progressive Bird's Eye View Perception for Safety-Critical Autonomous Driving: A Comprehensive Survey
By: Yan Gong, Naibang Wang, Jianli Lu, and more
Potential Business Impact:
Surveys how to keep self-driving car perception safe and reliable in bad weather, occlusions, and dense traffic.
Bird's-Eye-View (BEV) perception has become a foundational paradigm in autonomous driving, enabling unified spatial representations that support robust multi-sensor fusion and multi-agent collaboration. As autonomous vehicles transition from controlled environments to real-world deployment, ensuring the safety and reliability of BEV perception in complex scenarios - such as occlusions, adverse weather, and dynamic traffic - remains a critical challenge. This survey provides the first comprehensive review of BEV perception from a safety-critical perspective, systematically analyzing state-of-the-art frameworks and implementation strategies across three progressive stages: single-modality vehicle-side, multimodal vehicle-side, and multi-agent collaborative perception. Furthermore, we examine public datasets encompassing vehicle-side, roadside, and collaborative settings, evaluating their relevance to safety and robustness. We also identify key open-world challenges - including open-set recognition, large-scale unlabeled data, sensor degradation, and inter-agent communication latency - and outline future research directions, such as integration with end-to-end autonomous driving systems, embodied intelligence, and large language models.
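To make the abstract's idea of a unified BEV spatial representation concrete, the sketch below scatters per-point features (e.g., from LiDAR returns or depth-lifted camera pixels) into a metric bird's-eye-view grid. This is a minimal illustration only: the function name, the 100 m x 100 m grid at 0.5 m resolution, and the average pooling are assumptions chosen for the example, not the specific methods reviewed in the survey.

import numpy as np

def points_to_bev(points_ego, features, grid_size=200, cell_m=0.5):
    """Average-pool per-point features into a square BEV grid (illustrative sketch).

    points_ego : (N, 3) array of x (right), y (forward), z (up) in meters,
                 expressed in the ego-vehicle frame
    features   : (N, C) array of per-point feature vectors
    """
    bev = np.zeros((grid_size, grid_size, features.shape[1]))
    counts = np.zeros((grid_size, grid_size, 1))
    half = grid_size * cell_m / 2.0  # grid spans [-half, +half) meters around the ego vehicle

    # Convert metric x/y positions to integer grid indices and drop points outside the grid.
    ix = np.floor((points_ego[:, 0] + half) / cell_m).astype(int)
    iy = np.floor((points_ego[:, 1] + half) / cell_m).astype(int)
    keep = (ix >= 0) & (ix < grid_size) & (iy >= 0) & (iy < grid_size)

    # Scatter-add features into their cells, then normalize by the per-cell point count.
    np.add.at(bev, (iy[keep], ix[keep]), features[keep])
    np.add.at(counts, (iy[keep], ix[keep]), 1.0)
    return bev / np.maximum(counts, 1.0)

# Example with random stand-in data: 1,000 points with 16-dim features -> (200, 200, 16) BEV map.
pts = np.random.uniform(-50.0, 50.0, size=(1000, 3))
feats = np.random.rand(1000, 16)
bev_map = points_to_bev(pts, feats)

The resulting grid is the kind of shared spatial canvas on which camera, LiDAR, and multi-agent features can be fused, which is why the survey treats the BEV representation as the common ground across its three perception stages.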
Similar Papers
An Initial Study of Bird's-Eye View Generation for Autonomous Vehicles using Cross-View Transformers
CV and Pattern Recognition
Helps self-driving cars see roads from above.
BEVCon: Advancing Bird's Eye View Perception with Contrastive Learning
CV and Pattern Recognition
Helps self-driving cars see better from above.
Camera-Only Bird's Eye View Perception: A Neural Approach to LiDAR-Free Environmental Mapping for Autonomous Vehicles
CV and Pattern Recognition
Cars see the world using only cameras.