To New Beginnings: A Survey of Unified Perception in Autonomous Vehicle Software
By: Loïc Stratil, Felix Fent, Esteban Rivera, and more
Potential Business Impact:
Helps self-driving cars see and understand their surroundings.
Autonomous vehicle perception typically relies on modular pipelines that decompose the task into detection, tracking, and prediction. While interpretable, these pipelines suffer from error accumulation and limited inter-task synergy. Unified perception has emerged as a promising paradigm that integrates these sub-tasks within a shared architecture, potentially improving robustness, contextual reasoning, and efficiency while retaining interpretable outputs. In this survey, we provide a comprehensive overview of unified perception, introducing a holistic and systematic taxonomy that categorizes methods along task integration, tracking formulation, and representation flow. We define three paradigms (Early, Late, and Full Unified Perception) and systematically review existing methods, their architectures, training strategies, datasets used, and open-source availability, while highlighting future research directions. This work establishes the first comprehensive framework for understanding and advancing unified perception, consolidates fragmented efforts, and guides future research toward more robust, generalizable, and interpretable perception.
Similar Papers
Perception in Plan: Coupled Perception and Planning for End-to-End Autonomous Driving
CV and Pattern Recognition
Helps self-driving cars plan safer routes.
UniUGP: Unifying Understanding, Generation, and Planning For End-to-end Autonomous Driving
CV and Pattern Recognition
Helps self-driving cars learn from more videos.
Systematic Literature Review on Vehicular Collaborative Perception -- A Computer Vision Perspective
CV and Pattern Recognition
Cars share what they see to drive safer.