Collaborative Perception Datasets for Autonomous Driving: A Review
By: Naibang Wang, Deyong Shang, Yan Gong, et al.
Potential Business Impact:
Helps self-driving cars share what they see.
Collaborative perception has attracted growing interest from academia and industry due to its potential to enhance perception accuracy, safety, and robustness in autonomous driving through multi-agent information fusion. With the advancement of Vehicle-to-Everything (V2X) communication, numerous collaborative perception datasets have emerged, varying in cooperation paradigms, sensor configurations, data sources, and application scenarios. However, the absence of systematic summarization and comparative analysis hinders effective resource utilization and standardization of model evaluation. As the first comprehensive review focused on collaborative perception datasets, this work reviews and compares existing resources from a multi-dimensional perspective. We categorize datasets based on cooperation paradigms, examine their data sources and scenarios, and analyze sensor modalities and supported tasks. A detailed comparative analysis is conducted across multiple dimensions. We also outline key challenges and future directions, including dataset scalability, diversity, domain adaptation, standardization, privacy, and the integration of large language models. To support ongoing research, we provide a continuously updated online repository of collaborative perception datasets and related literature: https://github.com/frankwnb/Collaborative-Perception-Datasets-for-Autonomous-Driving.
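The multi-agent information fusion the abstract refers to is often realized as "late fusion" in the cooperation paradigms these datasets target: each agent detects objects locally, shares the detections over V2X, and the ego vehicle transforms and merges them. A minimal sketch of that idea, with illustrative function names and a simple distance-based duplicate check (both assumptions, not from any specific dataset's toolkit):

```python
import math

def to_ego_frame(det, tx, ty, yaw):
    """Transform a collaborator's 2D detection (x, y) into the ego frame,
    given the collaborator's pose (tx, ty, yaw) relative to the ego."""
    x, y = det
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * x - s * y + tx, s * x + c * y + ty)

def late_fuse(ego_dets, collab_dets, pose, merge_radius=1.0):
    """Merge ego and collaborator detections, dropping collaborator
    detections that land within merge_radius of one already kept."""
    tx, ty, yaw = pose
    fused = list(ego_dets)
    for det in collab_dets:
        gx, gy = to_ego_frame(det, tx, ty, yaw)
        if all(math.hypot(gx - ex, gy - ey) > merge_radius
               for ex, ey in fused):
            fused.append((gx, gy))
    return fused

# A collaborator 10 m ahead with the same heading reports one duplicate
# of the ego's detection and one object outside the ego's view.
ego = [(5.0, 0.0)]
collab = [(-5.0, 0.0), (8.0, 2.0)]
print(late_fuse(ego, collab, pose=(10.0, 0.0, 0.0)))
# → [(5.0, 0.0), (18.0, 2.0)]
```

Intermediate- and early-fusion paradigms instead exchange features or raw sensor data, which is one of the axes along which the review categorizes the datasets.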
Similar Papers
Systematic Literature Review on Vehicular Collaborative Perception -- A Computer Vision Perspective
CV and Pattern Recognition
Cars share what they see to drive safer.
TruckV2X: A Truck-Centered Perception Dataset
Robotics
Helps big trucks see around blind spots.
When Autonomous Vehicle Meets V2X Cooperative Perception: How Far Are We?
Artificial Intelligence
Cars share senses to see farther, avoid crashes.