RealDriveSim: A Realistic Multi-Modal Multi-Task Synthetic Dataset for Autonomous Driving
By: Arpit Jadon, Haoran Wang, Phillip Thomas, and more
Potential Business Impact:
Creates realistic fake driving scenes for self-driving cars.
As perception models continue to develop, the need for large-scale datasets increases. However, data annotation remains far too expensive to scale effectively and meet this demand. Synthetic datasets offer a way to boost model performance at substantially reduced cost, yet existing synthetic datasets remain limited in scope and realism and are designed for specific tasks and applications. In this work, we present RealDriveSim, a realistic multi-modal synthetic dataset for autonomous driving that supports not only popular 2D computer vision applications but also their LiDAR counterparts, providing fine-grained annotations for up to 64 classes. We extensively evaluate our dataset across a wide range of applications and domains, demonstrating state-of-the-art results compared to existing synthetic benchmarks. The dataset is publicly available at https://realdrivesim.github.io/.
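To make the multi-modal, multi-task idea concrete, below is a minimal sketch of how one might load a single frame from such a dataset, pairing a camera image and its per-pixel labels with a LiDAR point cloud and per-point labels. The directory layout, file names, and array formats are assumptions for illustration only; they are not taken from the RealDriveSim release, so consult https://realdrivesim.github.io/ for the actual structure.

```python
# Hypothetical loader for a multi-modal synthetic driving dataset.
# File layout, names, and formats are assumptions for illustration only.
from pathlib import Path

import numpy as np
from PIL import Image


def load_sample(root: Path, frame_id: str) -> dict:
    """Load one frame's camera image, LiDAR points, and semantic labels."""
    # 2D modality: RGB image plus a per-pixel semantic label map
    # (class IDs in [0, 63] for a fine-grained 64-class scheme).
    image = np.array(Image.open(root / "camera" / f"{frame_id}.png"))
    sem_2d = np.array(Image.open(root / "semantic_2d" / f"{frame_id}.png"))

    # 3D modality: LiDAR point cloud stored as an (N, 4) float32 array of
    # (x, y, z, intensity), with a matching per-point class label array.
    points = np.load(root / "lidar" / f"{frame_id}.npy")
    sem_3d = np.load(root / "semantic_3d" / f"{frame_id}.npy")

    return {"image": image, "sem_2d": sem_2d, "points": points, "sem_3d": sem_3d}


if __name__ == "__main__":
    sample = load_sample(Path("/data/realdrivesim"), "000001")
    print(sample["image"].shape, sample["points"].shape)
```

The same frame thus serves both 2D tasks (e.g., semantic segmentation on the image) and their LiDAR counterparts (e.g., point cloud segmentation), which is what "multi-modal, multi-task" means in practice.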
Similar Papers
Evaluating the Impact of Synthetic Data on Object Detection Tasks in Autonomous Driving
CV and Pattern Recognition
Mix real and fake data to improve self-driving cars.
Rethinking Driving World Model as Synthetic Data Generator for Perception Tasks
CV and Pattern Recognition
Makes self-driving cars better at seeing tricky situations.
SynthDrive: Scalable Real2Sim2Real Sensor Simulation Pipeline for High-Fidelity Asset Generation and Driving Data Synthesis
CV and Pattern Recognition
Generates realistic simulated sensors and driving data for self-driving cars.