
DVS-PedX: Synthetic-and-Real Event-Based Pedestrian Dataset

Published: September 4, 2025 | arXiv ID: 2509.04117v1

By: Mustafa Sakhai, Kaung Sithu, Min Khant Soe Oke, and more

Potential Business Impact:

Helps autonomous vehicles detect pedestrians and anticipate crossings more reliably in low light and bad weather.

Business Areas:
Image Recognition, Data and Analytics, Software

Event cameras like Dynamic Vision Sensors (DVS) report micro-timed brightness changes instead of full frames, offering low latency, high dynamic range, and motion robustness. DVS-PedX (Dynamic Vision Sensor Pedestrian eXploration) is a neuromorphic dataset designed for pedestrian detection and crossing-intention analysis in normal and adverse weather, built from two complementary sources: (1) synthetic event streams generated in the CARLA simulator for controlled "approach-cross" scenes under varied weather and lighting; and (2) real-world JAAD dash-cam videos converted to event streams using the v2e tool, preserving natural behaviors and backgrounds. Each sequence includes paired RGB frames, per-frame DVS "event frames" (33 ms accumulations), and frame-level labels (crossing vs. not crossing). Raw AEDAT 2.0/AEDAT 4.0 event files, AVI DVS videos, and metadata are also provided for flexible re-processing. Baseline spiking neural networks (SNNs) using SpikingJelly illustrate dataset usability and reveal a sim-to-real gap, motivating domain adaptation and multimodal fusion. DVS-PedX aims to accelerate research in event-based pedestrian safety, intention prediction, and neuromorphic perception.
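To make the "33 ms event frame" construction and the SNN baseline idea concrete, here is a minimal sketch, not the authors' code: it assumes events are already decoded from the AEDAT files into (timestamp, x, y, polarity) arrays, assumes a DAVIS346-like 260x346 sensor resolution, and uses the spikingjelly.activation_based API; the tiny network below is illustrative only, not the paper's baseline architecture.

```python
# Minimal sketch (assumptions noted above), not the DVS-PedX reference pipeline.
import numpy as np
import torch
import torch.nn as nn
from spikingjelly.activation_based import neuron, surrogate, functional

def events_to_frames(t_us, x, y, p, sensor_hw=(260, 346), win_us=33_000):
    """Accumulate events into 2-channel (OFF/ON) count frames per 33 ms window."""
    H, W = sensor_hw
    n_win = int(np.ceil((t_us[-1] - t_us[0] + 1) / win_us))
    frames = np.zeros((n_win, 2, H, W), dtype=np.float32)
    w_idx = ((t_us - t_us[0]) // win_us).astype(np.int64)
    np.add.at(frames, (w_idx, p.astype(np.int64), y, x), 1.0)
    return frames  # shape: [T, 2, H, W]

class TinySNN(nn.Module):
    """Toy crossing / not-crossing classifier over a sequence of event frames."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 8, 3, stride=2, padding=1),
            neuron.LIFNode(surrogate_function=surrogate.ATan()),
            nn.AdaptiveAvgPool2d((8, 8)),
            nn.Flatten(),
            nn.Linear(8 * 8 * 8, 2),
            neuron.LIFNode(surrogate_function=surrogate.ATan()),
        )

    def forward(self, frames):          # frames: [T, 2, H, W]
        functional.reset_net(self.net)  # clear membrane state between sequences
        rate = 0.0
        for frame in frames:            # one SNN time step per 33 ms event frame
            rate = rate + self.net(frame.unsqueeze(0))
        return rate / frames.shape[0]   # averaged spike rate -> class scores

# Usage with synthetic events (replace with events parsed from DVS-PedX files):
t = np.sort(np.random.randint(0, 330_000, 50_000))
x = np.random.randint(0, 346, 50_000)
y = np.random.randint(0, 260, 50_000)
p = np.random.randint(0, 2, 50_000)
frames = torch.from_numpy(events_to_frames(t, x, y, p))
logits = TinySNN()(frames)             # [1, 2] scores: (not crossing, crossing)
```

Counting ON and OFF events separately per pixel is one common frame representation; the dataset's own AVI DVS videos and event frames may use a different accumulation or normalization, so treat this only as a starting point for re-processing the raw AEDAT streams.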

Country of Origin
🇵🇱 Poland

Page Count
12 pages

Category
Computer Science:
Computer Vision and Pattern Recognition