
SDTrack: A Baseline for Event-based Tracking via Spiking Neural Networks

Published: March 9, 2025 | arXiv ID: 2503.08703v3

By: Yimeng Shan, Zhenbang Ren, Haodi Wu, and more

Potential Business Impact:

Tracks moving objects faster while using less power.

Business Areas:
Image Recognition Data and Analytics, Software

Event cameras provide superior temporal resolution, dynamic range, power efficiency, and pixel bandwidth. Spiking Neural Networks (SNNs) naturally complement event data through discrete spike signals, making them ideal for event-based tracking. However, current approaches that combine Artificial Neural Networks (ANNs) and SNNs, along with suboptimal architectures, compromise energy efficiency and limit tracking performance. To address these limitations, we propose the first Transformer-based spike-driven tracking pipeline. Our Global Trajectory Prompt (GTP) method effectively captures global trajectory information and aggregates it with event streams into event images to enhance spatiotemporal representation. We then introduce SDTrack, a Transformer-based spike-driven tracker comprising a Spiking MetaFormer backbone and a tracking head that directly predicts normalized coordinates using spike signals. The framework is end-to-end and requires no data augmentation or post-processing. Extensive experiments demonstrate that SDTrack achieves state-of-the-art performance while maintaining the lowest parameter count and energy consumption across multiple event-based tracking benchmarks, establishing a solid baseline for future research in the field of neuromorphic vision.
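As a rough illustration of the pipeline the abstract describes, the minimal sketch below (not the authors' code; the names `events_to_image` and `SpikeBlock`, the two-channel polarity encoding, and the hard-threshold spike generation are illustrative assumptions) accumulates an event stream into an event image and passes it through a toy spike-driven block that emits binary spike outputs:

```python
# Minimal sketch, assuming a (x, y, t, polarity) event tensor and PyTorch.
# This is an illustration of the general idea, not the SDTrack implementation.
import torch
import torch.nn as nn


def events_to_image(events, height, width):
    """Accumulate events of shape (N, 4) = (x, y, t, polarity) into a
    2-channel event image (channel 0: negative polarity, 1: positive)."""
    img = torch.zeros(2, height, width)
    x = events[:, 0].long()
    y = events[:, 1].long()
    pol = (events[:, 3] > 0).long()
    img.index_put_((pol, y, x), torch.ones(len(events)), accumulate=True)
    return img


class SpikeBlock(nn.Module):
    """Toy spike-driven block: conv -> batch norm -> hard threshold.
    Real spiking backbones use stateful neuron models; this only shows
    that the block's output is a binary spike map."""

    def __init__(self, in_ch, out_ch, threshold=1.0):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.bn = nn.BatchNorm2d(out_ch)
        self.threshold = threshold

    def forward(self, x):
        membrane = self.bn(self.conv(x))
        return (membrane >= self.threshold).float()  # binary spikes


if __name__ == "__main__":
    # 1000 random events on a 64x64 sensor.
    ev = torch.rand(1000, 4)
    ev[:, 0] *= 63
    ev[:, 1] *= 63
    ev[:, 3] = torch.sign(ev[:, 3] - 0.5)
    frame = events_to_image(ev, 64, 64).unsqueeze(0)  # (1, 2, 64, 64)
    spikes = SpikeBlock(2, 8)(frame)
    print(spikes.shape, spikes.unique())
```

In the paper's actual pipeline, the GTP step additionally injects global trajectory information into these event images before the Spiking MetaFormer backbone and tracking head process them; the sketch only covers the basic event-to-image aggregation and the spike-driven nature of the blocks.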

Page Count
10 pages

Category
Computer Science:
Neural and Evolutionary Computing