SparseCoop: Cooperative Perception with Kinematic-Grounded Queries
By: Jiahao Wang, Zhongwei Jiang, Wenchao Sun, and more
Potential Business Impact:
Cars share data to see around corners.
Cooperative perception is critical for autonomous driving, overcoming the inherent limitations of a single vehicle, such as occlusions and constrained fields of view. However, current approaches that share dense Bird's-Eye-View (BEV) features are constrained by quadratically scaling communication costs and by a lack of flexibility and interpretability for precise alignment across asynchronous or disparate viewpoints. While emerging sparse query-based methods offer an alternative, they often suffer from inadequate geometric representations, suboptimal fusion strategies, and training instability. In this paper, we propose SparseCoop, a fully sparse cooperative perception framework for 3D detection and tracking that completely discards intermediate BEV representations. Our framework features a trio of innovations: a kinematic-grounded instance query that uses an explicit state vector with 3D geometry and velocity for precise spatio-temporal alignment; a coarse-to-fine aggregation module for robust fusion; and a cooperative instance denoising task to accelerate and stabilize training. Experiments on the V2X-Seq and Griffin datasets show that SparseCoop achieves state-of-the-art performance. Notably, it delivers this with superior computational efficiency, low transmission cost, and strong robustness to communication latency. Code is available at https://github.com/wang-jh18-SVM/SparseCoop.
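To make the idea of a kinematic-grounded query concrete, below is a minimal sketch of how an explicit state vector with 3D geometry and velocity could support spatio-temporal alignment across agents: velocity compensates for communication latency, and a relative pose maps the boxes into the ego frame. The state layout, function name, and arguments are illustrative assumptions, not the paper's actual implementation; see the repository linked above for the authors' code.

```python
import numpy as np

# Hypothetical layout of a kinematic-grounded query state (the paper's exact
# parameterization may differ): center (x, y, z), box size (w, l, h),
# heading (yaw), and velocity (vx, vy, vz).
POS, SIZE, YAW, VEL = slice(0, 3), slice(3, 6), slice(6, 7), slice(7, 10)

def align_queries(states, dt, T_src_to_ego):
    """Spatio-temporally align received query states to the ego frame.

    states:       (N, 10) array of state vectors in the sender's frame
    dt:           communication latency in seconds (sender time -> ego time)
    T_src_to_ego: (4, 4) homogeneous transform from sender frame to ego frame
    """
    aligned = states.copy()

    # Temporal alignment: propagate each box center forward by its velocity,
    # compensating for the latency between sender and ego timestamps.
    aligned[:, POS] = states[:, POS] + dt * states[:, VEL]

    # Spatial alignment: map centers into the ego frame with the relative pose.
    centers_h = np.concatenate([aligned[:, POS], np.ones((len(states), 1))], axis=1)
    aligned[:, POS] = (centers_h @ T_src_to_ego.T)[:, :3]

    # Rotate heading and velocity by the yaw component of the relative pose.
    d_yaw = np.arctan2(T_src_to_ego[1, 0], T_src_to_ego[0, 0])
    aligned[:, YAW] = states[:, YAW] + d_yaw
    c, s = np.cos(d_yaw), np.sin(d_yaw)
    vx, vy = states[:, 7], states[:, 8]
    aligned[:, 7] = c * vx - s * vy
    aligned[:, 8] = s * vx + c * vy
    return aligned
```

Because alignment operates on a small set of explicit per-instance states rather than dense BEV feature maps, transmission cost stays low and latency compensation reduces to a simple, interpretable kinematic update.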
Similar Papers
SparseAlign: A Fully Sparse Framework for Cooperative Object Detection
CV and Pattern Recognition
Helps self-driving cars see farther and safer.
SlimComm: Doppler-Guided Sparse Queries for Bandwidth-Efficient Cooperative 3-D Perception
CV and Pattern Recognition
Cars share less data, see around corners.
Vision-Only Gaussian Splatting for Collaborative Semantic Occupancy Prediction
CV and Pattern Recognition
Cars share what they see to understand surroundings better.