SlimComm: Doppler-Guided Sparse Queries for Bandwidth-Efficient Cooperative 3-D Perception
By: Melih Yazgan, Qiyuan Wu, Iramm Hamdard, and more
Potential Business Impact:
Cars share less data and still see around corners.
Collaborative perception allows connected autonomous vehicles (CAVs) to overcome occlusion and limited sensor range by sharing intermediate features. Yet transmitting dense Bird's-Eye-View (BEV) feature maps can overwhelm the bandwidth available for inter-vehicle communication. We present SlimComm, a communication-efficient framework that integrates 4D radar Doppler with a query-driven sparse scheme. SlimComm builds a motion-centric dynamic map to distinguish moving from static objects and generates two query types: (i) reference queries on dynamic and high-confidence regions, and (ii) exploratory queries probing occluded areas via a two-stage offset. Only query-specific BEV features are exchanged and fused through multi-scale gated deformable attention, reducing payload while preserving accuracy. For evaluation, we release OPV2V-R and Adver-City-R, CARLA-based datasets with per-point Doppler radar. SlimComm achieves up to 90% lower bandwidth than full-map sharing while matching or surpassing prior baselines across varied traffic densities and occlusions. Dataset and code will be available at: https://url.fzi.de/SlimComm.
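To make the query-driven sparsification idea concrete, here is a minimal sketch of the bandwidth-saving step described in the abstract: select sparse query locations from a Doppler-derived dynamic map and a confidence map, then transmit only the BEV features at those locations instead of the dense map. This is an illustrative assumption, not the authors' code; all names (select_reference_queries, pack_payload), grid sizes, and thresholds are made up, and the paper's two-stage exploratory offsets and multi-scale gated deformable attention fusion are not reproduced here.

```python
# Minimal sketch (not the SlimComm implementation): send only query-specific
# BEV features chosen from a Doppler-based dynamic map and a confidence map.
import numpy as np

H, W, C = 128, 128, 64          # assumed BEV grid size and channel count

def select_reference_queries(dynamic_map, confidence_map,
                             conf_thresh=0.5, max_queries=256):
    """Pick BEV cells that are moving (per radar Doppler) or confidently occupied."""
    keep = (dynamic_map > 0) | (confidence_map > conf_thresh)
    ys, xs = np.nonzero(keep)
    if len(ys) > max_queries:                       # keep the highest-confidence cells
        order = np.argsort(confidence_map[ys, xs])[::-1][:max_queries]
        ys, xs = ys[order], xs[order]
    return np.stack([ys, xs], axis=1)               # (N, 2) query coordinates

def pack_payload(bev_features, query_coords):
    """Exchange only the features at query locations, not the dense BEV map."""
    feats = bev_features[query_coords[:, 0], query_coords[:, 1]]    # (N, C)
    return {"coords": query_coords.astype(np.int16),
            "feats": feats.astype(np.float16)}

# Toy example: random features, a small patch of "moving" cells, low confidence elsewhere.
rng = np.random.default_rng(0)
bev = rng.standard_normal((H, W, C)).astype(np.float32)
dynamic = np.zeros((H, W), dtype=np.uint8)
dynamic[40:44, 60:64] = 1                           # cells flagged as moving by Doppler
confidence = rng.random((H, W)) * 0.4               # stays below the threshold here

payload = pack_payload(bev, select_reference_queries(dynamic, confidence))
dense_bytes = bev.astype(np.float16).nbytes
sparse_bytes = payload["feats"].nbytes + payload["coords"].nbytes
print(f"dense: {dense_bytes/1e6:.2f} MB, sparse: {sparse_bytes/1e6:.3f} MB "
      f"({100 * (1 - sparse_bytes / dense_bytes):.1f}% saved)")
```

In this toy setup only a handful of cells are transmitted, which is the same mechanism behind the up-to-90% bandwidth reduction reported in the abstract; the real system additionally probes occluded regions with exploratory queries before fusing the received features.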
Similar Papers
Communication-Efficient Multi-Agent 3D Detection via Hybrid Collaboration
CV and Pattern Recognition
Cars share info to see better with less data.
Vision-Only Gaussian Splatting for Collaborative Semantic Occupancy Prediction
CV and Pattern Recognition
Cars share what they see to understand surroundings better.
CoVeRaP: Cooperative Vehicular Perception through mmWave FMCW Radars
CV and Pattern Recognition
Cars see better together, even in bad weather.