InfoCom: Kilobyte-Scale Communication-Efficient Collaborative Perception with Information Bottleneck
By: Quanmin Wei, Penglin Dai, Wei Li, and more
Potential Business Impact:
Cars share less data to see better.
Precise environmental perception is critical to the reliability of autonomous driving systems. While collaborative perception mitigates the limitations of single-agent perception through information sharing, it faces a fundamental communication-performance trade-off. Existing communication-efficient approaches typically assume MB-level data transmission per collaboration, which may fail under practical network constraints. To address these issues, we propose InfoCom, an information-aware framework that establishes the first theoretical foundation for communication-efficient collaborative perception via extended Information Bottleneck principles. Departing from mainstream feature-manipulation approaches, InfoCom introduces a novel information purification paradigm that optimizes the extraction of minimal sufficient task-critical information under Information Bottleneck constraints. Its core innovations are: i) an Information-Aware Encoding module that condenses features into minimal messages while preserving perception-relevant information; ii) a Sparse Mask Generation module that identifies spatial cues at negligible communication cost; and iii) a Multi-Scale Decoding module that progressively recovers perceptual information through mask-guided mechanisms rather than simple feature reconstruction. Comprehensive experiments across multiple datasets demonstrate that InfoCom achieves near-lossless perception while reducing communication overhead from megabyte- to kilobyte-scale, a 440-fold and 90-fold reduction per agent compared with Where2comm and ERMVP, respectively.
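For intuition, the sketch below is an illustrative toy example, not the authors' implementation: all shapes, the confidence scores, the top-k budget, and the fp16 quantization are assumptions. It only shows how transmitting a sparse set of masked feature cells, instead of a dense BEV feature map, moves per-agent message size from megabytes toward kilobytes.

# Illustrative sketch (assumptions throughout): sparse spatial masking of a
# per-agent feature message. Not the InfoCom modules themselves.
import numpy as np

rng = np.random.default_rng(0)

# Dense BEV feature map an agent would otherwise broadcast: C x H x W, float32.
C, H, W = 64, 200, 200
features = rng.standard_normal((C, H, W), dtype=np.float32)
dense_bytes = features.nbytes                      # ~10.2 MB per agent

# A hypothetical confidence head scores each spatial cell; keep only the
# top-k cells, standing in for the "minimal sufficient" task-critical cues.
confidence = rng.random((H, W))
k = 512                                            # assumed budget of kept cells
flat_idx = np.argsort(confidence, axis=None)[-k:]  # indices of selected cells
mask = np.zeros(H * W, dtype=bool)
mask[flat_idx] = True

# Message = selected feature columns (quantized to fp16) plus their cell indices.
selected = features.reshape(C, -1)[:, mask].astype(np.float16)
indices = flat_idx.astype(np.uint32)
message_bytes = selected.nbytes + indices.nbytes   # on the order of tens of KB

print(f"dense:   {dense_bytes / 1e6:.1f} MB")
print(f"message: {message_bytes / 1e3:.1f} KB "
      f"({dense_bytes / message_bytes:.0f}x smaller)")

In InfoCom itself, the condensed features and the sparse mask come from learned modules trained under Information Bottleneck constraints, so the retained cells carry task-critical information rather than arbitrarily scored samples.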
Similar Papers
Multi-Modal Multi-Task Semantic Communication: A Distributed Information Bottleneck Perspective
Information Theory
Sends messages with less data, keeping meaning.
Lightweight Task-Oriented Semantic Communication Empowered by Large-Scale AI Models
Machine Learning (CS)
Makes AI communication faster and smarter.