Semantic Communication with Distribution Learning through Sequential Observations
By: Samer Lahoud, Kinda Khawam
Potential Business Impact:
Helps computers understand messages, not just words.
Semantic communication aims to convey meaning rather than achieve bit-perfect reproduction, marking a paradigm shift from traditional communication. This paper investigates distribution learning in semantic communication, where receivers must infer the underlying meaning distribution through sequential observations. While semantic communication traditionally optimizes the transmission of individual meanings, we establish fundamental conditions for learning source statistics when priors are unknown. We prove that learnability requires the effective transmission matrix to have full rank, characterize the convergence rate of distribution estimation, and quantify how estimation errors translate into semantic distortion. Our analysis reveals a fundamental trade-off: encoding schemes optimized for immediate semantic performance often sacrifice long-term learnability. Experiments on CIFAR-10 validate our theoretical framework, demonstrating that system conditioning critically impacts both the learning rate and the achievable performance. These results provide the first rigorous characterization of statistical learning in semantic communication and offer design principles for systems that balance immediate performance with adaptation capability.
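To make the learnability condition concrete, here is a minimal numerical sketch, not taken from the paper: all names (B, p_true, q_hat, K) are illustrative assumptions. The receiver accumulates sequential observations, forms the empirical output distribution, and inverts the effective transmission matrix to recover the unknown meaning prior, which is only possible when that matrix has full rank, and the quality of the estimate degrades as the matrix becomes ill-conditioned.

```python
import numpy as np

# Hypothetical setup: a source emits one of K meanings with unknown
# distribution p. The end-to-end encode/transmit/decode pipeline acts as
# an effective transmission matrix B (column-stochastic), so observed
# symbols follow the output distribution q = B @ p.
rng = np.random.default_rng(0)
K = 4
p_true = np.array([0.4, 0.3, 0.2, 0.1])   # unknown source (meaning) distribution

# Full column rank of B is the learnability condition highlighted above.
B = np.array([[0.7, 0.1, 0.1, 0.1],
              [0.1, 0.7, 0.1, 0.1],
              [0.1, 0.1, 0.7, 0.1],
              [0.1, 0.1, 0.1, 0.7]])
assert np.linalg.matrix_rank(B) == K, "rank-deficient B: p is not identifiable"

# Sequential observations: sample meanings, pass each through the channel.
n = 20000
meanings = rng.choice(K, size=n, p=p_true)
outputs = np.array([rng.choice(K, p=B[:, m]) for m in meanings])

# Empirical output distribution, then invert B to estimate the source prior.
q_hat = np.bincount(outputs, minlength=K) / n
p_hat = np.linalg.solve(B, q_hat)

# Estimation error shrinks roughly like O(1/sqrt(n)), amplified by the
# conditioning of B (a worse-conditioned B slows effective learning).
print("estimated p:", np.round(p_hat, 3))
print("L1 error   :", np.abs(p_hat - p_true).sum())
print("cond(B)    :", np.linalg.cond(B))
```

Replacing B with a nearly singular matrix in this sketch shows the trade-off the abstract describes: the same number of observations yields a much noisier estimate of the meaning distribution.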
Similar Papers
Deep Semantic Inference over the Air: An Efficient Task-Oriented Communication System
Information Theory
Makes wireless devices smarter, faster, and use less power.
Semantic Communication in Dynamic Channel Scenarios: Collaborative Optimization of Dual-Pipeline Joint Source-Channel Coding and Personalized Federated Learning
Image and Video Processing
Makes wireless messages smarter for everyone.
Semantic Communication: From Philosophical Conceptions Towards a Mathematical Framework
Information Theory
Makes computers understand meaning, not just words.