Score: 2

Distributionally Robust Wireless Semantic Communication with Large AI Models

Published: May 28, 2025 | arXiv ID: 2506.03167v1

By: Long Tan Le, Senura Hansaja Wanasekara, Zerun Niu, and more

BigTech Affiliations: Princeton University

Potential Business Impact:

Keeps wireless messages understandable even when transmission errors occur.

Business Areas:
Semantic Web, Internet Services

6G wireless systems are expected to support massive volumes of data with ultra-low latency. However, conventional bit-level transmission strategies cannot deliver the efficiency and adaptability required by modern, data-intensive applications. Semantic communication (SemCom) addresses this limitation by transmitting task-relevant semantic information instead of raw data. While recent efforts incorporating deep learning and large-scale AI models have improved SemCom's performance, existing systems remain vulnerable to both semantic-level and transmission-level noise, in part because they rely on domain-specific architectures that limit generalizability. In this paper, a generalized semantic communication framework called WaSeCom is proposed to systematically address uncertainty and enhance robustness. In particular, Wasserstein distributionally robust optimization is employed to provide resilience against semantic misinterpretation and channel perturbations. A rigorous theoretical analysis establishes the robust generalization guarantees of the proposed framework. Experimental results on image and text transmission demonstrate that WaSeCom achieves improved robustness under noise and adversarial perturbations, highlighting its effectiveness in preserving semantic fidelity across varying wireless conditions.
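
For context, Wasserstein distributionally robust optimization typically replaces empirical risk minimization with a worst-case objective over a Wasserstein ball. The formulation below is a generic sketch of that idea, not the paper's exact objective; the loss \ell, encoder-decoder map f_\theta, radius \rho, and empirical distribution \hat{P}_n are placeholder notation:

\min_{\theta} \; \sup_{Q \,:\, W_p(Q, \hat{P}_n) \le \rho} \; \mathbb{E}_{(x, y) \sim Q}\!\left[ \ell\big( f_\theta(x), y \big) \right]

Here \hat{P}_n is the empirical distribution of training samples (for instance, semantic features observed over the channel), W_p is the order-p Wasserstein distance, and the inner supremum requires the model to perform well on every distribution within radius \rho of the data, which is the mechanism by which robustness to distribution shifts such as semantic misinterpretation and channel perturbations is obtained.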

Country of Origin
🇸🇬 🇰🇷 🇺🇸 🇦🇺 Singapore, Australia, Republic of Korea, United States

Page Count
16 pages

Category
Computer Science:
Networking and Internet Architecture