Large-Scale Model Enabled Semantic Communication Based on Robust Knowledge Distillation
By: Kuiyuan Ding, Caili Guo, Yang Yang, and more
Potential Business Impact:
Shrinks big AI for noise-proof smart messaging
Large-scale models (LSMs) offer an effective framework for semantic representation and understanding, making them a suitable tool for designing semantic communication (SC) systems. However, their direct deployment is often hindered by high computational complexity and resource requirements. In this paper, a novel robust knowledge distillation based semantic communication (RKD-SC) framework is proposed to enable efficient and channel-noise-robust LSM-powered SC. The framework addresses two key challenges: determining optimal compact model architectures and effectively transferring knowledge while maintaining robustness against channel noise. First, a knowledge distillation-based lightweight differentiable architecture search (KDL-DARTS) algorithm is proposed. This algorithm integrates knowledge distillation loss and a complexity penalty into the neural architecture search process to identify high-performance, lightweight semantic encoder architectures. Second, a novel two-stage robust knowledge distillation (RKD) algorithm is developed to transfer semantic capabilities from an LSM (teacher) to a compact encoder (student) and subsequently enhance system robustness. To further improve resilience to channel impairments, a channel-aware transformer (CAT) block is introduced as the channel codec, trained under diverse channel conditions with variable-length outputs. Extensive simulations on image classification tasks demonstrate that the RKD-SC framework significantly reduces model parameters while preserving a high degree of the teacher model's performance and exhibiting superior robustness compared to existing methods.
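As a rough illustration of the ideas summarized in the abstract (not the authors' implementation), the PyTorch-style sketch below combines a soft-label distillation loss with an expected-complexity penalty, in the spirit of adding a knowledge distillation term and a complexity penalty to a DARTS-style architecture search, and adds a simple AWGN injection function as a stand-in for training under varying channel conditions. All names, hyperparameters (temperature, alpha, lam), and the FLOPs accounting are illustrative assumptions.

import torch
import torch.nn.functional as F

def kd_nas_loss(student_logits, teacher_logits, labels, arch_weights, op_flops,
                temperature=4.0, alpha=0.5, lam=1e-9):
    # Task loss + distillation loss + expected-complexity penalty (illustrative).
    # Standard cross-entropy on the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label distillation from the large-scale teacher model.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Expected cost of the searched cell: softmaxed architecture weights
    # (candidate-operation mixing coefficients) weighted by each operation's FLOPs.
    complexity = (F.softmax(arch_weights, dim=-1) * op_flops).sum()
    return (1 - alpha) * ce + alpha * kd + lam * complexity

def awgn(x, snr_db):
    # Inject additive white Gaussian noise at a given SNR in dB, e.g. to expose
    # the channel codec to diverse channel conditions during robustness training.
    signal_power = x.pow(2).mean()
    noise_power = signal_power / (10 ** (snr_db / 10.0))
    return x + torch.randn_like(x) * noise_power.sqrt()

In such a setup, the snr_db value would typically be sampled over a range at each training step so the compact encoder and channel codec see many channel conditions rather than a single fixed one.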
Similar Papers
Lightweight Task-Oriented Semantic Communication Empowered by Large-Scale AI Models
Machine Learning (CS)
Makes AI communication faster and smarter.
Knowledge Distillation and Dataset Distillation of Large Language Models: Emerging Trends, Challenges, and Future Directions
Computation and Language
Makes big AI models smaller and faster.