TCLNet: A Hybrid Transformer-CNN Framework Leveraging Language Models as Lossless Compressors for CSI Feedback
By: Zijiu Yang, Qianqian Yang, Shunpu Tang, et al.
In frequency division duplexing (FDD) massive multiple-input multiple-output (MIMO) systems, downlink channel state information (CSI) is crucial for achieving high spectrum and energy efficiency. However, CSI feedback overhead becomes a major bottleneck as the number of antennas grows. Although existing deep learning-based CSI compression methods have shown great potential, they struggle to capture both the local and global features of CSI, which limits the achievable compression efficiency. To address these issues, we propose TCLNet, a unified CSI compression framework that integrates a hybrid Transformer-CNN architecture for lossy compression with a hybrid language model (LM) and factorized model (FM) design for lossless compression. The lossy module jointly exploits local features and global context, while the lossless module adaptively switches between context-aware coding and parallel coding to optimize the rate-distortion-complexity (RDC) trade-off. Extensive experiments on both real-world and simulated datasets demonstrate that TCLNet outperforms existing approaches in reconstruction accuracy and transmission efficiency, achieving up to a 5 dB performance gain across diverse scenarios. Moreover, we show that large language models (LLMs) can serve as zero-shot lossless CSI compressors via carefully designed prompts.
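To make the hybrid lossy design concrete, below is a minimal PyTorch sketch of one parallel CNN-attention block: a depthwise convolution branch for local CSI structure alongside self-attention for global context, fused residually. The layer choices, dimensions, and fusion scheme are illustrative assumptions, not the paper's exact TCLNet architecture.

```python
import torch
import torch.nn as nn

class HybridBlock(nn.Module):
    """Illustrative hybrid block: a depthwise-CNN branch for local CSI
    features in parallel with multi-head self-attention for global
    context. Shapes and hyperparameters are assumptions for this sketch."""

    def __init__(self, dim: int = 64, heads: int = 4):
        super().__init__()
        self.local = nn.Sequential(                      # local feature branch
            nn.Conv2d(dim, dim, 3, padding=1, groups=dim),
            nn.Conv2d(dim, dim, 1),
            nn.GELU(),
        )
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W), e.g. real/imag CSI matrix embedded to C channels
        b, c, h, w = x.shape
        local = self.local(x)                            # local branch
        tokens = self.norm(x.flatten(2).transpose(1, 2)) # (B, H*W, C)
        g, _ = self.attn(tokens, tokens, tokens)         # global branch
        global_ = g.transpose(1, 2).reshape(b, c, h, w)
        return x + local + global_                       # residual fusion
```

Stacking such blocks in an encoder lets the convolution capture fine angular-delay structure while attention relates distant entries of the CSI matrix, which is the intuition behind combining both feature types.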
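The LM-based lossless stage and the zero-shot LLM result both rest on the standard link between language modeling and compression: with arithmetic coding, a sequence x can be stored in roughly -sum_t log2 p(x_t | x_<t) bits. The sketch below estimates that code length for a serialized, quantized CSI latent; the HuggingFace-style model(...).logits interface and the tokenization are assumptions, and a practical coder would pair these probabilities with an arithmetic encoder, falling back to the parallel FM when the LM's compute cost exceeds the budget, per the RDC trade-off.

```python
import math
import torch

@torch.no_grad()
def lm_code_length_bits(model, token_ids: torch.Tensor) -> float:
    """Ideal lossless code length of a token sequence under a causal LM.

    token_ids: 1-D LongTensor of serialized/quantized CSI symbols.
    Assumes a HuggingFace-style model whose forward pass returns
    `.logits` of shape (batch, seq_len, vocab). The first symbol is
    treated as given (e.g., a BOS/prompt token) and not charged bits.
    """
    logits = model(token_ids.unsqueeze(0)).logits[0]   # (T, vocab)
    logp = torch.log_softmax(logits[:-1], dim=-1)      # step t predicts t+1
    nats = -logp[torch.arange(token_ids.numel() - 1), token_ids[1:]].sum()
    return nats.item() / math.log(2)                   # nats -> bits
```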