Score: 2

SL-ACC: A Communication-Efficient Split Learning Framework with Adaptive Channel-wise Compression

Published: August 18, 2025 | arXiv ID: 2508.12984v1

By: Zehang Lin, Zheng Lin, Miao Yang, and more

Potential Business Impact:

Speeds up AI training across many edge devices by compressing the data they exchange with the server.

The increasing complexity of neural networks poses a significant barrier to deploying distributed machine learning (ML), such as federated learning (FL), on resource-constrained devices. Split learning (SL) offers a promising solution by offloading the primary computing load from edge devices to a server via model partitioning. However, as the number of participating devices grows, the transmission of excessive smashed data (i.e., activations and gradients) becomes a major bottleneck for SL, slowing down model training. To tackle this challenge, we propose a communication-efficient SL framework, named SL-ACC, which comprises two key components: adaptive channel importance identification (ACII) and channel grouping compression (CGC). ACII first uses Shannon entropy to identify the contribution of each channel in the smashed data to model training. CGC then groups the channels by their entropy and performs group-wise adaptive compression to shrink the transmission volume without compromising training accuracy. Extensive experiments across various datasets validate that our proposed SL-ACC framework takes considerably less time to reach a target accuracy than state-of-the-art benchmarks.
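
The abstract only names the two components, so the snippet below is a minimal sketch of the idea rather than the paper's implementation: per-channel Shannon entropy is estimated from a histogram of each channel's activations, channels are split into entropy-ordered groups, and each group is uniformly quantized at a different bit-width (low-entropy channels more aggressively). The function names (`channel_entropy`, `group_and_quantize`), the histogram bin count, the number of groups, and the bit-widths are all illustrative assumptions.

```python
import numpy as np

def channel_entropy(acts: np.ndarray, num_bins: int = 64) -> np.ndarray:
    """Estimate the Shannon entropy (in bits) of each channel of the
    smashed data, shaped (batch, channels, height, width)."""
    n_channels = acts.shape[1]
    entropies = np.empty(n_channels)
    for c in range(n_channels):
        hist, _ = np.histogram(acts[:, c].ravel(), bins=num_bins)
        p = hist / hist.sum()
        p = p[p > 0]  # drop empty bins to avoid log(0)
        entropies[c] = -(p * np.log2(p)).sum()
    return entropies

def group_and_quantize(acts: np.ndarray,
                       entropies: np.ndarray,
                       bit_widths=(2, 4, 8)) -> np.ndarray:
    """Group channels by ascending entropy and quantize each group at a
    different bit-width: low-entropy (less informative) channels are
    compressed more aggressively, high-entropy channels keep more bits.
    The bit-width schedule here is an assumed placeholder."""
    order = np.argsort(entropies)
    groups = np.array_split(order, len(bit_widths))
    compressed = np.empty_like(acts)
    for bits, channels in zip(bit_widths, groups):
        levels = 2 ** bits - 1
        block = acts[:, channels]
        lo, hi = block.min(), block.max()
        scale = (hi - lo) / levels if hi > lo else 1.0
        # Uniform quantize-dequantize to simulate the reduced payload.
        q = np.round((block - lo) / scale)
        compressed[:, channels] = q * scale + lo
    return compressed

# Example: 8 samples, 32 channels, 16x16 feature maps.
acts = np.random.randn(8, 32, 16, 16).astype(np.float32)
ent = channel_entropy(acts)
recon = group_and_quantize(acts, ent)
print("mean abs error after compression:", np.abs(acts - recon).mean())
```

In a real SL pipeline, the quantized integers (plus each group's scale and offset) would be what actually crosses the link; the dequantized tensor here just makes the accuracy impact easy to inspect.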

Country of Origin
🇭🇰 🇨🇳 Hong Kong, China

Page Count
6 pages

Category
Computer Science: Machine Learning (CS)