HASFL: Heterogeneity-aware Split Federated Learning over Edge Computing Systems

Published: June 10, 2025 | arXiv ID: 2506.08426v1

By: Zheng Lin, Zhe Chen, Xianhao Chen, and more

Potential Business Impact:

Enables edge devices with very different compute and network capabilities to train machine learning models together without the slowest devices holding back the whole system.

Business Areas:
Big Data; Data and Analytics

Split federated learning (SFL) has emerged as a promising paradigm to democratize machine learning (ML) on edge devices by enabling layer-wise model partitioning. However, existing SFL approaches suffer significantly from the straggler effect due to the heterogeneous capabilities of edge devices. To address the fundamental challenge, we propose adaptively controlling batch sizes (BSs) and model splitting (MS) for edge devices to overcome resource heterogeneity. We first derive a tight convergence bound of SFL that quantifies the impact of varied BSs and MS on learning performance. Based on the convergence bound, we propose HASFL, a heterogeneity-aware SFL framework capable of adaptively controlling BS and MS to balance communication-computing latency and training convergence in heterogeneous edge networks. Extensive experiments with various datasets validate the effectiveness of HASFL and demonstrate its superiority over state-of-the-art benchmarks.
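To make the idea concrete, here is a minimal sketch (not the paper's actual algorithm, which is driven by a derived convergence bound) of how a heterogeneity-aware controller might jointly pick a model split point and a batch size per device: estimate the per-batch latency as on-device compute time plus activation-upload time, and choose the pair that minimizes it. All function and parameter names here are illustrative assumptions.

```python
# Hypothetical sketch of heterogeneity-aware split/batch selection for SFL.
# The real HASFL framework also accounts for training convergence via a
# derived bound; this toy version only minimizes estimated latency.

def choose_split_and_batch(layer_flops, activation_sizes, device_flops,
                           bandwidth, candidate_batch_sizes):
    """Return (split_layer, batch_size) minimizing estimated per-batch latency.

    layer_flops[i]      -- per-sample FLOPs of layer i (assumed profiled)
    activation_sizes[i] -- bytes of the activation emitted by layer i
    device_flops        -- device compute throughput (FLOPs/s)
    bandwidth           -- device uplink bandwidth (bytes/s)
    """
    best = None
    for bs in candidate_batch_sizes:
        for split in range(1, len(layer_flops) + 1):
            # The device computes layers [0, split); the server runs the rest.
            compute_t = bs * sum(layer_flops[:split]) / device_flops
            comm_t = bs * activation_sizes[split - 1] / bandwidth
            latency = compute_t + comm_t
            if best is None or latency < best[0]:
                best = (latency, split, bs)
    _, split, bs = best
    return split, bs


# A weak device (low FLOPs, low bandwidth) tends to get an earlier split
# and smaller batch than a strong one, mitigating the straggler effect.
split, bs = choose_split_and_batch(
    layer_flops=[1e6, 1e6, 1e6, 1e6],
    activation_sizes=[4e4, 2e4, 1e4, 5e3],
    device_flops=1e9,
    bandwidth=1e6,
    candidate_batch_sizes=[8, 16, 32],
)
```

In this toy setting, deeper splits shrink the activation that must be uploaded, so the cheapest choice keeps more layers on-device; a real controller would trade this off against the convergence penalty of small or uneven batch sizes, which is exactly the balance HASFL's convergence bound is designed to capture.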

Country of Origin
🇭🇰 🇨🇳 Hong Kong, China

Page Count
16 pages

Category
Computer Science:
Machine Learning (CS)