HASFL: Heterogeneity-aware Split Federated Learning over Edge Computing Systems
By: Zheng Lin, Zhe Chen, Xianhao Chen, and more
Potential Business Impact:
Helps slow phones learn faster together.
Split federated learning (SFL) has emerged as a promising paradigm to democratize machine learning (ML) on edge devices by enabling layer-wise model partitioning. However, existing SFL approaches suffer significantly from the straggler effect due to the heterogeneous capabilities of edge devices. To address this fundamental challenge, we propose adaptively controlling batch sizes (BSs) and model splitting (MS) for edge devices to overcome resource heterogeneity. We first derive a tight convergence bound for SFL that quantifies the impact of varied BSs and MS on learning performance. Based on this convergence bound, we propose HASFL, a heterogeneity-aware SFL framework capable of adaptively controlling BS and MS to balance communication-computing latency and training convergence in heterogeneous edge networks. Extensive experiments with various datasets validate the effectiveness of HASFL and demonstrate its superiority over state-of-the-art benchmarks.
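To make the core idea concrete, here is a minimal sketch (not the authors' implementation) of heterogeneity-aware batch-size and split-point selection in the spirit of HASFL. The `Device` class, the `LAYER_COSTS` table, the additive compute-plus-upload latency model, and the `deadline_s` parameter are all illustrative assumptions, not quantities from the paper.

```python
# Sketch: pick, per device, the model split point and the largest batch size
# that keep the device's round latency under a shared deadline, so fast and
# slow devices finish together (mitigating stragglers). All numbers below
# are hypothetical.

from dataclasses import dataclass


@dataclass
class Device:
    name: str
    flops_per_sec: float   # client-side compute throughput (assumed)
    uplink_bps: float      # bandwidth to the edge server (assumed)


# Assumed per-layer costs for a small model: (client FLOPs per sample up to
# and including this layer, activation size in bits per sample at the cut).
LAYER_COSTS = [
    (1e6, 4e5),   # cut after layer 1: cheap compute, large activations
    (5e6, 1e5),   # cut after layer 2
    (2e7, 2e4),   # cut after layer 3: more compute, small activations
]


def per_round_latency(dev: Device, cut: int, batch_size: int) -> float:
    """Estimate one round's client latency: run the client-side sub-model,
    then upload the smashed data (activations) at the cut."""
    flops, act_bits = LAYER_COSTS[cut]
    compute_s = batch_size * flops / dev.flops_per_sec
    comm_s = batch_size * act_bits / dev.uplink_bps
    return compute_s + comm_s


def choose_config(dev: Device, deadline_s: float = 1.0) -> tuple[int, int]:
    """Return (split point, batch size): the cut that admits the largest
    batch size while keeping round latency under the deadline."""
    best = (0, 1)
    for cut in range(len(LAYER_COSTS)):
        bs = 1
        while per_round_latency(dev, cut, bs + 1) <= deadline_s:
            bs += 1
        if bs > best[1]:
            best = (cut, bs)
    return best


devices = [
    Device("fast_phone", flops_per_sec=5e9, uplink_bps=2e7),
    Device("slow_phone", flops_per_sec=5e8, uplink_bps=5e6),
]
for d in devices:
    cut, bs = choose_config(d)
    print(f"{d.name}: split after layer {cut + 1}, batch size {bs}")
```

Running this sketch, the compute-rich device ends up with a deep cut and a large batch, while the slower device takes a shallower cut and a smaller batch; the paper's actual controller additionally uses its derived convergence bound to weigh these choices against training convergence, not just per-round latency.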
Similar Papers
Accelerating Wireless Distributed Learning via Hybrid Split and Federated Learning Optimization
Machine Learning (CS)
Makes smart devices learn faster together.
Enhancing Split Learning with Sharded and Blockchain-Enabled SplitFed Approaches
Distributed, Parallel, and Cluster Computing
Makes AI learn safely from many computers.
Resource-Aware Aggregation and Sparsification in Heterogeneous Ensemble Federated Learning
Machine Learning (CS)
Helps many computers train together without sharing secrets.