Adaptive Deadline and Batch Layered Synchronized Federated Learning
By: Asaf Goren, Natalie Lang, Nir Shlezinger, and more
Potential Business Impact:
Trains AI faster on many phones without sharing data.
Federated learning (FL) enables collaborative model training across distributed edge devices while preserving data privacy, and typically operates in a round-based synchronous manner. However, synchronous FL suffers from latency bottlenecks due to device heterogeneity, where slower clients (stragglers) delay or degrade global updates. Prior solutions, such as fixed deadlines, client selection, and layer-wise partial aggregation, alleviate the effect of stragglers but treat round timing and local workload as static parameters, limiting their effectiveness under strict time constraints. We propose ADEL-FL, a novel framework that jointly optimizes per-round deadlines and user-specific batch sizes for layer-wise aggregation. Our approach formulates a constrained optimization problem minimizing the expected L2 distance to the global optimum, subject to constraints on the total training time and the number of global rounds. We provide a convergence analysis under exponential compute models and prove that ADEL-FL yields unbiased updates with bounded variance. Extensive experiments demonstrate that ADEL-FL outperforms alternative methods in both convergence rate and final accuracy under heterogeneous conditions.
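The abstract does not spell out the optimization itself, but the core coupling it describes, that a per-round deadline determines how large a batch each heterogeneous client can safely process, can be illustrated with a minimal sketch. The sketch below assumes a simplified exponential compute model in which a client's time to process a batch of b samples is exponential with mean b times its per-sample compute time; the function name, the completion-probability target, and all timing numbers are hypothetical and are not taken from the paper, whose actual ADEL-FL objective is richer.

```python
import math

def batch_sizes_for_deadline(mean_sample_times, deadline, p_complete=0.9, b_max=256):
    """Illustrative only: choose, per client, the largest batch size that
    finishes within `deadline` with probability >= p_complete, assuming the
    batch compute time is exponential with mean b * tau (tau = mean seconds
    per sample). This is a simplification, not the ADEL-FL algorithm."""
    sizes = []
    for tau in mean_sample_times:
        # P(T <= d) = 1 - exp(-d / (b * tau)) >= p  =>  b <= d / (tau * ln(1/(1-p)))
        b = int(deadline / (tau * math.log(1.0 / (1.0 - p_complete))))
        sizes.append(max(1, min(b, b_max)))  # clamp to a feasible batch range
    return sizes

# Hypothetical setup: a total time budget split evenly into per-round
# deadlines, with four clients of heterogeneous per-sample compute times.
total_time, rounds = 600.0, 100           # seconds, global rounds
deadline = total_time / rounds            # uniform per-round deadline
taus = [0.002, 0.005, 0.010, 0.020]       # mean seconds per sample, per client
print(batch_sizes_for_deadline(taus, deadline))  # e.g. [256, 256, 256, 130]
```

Even this crude rule shows the trade-off the paper targets: tightening the deadline shrinks the stragglers' batch sizes (and hence their gradient quality), while loosening it wastes the fast clients' idle time, which is why ADEL-FL optimizes the two jointly rather than fixing either in advance.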
Similar Papers
Optimal Batch-Size Control for Low-Latency Federated Learning with Device Heterogeneity
Machine Learning (CS)
Makes smart devices learn faster, privately.
Adaptive Biased User Scheduling for Heterogeneous Wireless Federated Learning Network
Systems and Control
Trains computers faster even with slow connections.