Computation-aware Energy-harvesting Federated Learning: Cyclic Scheduling with Selective Participation
By: Eunjeong Jeong, Nikolaos Pappas
Potential Business Impact:
Saves phone battery through smarter training schedules.
Federated Learning (FL) is a powerful paradigm for distributed learning, but its increasing complexity leads to significant energy consumption from client-side training computations. This challenge is particularly critical in energy-harvesting FL (EHFL) systems, where each device's availability to participate fluctuates due to limited energy. To address this, we propose FedBacys, a battery-aware EHFL framework that schedules cyclic client participation based on users' battery levels. By clustering clients and scheduling the clusters sequentially, FedBacys minimizes redundant computation, reduces system-wide energy usage, and improves learning stability. We also introduce FedBacys-Odd, a more energy-efficient variant that allows clients to participate selectively, further reducing energy costs without compromising performance. We provide a convergence analysis for our framework and demonstrate its superior energy efficiency and robustness compared to existing algorithms through numerical experiments.
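The abstract describes the scheduling mechanism only at a high level; the sketch below illustrates the general idea of cyclic, battery-aware client selection with an optional selective-participation mode. The cluster count, participation threshold, and all function and variable names are hypothetical assumptions for illustration, not the authors' implementation, and battery dynamics (harvesting and depletion across rounds) are omitted for brevity.

```python
"""Minimal sketch of battery-aware cyclic scheduling with selective
participation. NUM_CLUSTERS, PARTICIPATION_THRESHOLD, and all names
are illustrative assumptions, not the paper's actual algorithm code."""
import random

NUM_CLUSTERS = 4               # hypothetical number of battery-level clusters
PARTICIPATION_THRESHOLD = 0.3  # hypothetical minimum battery to join a round


def cluster_by_battery(battery_levels, num_clusters=NUM_CLUSTERS):
    """Sort clients by battery level and split them into contiguous clusters."""
    order = sorted(range(len(battery_levels)), key=lambda i: battery_levels[i])
    size = -(-len(order) // num_clusters)  # ceiling division
    return [order[k:k + size] for k in range(0, len(order), size)]


def select_round_participants(round_idx, clusters, battery_levels,
                              selective=False):
    """Cyclic scheduling: one cluster is scheduled per round. In the
    selective variant, a scheduled client opts out of the round if its
    current battery level is below the participation threshold."""
    scheduled = clusters[round_idx % len(clusters)]
    if not selective:
        return list(scheduled)
    return [i for i in scheduled
            if battery_levels[i] >= PARTICIPATION_THRESHOLD]


# Toy usage: 12 clients with random battery levels, 8 training rounds.
random.seed(0)
batteries = [random.random() for _ in range(12)]
clusters = cluster_by_battery(batteries)
for t in range(8):
    participants = select_round_participants(t, clusters, batteries,
                                             selective=True)
    print(f"round {t}: cluster {t % len(clusters)} -> clients {participants}")
```

Scheduling one cluster at a time reflects the abstract's claim that sequential cluster participation avoids redundant computation, while the selective check mirrors the FedBacys-Odd idea of letting energy-poor clients skip a round they were scheduled for.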
Similar Papers
Battery-aware Cyclic Scheduling in Energy-harvesting Federated Learning
Machine Learning (CS)
Saves phone battery for smarter AI learning.
Feature-Based Semantics-Aware Scheduling for Energy-Harvesting Federated Learning
Machine Learning (CS)
Smartly trains AI on phones, saving energy.
Federated Learning within Global Energy Budget over Heterogeneous Edge Accelerators
Distributed, Parallel, and Cluster Computing
Trains AI smarter with less energy.