A Randomized Zeroth-Order Hierarchical Framework for Heterogeneous Federated Learning
By: Yuyang Qiu, Kibaek Kim, Farzad Yousefian
Potential Business Impact:
Helps computers learn better from different data.
Heterogeneity in federated learning (FL) is a critical and challenging aspect that significantly impacts model performance and convergence. In this paper, we propose a novel framework by formulating heterogeneous FL as a hierarchical optimization problem. This new framework captures both local and global training processes through a bilevel formulation and is capable of the following: (i) addressing client heterogeneity through a personalized learning framework; (ii) capturing the pre-training process on the server side; (iii) updating the global model through nonstandard aggregation; (iv) allowing for nonidentical local steps; and (v) capturing clients' local constraints. We design and analyze an implicit zeroth-order FL method (ZO-HFL), equipped with nonasymptotic convergence guarantees for both the server-agent and the individual client-agents, and asymptotic guarantees for both the server-agent and client-agents in an almost sure sense. Notably, our method does not rely on standard assumptions in heterogeneous FL, such as the bounded gradient dissimilarity condition. We implement our method on image classification tasks and compare it with other methods under different heterogeneous settings.
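To make the zeroth-order ingredient concrete, below is a minimal sketch of a randomized two-point zeroth-order gradient estimator of the kind such methods build on: it queries only function values along random Gaussian directions, never an explicit gradient. This is an illustration of the generic technique, not the paper's ZO-HFL algorithm; the function names, the smoothing parameter `mu`, and the number of sampled directions are assumptions chosen for the example.

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4, num_dirs=20, rng=None):
    """Randomized two-point zeroth-order gradient estimate of f at x.

    Averages directional finite differences (f(x + mu*u) - f(x)) / mu
    along random Gaussian directions u; mu is the smoothing parameter.
    Only function evaluations of f are used, no gradients.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    g = np.zeros(d)
    for _ in range(num_dirs):
        u = rng.standard_normal(d)
        g += (f(x + mu * u) - f(x)) / mu * u
    return g / num_dirs

# Toy usage: zeroth-order gradient descent on a simple quadratic,
# standing in for one client's local (derivative-free) update.
if __name__ == "__main__":
    f = lambda x: 0.5 * np.sum(x ** 2)   # smooth convex toy objective
    rng = np.random.default_rng(0)
    x = np.ones(5)
    for _ in range(200):
        x -= 0.1 * zo_gradient(f, x, num_dirs=20, rng=rng)
    print(np.linalg.norm(x))  # should be driven close to 0
```

Each step replaces the true gradient with a noisy but unbiased-in-expectation surrogate (up to an O(mu) smoothing bias), which is why such schemes admit convergence guarantees without requiring gradient access or gradient-dissimilarity bounds.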
Similar Papers
Client-Centric Federated Adaptive Optimization
Machine Learning (CS)
Helps computers learn together without sharing private data.
Data-Free Black-Box Federated Learning via Zeroth-Order Gradient Estimation
Machine Learning (CS)
Lets computers learn together without sharing secrets.
Client Selection in Federated Learning with Data Heterogeneity and Network Latencies
Machine Learning (CS)
Makes smart computers learn faster from different data.