DS FedProxGrad: Asymptotic Stationarity Without Noise Floor in Fair Federated Learning
By: Huzaifa Arif
Recent work \cite{arifgroup} introduced Federated Proximal Gradient \textbf{(\texttt{FedProxGrad})} for solving non-convex composite optimization problems in group-fair federated learning. However, the original analysis established convergence only to a \textit{noise-dominated neighborhood of stationarity}, with an explicit variance-induced noise floor. In this work, we provide an improved asymptotic convergence analysis for a generalized \texttt{FedProxGrad}-type framework with inexact local proximal solutions and explicit fairness regularization, which we call \textbf{DS \texttt{FedProxGrad}} (Decay Step Size \texttt{FedProxGrad}). Under a Robbins-Monro step-size schedule \cite{robbins1951stochastic} and a mild decay condition on the local inexactness, we prove that $\liminf_{r\to\infty} \mathbb{E}[\|\nabla F(\mathbf{x}^r)\|^2] = 0$; that is, the algorithm is asymptotically stationary, and its convergence is not limited by a variance-induced noise floor.
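For concreteness, the step-size and inexactness assumptions can be written out as follows. The Robbins-Monro conditions on the step sizes $\eta_r$ are the standard ones \cite{robbins1951stochastic}; the summable form of the inexactness decay shown for $\varepsilon_r$ is one illustrative choice, not necessarily the exact condition used in the full analysis.
\[
  % Standard Robbins-Monro step-size conditions, plus an
  % illustrative (assumed) decay condition on the local
  % proximal inexactness \varepsilon_r.
  \sum_{r=0}^{\infty} \eta_r = \infty,
  \qquad
  \sum_{r=0}^{\infty} \eta_r^2 < \infty,
  \qquad
  \varepsilon_r \to 0
  \ \ \text{(e.g., } \textstyle\sum_{r=0}^{\infty} \eta_r \varepsilon_r < \infty \text{)},
\]
under which the stated guarantee takes the form
\[
  \liminf_{r\to\infty} \mathbb{E}\big[\|\nabla F(\mathbf{x}^r)\|^2\big] = 0 .
\]
A canonical schedule satisfying both step-size conditions is $\eta_r = \eta_0/(r+1)$ for some $\eta_0 > 0$.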