Federated Smoothing ADMM for Localization
By: Reza Mirzaeifard, Ashkan Moradi, Masahiro Yukawa, and more
Potential Business Impact:
Helps networks of devices pinpoint locations accurately, even when some measurements are bad.
This paper addresses the challenge of localization in federated settings, which are characterized by distributed data, non-convexity, and non-smoothness. To tackle the scalability and outlier issues inherent in such environments, we propose a robust algorithm that employs an $\ell_1$-norm formulation within a novel federated ADMM framework. The approach integrates an iterative smooth approximation of the total-variation consensus term and a Moreau envelope approximation of the convex function that appears in subtracted form. This transformation makes the subproblem at each iteration smooth and weakly convex, improving both computational efficiency and estimation accuracy. The proposed algorithm supports asynchronous updates and multiple client updates per iteration, making it adaptable to real-world federated systems. To validate its reliability, we show that the method converges to a stationary point, and numerical simulations highlight its superior convergence speed and outlier resilience compared to existing state-of-the-art localization methods.
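As a rough illustration of the two smoothing devices named in the abstract, here is a minimal NumPy sketch, not the paper's actual algorithm: `smooth_abs` is a pseudo-Huber surrogate for the $\ell_1$ total-variation consensus term, and `moreau_envelope_abs` evaluates the Moreau envelope $e_{\lambda h}(t) = \min_y \{ h(y) + \tfrac{1}{2\lambda}(t-y)^2 \}$ of $h(t) = |t|$ via its closed-form proximal operator (soft-thresholding). The function names and the parameters `mu` and `lam` are illustrative assumptions, not from the paper.

```python
import numpy as np

def smooth_abs(t, mu=0.1):
    """Pseudo-Huber surrogate for |t|: sqrt(t^2 + mu^2) - mu.
    Smooth everywhere; approaches |t| as mu -> 0, so shrinking mu
    over iterations tightens the surrogate toward the exact l1 term."""
    return np.sqrt(t**2 + mu**2) - mu

def moreau_envelope_abs(t, lam=0.5):
    """Moreau envelope of h(t) = |t| with parameter lam.
    Uses the closed-form prox of |.| (soft-thresholding):
        prox_{lam*h}(t) = sign(t) * max(|t| - lam, 0)
    Envelope value: h(prox) + (t - prox)^2 / (2*lam), which is the
    Huber function -- a smooth lower approximation of |t|."""
    prox = np.sign(t) * np.maximum(np.abs(t) - lam, 0.0)
    return np.abs(prox) + (t - prox) ** 2 / (2.0 * lam)

# Smoothed consensus penalty between a local estimate x_i and the
# global variable z (both hypothetical values for illustration).
x_i = np.array([1.3, -0.2])
z = np.array([1.0, 0.1])
print(smooth_abs(x_i - z, mu=0.05).sum())   # smooth surrogate of ||x_i - z||_1
print(moreau_envelope_abs(x_i - z).sum())   # Moreau-envelope (Huber) value
```

Both surrogates are differentiable, which is the property the abstract leans on: replacing the non-smooth $\ell_1$ pieces with smooth approximations keeps each ADMM subproblem smooth and weakly convex.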
Similar Papers
Smoothing ADMM for Non-convex and Non-smooth Hierarchical Federated Learning
Machine Learning (CS)
Trains AI smarter and faster with different data.
A Linearized Alternating Direction Multiplier Method for Federated Matrix Completion Problems
Machine Learning (CS)
Helps apps guess what you like without seeing your data.
Federated ADMM from Bayesian Duality
Machine Learning (CS)
Makes AI learn better from many computers.