Decentralized Optimization with Amplified Privacy via Efficient Communication
By: Wei Huo, Changxin Liu, Kemi Ding, and others
Potential Business Impact:
Keeps each agent's private data safe while computers learn together.
Decentralized optimization is crucial for multi-agent systems, where communication efficiency and privacy preservation are significant concerns. This paper explores the role of efficient communication in decentralized stochastic gradient descent algorithms for enhancing privacy preservation. We develop a novel algorithm that incorporates two key features: random agent activation and sparsified communication. Through the lens of differential privacy, we show that these features allow less noise to be injected for the same privacy level, thereby amplifying the privacy guarantee and improving accuracy. We further analyze the convergence of the proposed algorithm and its privacy-accuracy-communication trade-off. Finally, we present experimental results illustrating the effectiveness of our algorithm.
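To make the two mechanisms concrete, below is a minimal Python sketch of differentially private decentralized SGD with random agent activation and top-k sparsified communication. It is not the authors' algorithm: the ring topology, mixing weights, CHOCO-SGD-style consensus step, clipping bound, noise scale, and all hyperparameters are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n_agents, dim, n_data = 5, 20, 10
k = 5           # entries kept per message (sparsification level, assumed)
p_active = 0.5  # per-round activation probability (assumed)
sigma = 0.05    # std of the DP Gaussian noise (assumed)
clip = 1.0      # gradient clipping bound, for bounded sensitivity (assumed)
lr, gamma = 0.05, 0.5

# Ring topology with simple mixing weights (assumed).
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

# Synthetic local least-squares problems f_i(x) = 0.5 * ||A_i x - b_i||^2.
A = rng.normal(size=(n_agents, n_data, dim))
b = rng.normal(size=(n_agents, n_data))
x = rng.normal(size=(n_agents, dim))  # private local models
x_hat = np.zeros_like(x)              # public copies built from messages

def top_k(v, k):
    """Zero out all but the k largest-magnitude entries of v."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

for t in range(300):
    # Random activation: only a subset of agents computes and communicates.
    active = np.where(rng.random(n_agents) < p_active)[0]
    for i in active:
        # Local stochastic gradient on one sample, clipped so the Gaussian
        # mechanism has bounded sensitivity, then an SGD step.
        s = rng.integers(n_data)
        g = A[i, s] * (A[i, s] @ x[i] - b[i, s])
        g *= min(1.0, clip / (np.linalg.norm(g) + 1e-12))
        x[i] -= lr * g
        # Sparsified, noise-perturbed message: top-k of the gap between the
        # noisy private model and its public copy.
        noisy = x[i] + rng.normal(scale=sigma, size=dim)
        x_hat[i] += top_k(noisy - x_hat[i], k)
    # Consensus on the public copies (CHOCO-SGD-style step, assumed).
    for i in active:
        x[i] += gamma * sum(W[i, j] * (x_hat[j] - x_hat[i])
                            for j in range(n_agents))

print("consensus gap:", np.linalg.norm(x - x.mean(axis=0)))

The sketch highlights why the two features can amplify privacy: random activation means each agent's data is only touched in a fraction of rounds, and sparsification reveals only k coordinates per message, so less noise per round can achieve the same differential privacy level.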
Similar Papers
Local Differential Privacy for Distributed Stochastic Aggregative Optimization with Guaranteed Optimality
Systems and Control
Lets computers learn together privately and accurately.
Distributed Stochastic Zeroth-Order Optimization with Compressed Communication
Optimization and Control
Helps computers learn without seeing all the data.
Enhancing Privacy in Decentralized Min-Max Optimization: A Differentially Private Approach
Machine Learning (CS)
Protects private data in group learning.