Differentially-Private Decentralized Learning in Heterogeneous Multicast Networks
By: Amir Ziaeddini, Yauhen Yakimenka, Jörg Kliewer
Potential Business Impact:
Keeps clients' private data safe while they train a shared model.
We propose a power-controlled, differentially private decentralized learning algorithm for a set of clients that collaboratively train a common learning model. The network is characterized by a row-stochastic adjacency matrix reflecting the differing channel gains between clients. In our privacy-preserving approach, both the transmit power for model updates and the level of injected Gaussian noise are jointly controlled to satisfy given privacy and energy budgets. We show that the proposed algorithm achieves a convergence rate of O(log T), where T is the time horizon in the regret bound. Our numerical results further confirm that the proposed algorithm outperforms existing approaches.
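To make the joint power/noise control concrete, here is a minimal sketch of one synchronous round in Python. It is not the paper's algorithm: the names (`decentralized_dp_step`, `gaussian_noise_std`, the mixing matrix `W`, the per-client `power` vector) are illustrative assumptions, and the noise is calibrated with the standard Gaussian-mechanism formula rather than the paper's joint power/noise optimization.

```python
import numpy as np

def gaussian_noise_std(sensitivity, epsilon, delta):
    # Standard Gaussian-mechanism calibration (an assumption, not
    # necessarily the paper's rule): sigma = sqrt(2 ln(1.25/delta)) * S / eps.
    return np.sqrt(2.0 * np.log(1.25 / delta)) * sensitivity / epsilon

def decentralized_dp_step(models, W, grads, lr, power, sensitivity,
                          epsilon, delta, rng):
    """One synchronous round: mix neighbor models with the row-stochastic
    matrix W, take a local gradient step, then scale by per-client transmit
    power and add Gaussian noise before broadcasting."""
    n, d = models.shape
    mixed = W @ models                       # consensus step (row-stochastic mixing)
    updated = mixed - lr * grads             # local gradient descent step
    sigma = gaussian_noise_std(sensitivity, epsilon, delta)
    noise = rng.normal(0.0, sigma, size=(n, d))
    # Single-knob power scaling plus additive noise; the paper trades these
    # off jointly against privacy and energy budgets.
    return power[:, None] * updated + noise
```

Here `models` is an n-by-d array of client parameters, `grads` the matching local gradients, and `rng` a `numpy.random.Generator`; looping this step over T rounds yields a plain decentralized DP-SGD baseline under these assumptions.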
Similar Papers
Decentralized Optimization with Amplified Privacy via Efficient Communication
Systems and Control
Keeps secret messages safe while learning.
Locally Differentially Private Graph Clustering via the Power Iteration Method
Data Structures and Algorithms
Groups online friends while keeping secrets safe.
Source Anonymity for Private Random Walk Decentralized Learning
Cryptography and Security
Keeps learning private by hiding who shares.