Source Anonymity for Private Random Walk Decentralized Learning
By: Maximilian Egger, Svenja Lage, Rawad Bitar, and more
Potential Business Impact:
Keeps learning private by hiding who shares.
This paper considers random walk-based decentralized learning, where at each iteration of the learning process, one user updates the model and sends it to a randomly chosen neighbor until a convergence criterion is met. Preserving data privacy is a central concern and open problem in decentralized learning. We propose a privacy-preserving algorithm based on public-key cryptography and anonymization. In this algorithm, the user updates the model and encrypts the result using a distant user's public key. The encrypted result is then transmitted through the network with the goal of reaching that specific user. The key idea is to hide the source's identity so that, when the destination user decrypts the result, it cannot tell who the source was. The challenge is to design a network-dependent probability distribution (at the source) over the potential destinations such that, from the receiver's perspective, all users are similarly likely to have been the source. We introduce the problem and construct a scheme with provable anonymity guarantees, focusing on random regular graphs to make the analysis rigorous.
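The core mechanism above — a source picking a distant destination according to a network-dependent distribution so that the decrypting user cannot infer who sent the update — can be sketched in a few lines of Python. This is a minimal illustration, not the paper's actual scheme: the graph, the `min_hops` threshold, and the uniform choice over far-away candidates are all simplifying assumptions standing in for the carefully designed distribution the paper constructs.

```python
import random
from collections import deque

def bfs_distances(adj, src):
    """Hop distances from src to every reachable node (breadth-first search)."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def destination_distribution(adj, source, min_hops=2):
    """Distribution over candidate destinations for the encrypted update.

    Candidates are users at least min_hops away, so the ciphertext travels
    far enough to blur the source's identity. Uniform weighting is a
    placeholder assumption; the paper designs a network-dependent
    distribution with provable anonymity guarantees.
    """
    dist = bfs_distances(adj, source)
    candidates = [u for u, d in dist.items() if d >= min_hops]
    return {u: 1.0 / len(candidates) for u in candidates}

# Toy 3-regular graph on 6 nodes (the prism graph), as an adjacency dict.
adj = {0: [1, 2, 3], 1: [0, 2, 4], 2: [0, 1, 5],
       3: [0, 4, 5], 4: [1, 3, 5], 5: [2, 3, 4]}

probs = destination_distribution(adj, source=0, min_hops=2)
dest = random.choices(list(probs), weights=list(probs.values()))[0]
# The source would now encrypt its model update with dest's public key
# and forward the ciphertext through the network toward dest.
```

In this toy graph, only nodes 4 and 5 are two or more hops from node 0, so each is chosen with probability 1/2; in the paper's setting on random regular graphs, the distribution is tuned so that, from the receiver's side, every user is a similarly plausible source.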
Similar Papers
Decentralized Optimization with Amplified Privacy via Efficient Communication
Systems and Control
Keeps secret messages safe while learning.
Differentially-Private Decentralized Learning in Heterogeneous Multicast Networks
Information Theory
Keeps your private data safe while learning.
Leveraging Randomness in Model and Data Partitioning for Privacy Amplification
Machine Learning (CS)
Keeps your private data safe when training computers.