Mitigating Privacy-Utility Trade-off in Decentralized Federated Learning via $f$-Differential Privacy
By: Xiang Li, Buxin Su, Chendi Wang, and more
Potential Business Impact:
Lets users train models together while keeping each user's raw data private.
Differentially private (DP) decentralized Federated Learning (FL) allows local users to collaborate without sharing their data with a central server. However, accurately quantifying the privacy budget of private FL algorithms is challenging due to the co-existence of complex algorithmic components such as decentralized communication and local updates. This paper addresses privacy accounting for two decentralized FL algorithms within the $f$-differential privacy ($f$-DP) framework. We develop two new $f$-DP-based accounting methods tailored to decentralized settings: Pairwise Network $f$-DP (PN-$f$-DP), which quantifies privacy leakage between user pairs under random-walk communication, and Secret-based $f$-Local DP (Sec-$f$-LDP), which supports structured noise injection via shared secrets. By combining tools from $f$-DP theory and Markov chain concentration, our accounting framework captures privacy amplification arising from sparse communication, local iterations, and correlated noise. Experiments on synthetic and real datasets demonstrate that our methods yield consistently tighter $(\epsilon,\delta)$ bounds and improved utility compared to Rényi DP-based approaches, illustrating the benefits of $f$-DP in decentralized privacy accounting.
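To make the $f$-DP machinery in the abstract concrete, here is a minimal sketch of the two primitives it relies on for Gaussian noise: the trade-off function $G_\mu$ and the duality that converts a $\mu$-GDP guarantee into a tight $(\epsilon,\delta)$ bound (Dong, Roth, and Su, 2022). This is not the paper's PN-$f$-DP or Sec-$f$-LDP accountant; the function names and parameter values (`sigma`, `k`, `eps`) are illustrative assumptions.

```python
# A minimal illustrative sketch, not the paper's PN-f-DP / Sec-f-LDP
# accountants: basic f-DP machinery for the Gaussian mechanism.
# All parameter choices below (sigma, k, eps) are hypothetical.
from math import exp, sqrt
from statistics import NormalDist

_std = NormalDist()  # standard normal; Phi = _std.cdf

def gdp_tradeoff(alpha: float, mu: float) -> float:
    """Gaussian trade-off function G_mu(alpha) = Phi(Phi^{-1}(1 - alpha) - mu).

    Type II error of the optimal test between N(0,1) and N(mu,1)
    at type I error alpha; smaller values mean less privacy.
    """
    return _std.cdf(_std.inv_cdf(1.0 - alpha) - mu)

def gdp_to_delta(mu: float, eps: float) -> float:
    """Tight delta at a given eps for a mu-GDP mechanism (GDP duality):
    delta(eps) = Phi(-eps/mu + mu/2) - exp(eps) * Phi(-eps/mu - mu/2)."""
    return _std.cdf(-eps / mu + mu / 2.0) - exp(eps) * _std.cdf(-eps / mu - mu / 2.0)

if __name__ == "__main__":
    # Gaussian noise with sigma = 10 * sensitivity gives mu = 0.1 per step.
    mu_step, k = 0.1, 16
    # f-DP composes tightly: k Gaussian steps are exactly sqrt(k)*mu-GDP.
    mu_total = mu_step * sqrt(k)
    print(f"mu after {k} steps: {mu_total:.2f}")
    print(f"type II error at alpha=0.05: {gdp_tradeoff(0.05, mu_total):.4f}")
    print(f"delta at eps=2: {gdp_to_delta(mu_total, 2.0):.3e}")
```

The lossless $\sqrt{k}$ composition in the last step is the kind of gain the abstract attributes to $f$-DP accounting; the paper's contribution is extending such bounds to random-walk communication, local iterations, and correlated noise, which this sketch does not model.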
Similar Papers
Privacy-Preserving Decentralized Federated Learning via Explainable Adaptive Differential Privacy
Cryptography and Security
Keeps private data safe while learning.
Differentially private federated learning for localized control of infectious disease dynamics
Machine Learning (CS)
Helps predict disease spread without sharing private data.
Differential Privacy as a Perk: Federated Learning over Multiple-Access Fading Channels with a Multi-Antenna Base Station
Machine Learning (CS)
Keeps private data safe while learning.