Differential Privacy: Gradient Leakage Attacks in Federated Learning Environments
By: Miguel Fernandez-de-Retana, Unai Zulaika, Rubén Sánchez-Corcuera, and others
Potential Business Impact:
Protects private data when computers learn together.
Federated Learning (FL) allows Machine Learning models to be trained collaboratively without sharing sensitive data. However, it remains vulnerable to Gradient Leakage Attacks (GLAs), which can reveal private information from the shared model updates. In this work, we investigate the effectiveness of Differential Privacy (DP) mechanisms, specifically DP-SGD and a variant based on explicit regularization (PDP-SGD), as defenses against GLAs. To this end, we evaluate the performance of several computer vision models trained under varying privacy levels on a simple classification task, and then analyze the quality of private data reconstructions obtained from the intercepted gradients in a simulated FL environment. Our results demonstrate that DP-SGD significantly mitigates the risk of gradient leakage attacks, albeit with a moderate trade-off in model utility. In contrast, PDP-SGD maintains strong classification performance but proves ineffective as a practical defense against reconstruction attacks. These findings highlight the importance of empirically evaluating privacy mechanisms beyond their theoretical guarantees, particularly in distributed learning scenarios where information leakage may represent an unacceptable threat to data security and privacy.
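For readers unfamiliar with the defense the abstract evaluates, the core of DP-SGD is per-sample gradient clipping followed by calibrated Gaussian noise before the averaged update is shared. Below is a minimal NumPy sketch of one such update step; the function name and hyperparameter values are illustrative and not taken from the paper's implementation.

```python
import numpy as np

def dp_sgd_step(params, per_sample_grads, lr=0.1, clip_norm=1.0,
                noise_multiplier=1.1, rng=None):
    """One DP-SGD update (sketch): clip each per-sample gradient to
    L2 norm <= clip_norm, average, then add Gaussian noise whose scale
    is noise_multiplier * clip_norm / batch_size."""
    rng = np.random.default_rng(0) if rng is None else rng
    clipped = []
    for g in per_sample_grads:
        norm = np.linalg.norm(g)
        # Scale down (never up) so each sample's influence is bounded.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    mean_grad = np.mean(clipped, axis=0)
    noise = rng.normal(0.0,
                       noise_multiplier * clip_norm / len(per_sample_grads),
                       size=mean_grad.shape)
    # In FL, (mean_grad + noise) is what a client would share upstream;
    # the noise is what degrades gradient-leakage reconstructions.
    return params - lr * (mean_grad + noise)
```

With `noise_multiplier=0` the step reduces to plain clipped SGD, which makes the clipping behavior easy to verify in isolation; the noise term is what provides the formal privacy guarantee.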
Similar Papers
Privacy-Preserving Decentralized Federated Learning via Explainable Adaptive Differential Privacy
Cryptography and Security
Keeps private data safe while learning.
Mitigating Privacy-Utility Trade-off in Decentralized Federated Learning via $f$-Differential Privacy
Machine Learning (CS)
Keeps private data safe when learning together.
Towards Understanding Generalization in DP-GD: A Case Study in Training Two-Layer CNNs
Machine Learning (Stat)
Keeps private data safe while computers learn.