First Provable Guarantees for Practical Private FL: Beyond Restrictive Assumptions
By: Egor Shulgin, Grigory Malinovsky, Sarit Khirirat, and more
Federated Learning (FL) enables collaborative training on decentralized data. Differential privacy (DP) is crucial for FL, but current private methods often rely on unrealistic assumptions (e.g., bounded gradients or bounded heterogeneity), hindering practical application. Existing works that relax these assumptions typically neglect practical FL features, including multiple local updates and partial client participation. We introduce Fed-$\alpha$-NormEC, the first differentially private FL framework providing provable convergence and DP guarantees under standard assumptions while fully supporting these practical features. Fed-$\alpha$-NormEC integrates local updates (full and incremental gradient steps), separate server and client stepsizes, and, crucially, partial client participation, which is essential for real-world deployment and vital for privacy amplification. Our theoretical guarantees are corroborated by experiments on private deep learning tasks.
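Since the paper's pseudocode is not reproduced here, the sketch below is only a minimal illustration of what one round combining the abstract's ingredients could look like: partial client participation via subsampling, multiple local gradient steps with a client stepsize, normalized (hence bounded-sensitivity) client updates with Gaussian noise, and a separate server stepsize at aggregation. Every name and parameter (`fed_alpha_normec_round`, `sample_rate`, `sigma`, the placement of the $\alpha$ exponent) is an assumption made for illustration, and the error-compensation mechanism suggested by "EC" is omitted; this is not the paper's algorithm.

```python
import numpy as np


def fed_alpha_normec_round(global_w, grad_fn, clients, rng,
                           sample_rate=0.1, local_steps=4,
                           client_lr=0.1, server_lr=1.0,
                           alpha=1.0, sigma=0.5):
    """One hypothetical round: partial participation, local gradient
    steps, normalized client updates, Gaussian noise, server stepsize."""
    # Partial client participation via independent (Poisson) sampling;
    # subsampling is what enables privacy amplification.
    participants = [c for c in clients if rng.random() < sample_rate]
    if not participants:
        return global_w  # no client participated this round

    updates = []
    for c in participants:
        w = global_w.copy()
        for _ in range(local_steps):  # local (full-gradient) steps
            w = w - client_lr * grad_fn(c, w)
        delta = w - global_w
        # Normalizing the update bounds each client's contribution
        # (L2 sensitivity 1 when alpha = 1) without assuming bounded
        # gradients. The exact role of alpha here is a guess.
        updates.append(delta / max(np.linalg.norm(delta), 1e-12) ** alpha)

    # After normalization, adding or removing one client changes the sum
    # by at most norm 1, so Gaussian noise of scale sigma on the sum is
    # the standard DP mechanism; the server then applies its own stepsize.
    noisy_sum = sum(updates) + sigma * rng.standard_normal(global_w.shape)
    return global_w + server_lr * noisy_sum / len(participants)


if __name__ == "__main__":
    # Toy check on quadratic losses f_c(w) = 0.5 * ||w - t_c||^2,
    # so grad f_c(w) = w - t_c and the optimum is the mean of the t_c.
    rng = np.random.default_rng(0)
    targets = {c: rng.standard_normal(5) for c in range(20)}
    w = np.zeros(5)
    for _ in range(200):
        w = fed_alpha_normec_round(
            w, lambda c, v: v - targets[c], list(range(20)), rng)
    print(w)  # drifts toward the mean of the targets (noisily, due to DP)
```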
Similar Papers
Mitigating Privacy-Utility Trade-off in Decentralized Federated Learning via $f$-Differential Privacy
Machine Learning (CS)
Keeps private data safe when learning together.
Inclusive, Differentially Private Federated Learning for Clinical Data
Machine Learning (CS)
Helps hospitals share patient data safely for better health.
An Interactive Framework for Implementing Privacy-Preserving Federated Learning: Experiments on Large Language Models
Machine Learning (CS)
Protects private data while training smart computer programs.