Correlating Cross-Iteration Noise for DP-SGD using Model Curvature
By: Xin Gu, Yingtai Xiao, Guanlin He, et al.
Potential Business Impact:
Narrows the accuracy gap between privacy-preserving and standard model training.
Differentially private stochastic gradient descent (DP-SGD) offers the promise of training deep learning models while mitigating many privacy risks. However, there is currently a large accuracy gap between DP-SGD and non-private SGD training. This has resulted in different lines of research investigating orthogonal ways of improving privacy-preserving training. One such line of work, known as DP-MF, correlates the privacy noise across different iterations of stochastic gradient descent, allowing later iterations to cancel out some of the noise added to earlier iterations. In this paper, we study how to improve this noise correlation. We propose a technique called NoiseCurve that uses model curvature, estimated from public unlabeled data, to improve the quality of this cross-iteration noise correlation. Our experiments on various datasets, models, and privacy parameters show that the noise correlations computed by NoiseCurve offer consistent and significant improvements in accuracy over the correlation scheme used by DP-MF.
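The cross-iteration cancellation that DP-MF exploits can be illustrated with the matrix-factorization mechanism on a prefix-sum workload (what constant-step-size SGD effectively releases). The sketch below is a minimal illustration of that general idea, not the paper's NoiseCurve estimator; the iteration count T, noise multiplier sigma, and the square-root factorization of the workload are illustrative assumptions.

```python
import numpy as np

T = 64          # number of SGD iterations (illustrative)
sigma = 1.0     # base noise multiplier (illustrative)

# Workload A: row t releases the sum of gradients 0..t (prefix sums).
A = np.tril(np.ones((T, T)))

# Factor A = B @ C with B = C = a "square root" of A: a lower-triangular
# Toeplitz matrix built from the Taylor coefficients of (1 - x)^(-1/2).
c = np.ones(T)
for k in range(1, T):
    c[k] = c[k - 1] * (2 * k - 1) / (2 * k)
B = np.array([[c[i - j] if i >= j else 0.0 for j in range(T)]
              for i in range(T)])
C = B  # lower-triangular Toeplitz product reproduces A exactly

# Per-step sensitivity of the noised quantity C @ g is the largest
# column norm of C; the mechanism releases A @ g + B @ z with
# z ~ N(0, (sens * sigma)^2 I), so the noise injected at step t is
# correlated (via B) with noise from earlier steps and partially cancels.
sens_corr = np.linalg.norm(C, axis=0).max()

# Expected squared noise across all T prefix sums:
#   independent per-step noise (B = A, C = I, sens = 1): sigma^2 ||A||_F^2
#   correlated noise (square-root factorization): (sens*sigma)^2 ||B||_F^2
err_ind = sigma**2 * np.linalg.norm(A, "fro") ** 2
err_corr = (sens_corr * sigma) ** 2 * np.linalg.norm(B, "fro") ** 2
print(f"independent noise error: {err_ind:.0f}")
print(f"correlated noise error:  {err_corr:.0f}")
```

For T = 64 the correlated factorization reduces the total prefix-sum noise by a large constant factor at the same privacy level; NoiseCurve's contribution, per the abstract, is to choose the correlation using curvature estimated from public unlabeled data rather than a fixed factorization like the one above.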
Similar Papers
Correlated Noise Mechanisms for Differentially Private Learning
Machine Learning (CS)
Makes AI learn without seeing private data.
Optimizing Privacy-Utility Trade-off in Decentralized Learning with Generalized Correlated Noise
Machine Learning (CS)
Keeps private data safe while learning together.
Technical Report: Full Version of Analyzing and Optimizing Perturbation of DP-SGD Geometrically
Machine Learning (CS)
Makes private data training more accurate.