Correlating Cross-Iteration Noise for DP-SGD using Model Curvature

Published: October 6, 2025 | arXiv ID: 2510.05416v1

By: Xin Gu, Yingtai Xiao, Guanlin He, and more

Potential Business Impact:

Narrows the accuracy gap of privacy-preserving AI training, making models more accurate while keeping training data private.

Business Areas:
Darknet Internet Services

Differentially private stochastic gradient descent (DP-SGD) offers the promise of training deep learning models while mitigating many privacy risks. However, there is currently a large accuracy gap between DP-SGD and normal SGD training. This has resulted in different lines of research investigating orthogonal ways of improving privacy-preserving training. One such line of work, known as DP-MF, correlates the privacy noise across different iterations of stochastic gradient descent -- allowing later iterations to cancel out some of the noise added to earlier iterations. In this paper, we study how to improve this noise correlation. We propose a technique called NoiseCurve that uses model curvature, estimated from public unlabeled data, to improve the quality of this cross-iteration noise correlation. Our experiments on various datasets, models, and privacy parameters show that the noise correlations computed by NoiseCurve offer consistent and significant improvements in accuracy over the correlation scheme used by DP-MF.
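The cancellation idea behind DP-MF can be illustrated with a toy numerical sketch. This is not the paper's NoiseCurve scheme (which uses curvature-informed correlations), and it ignores the sensitivity calibration a real DP mechanism requires; it only shows why anti-correlated noise accumulates far more slowly across iterations than independent noise:

```python
import numpy as np

rng = np.random.default_rng(0)
T, trials, sigma = 100, 20000, 1.0

# Independent noise (plain DP-SGD style): z_t ~ N(0, sigma^2).
# The noise in the running sum of updates grows like T * sigma^2.
z = rng.normal(0.0, sigma, size=(trials, T))
indep_prefix = z.cumsum(axis=1)

# Toy correlated noise: n_t = w_t - w_{t-1}, so each iteration partially
# cancels the noise injected at the previous one. The prefix sum
# telescopes to w_T - w_0, whose variance is 2 * sigma^2 regardless of T.
w = rng.normal(0.0, sigma, size=(trials, T + 1))
corr = w[:, 1:] - w[:, :-1]
corr_prefix = corr.cumsum(axis=1)

print(indep_prefix[:, -1].var())  # grows with T (roughly T * sigma^2)
print(corr_prefix[:, -1].var())   # stays near 2 * sigma^2
```

In practice, DP-MF chooses the correlation (a matrix factorization of the update workload) to balance this cancellation against the per-iteration sensitivity, and NoiseCurve refines that choice using curvature estimated from public unlabeled data.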

Country of Origin
πŸ‡ΊπŸ‡Έ United States

Page Count
17 pages

Category
Computer Science:
Machine Learning (CS)