GeoClip: Geometry-Aware Clipping for Differentially Private SGD
By: Atefeh Gilani, Naima Tasnim, Lalitha Sankar, and more
Potential Business Impact:
Makes private AI smarter by understanding data shapes.
Differentially private stochastic gradient descent (DP-SGD) is the most widely used method for training machine learning models with provable privacy guarantees. A key challenge in DP-SGD is setting the per-sample gradient clipping threshold, which significantly affects the trade-off between privacy and utility. While recent adaptive methods improve performance by adjusting this threshold during training, they operate in the standard coordinate system and fail to account for correlations across the coordinates of the gradient. We propose GeoClip, a geometry-aware framework that clips and perturbs gradients in a transformed basis aligned with the geometry of the gradient distribution. GeoClip adaptively estimates this transformation using only previously released noisy gradients, incurring no additional privacy cost. We provide convergence guarantees for GeoClip and derive a closed-form solution for the optimal transformation that minimizes the amount of noise added while keeping the probability of gradient clipping under control. Experiments on both tabular and image datasets demonstrate that GeoClip consistently outperforms existing adaptive clipping methods under the same privacy budget.
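The core idea described above, clipping and perturbing each gradient in a transformed basis estimated from previously released noisy gradients, can be sketched as follows. This is a minimal illustrative implementation, not the paper's actual algorithm: the covariance-based whitening transform, the clipping threshold, and the noise scale used here are simplifying assumptions, and the optimal transformation derived in the paper is not reproduced.

```python
import numpy as np

def geoclip_step(grad, past_noisy_grads, clip_norm=1.0, noise_mult=1.0, eps=1e-6):
    """One geometry-aware clip-and-perturb step (illustrative sketch).

    Estimates a whitening transform from previously released *noisy*
    gradients (so the estimation itself costs no extra privacy), clips
    the current gradient in that basis, adds Gaussian noise there, and
    maps the result back to the original coordinates.
    """
    # Covariance of past noisy gradients; eps regularizes the spectrum
    # so the transform is well-defined (hypothetical estimator choice).
    cov = np.cov(np.stack(past_noisy_grads), rowvar=False) + eps * np.eye(grad.size)
    # Whitening transform T = Lambda^{-1/2} U^T from the eigendecomposition,
    # aligning the basis with the estimated gradient geometry.
    vals, vecs = np.linalg.eigh(cov)
    T = np.diag(1.0 / np.sqrt(vals)) @ vecs.T
    T_inv = vecs @ np.diag(np.sqrt(vals))
    # Clip in the transformed basis, where the sensitivity bound is enforced.
    g_t = T @ grad
    g_t = g_t * min(1.0, clip_norm / (np.linalg.norm(g_t) + 1e-12))
    # Gaussian mechanism applied in the transformed space.
    g_t = g_t + np.random.normal(0.0, noise_mult * clip_norm, size=g_t.shape)
    # Map the noisy gradient back to the original coordinate system.
    return T_inv @ g_t
```

Because clipping happens after whitening, directions with small estimated variance are amplified before the norm bound is applied, so the added noise is shaped to the gradient distribution rather than being isotropic in the original coordinates.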
Similar Papers
Differentially Private Clipped-SGD: High-Probability Convergence with Arbitrary Clipping Level
Machine Learning (CS)
Makes AI learn better with privacy.
Technical Report: Full Version of Analyzing and Optimizing Perturbation of DP-SGD Geometrically
Machine Learning (CS)
Makes private data training more accurate.
Mitigating Disparate Impact of Differentially Private Learning through Bounded Adaptive Clipping
Machine Learning (CS)
Protects privacy without hurting fairness for all.