Optimizing Noise Distributions for Differential Privacy
By: Atefeh Gilani, Juan Felipe Gomez, Shahab Asoodeh, and more
Potential Business Impact:
Protects private data better while sharing it.
We propose a unified optimization framework for designing continuous and discrete noise distributions that ensure differential privacy (DP) by minimizing Rényi DP, a variant of DP, under a cost constraint. Rényi DP has the advantage that, by considering different values of the Rényi parameter $\alpha$, we can tailor the optimization to any number of compositions. To solve the optimization problem, we reduce it to a finite-dimensional convex formulation and apply preconditioned gradient descent. The resulting noise distributions are then compared to their Gaussian and Laplace counterparts. Numerical results demonstrate that our optimized distributions consistently outperform Gaussian and Laplace distributions of the same variance, with significant improvements in $(\varepsilon, \delta)$-DP guarantees in the moderate composition regime.
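To make the Rényi DP quantities in the abstract concrete, here is a minimal sketch of how a Rényi DP guarantee for a given noise distribution composes over $k$ steps and converts to an $(\varepsilon, \delta)$-DP guarantee. It uses the Gaussian mechanism (the baseline the paper compares against), whose Rényi divergence has the standard closed form $D_\alpha = \alpha \Delta^2 / (2\sigma^2)$, together with the standard RDP-to-DP conversion $\varepsilon = \varepsilon_\alpha + \log(1/\delta)/(\alpha - 1)$; the function names are illustrative and not from the paper.

```python
import numpy as np

def renyi_divergence_gaussian(alpha, sigma, sensitivity=1.0):
    # Closed form for the Gaussian mechanism:
    # D_alpha( N(0, sigma^2) || N(sensitivity, sigma^2) ) = alpha * sensitivity^2 / (2 sigma^2)
    return alpha * sensitivity**2 / (2.0 * sigma**2)

def compose(rdp_eps, k):
    # Renyi DP composes additively over k mechanism invocations.
    return k * rdp_eps

def rdp_to_dp(rdp_eps, alpha, delta):
    # Standard conversion from an order-alpha RDP bound to (eps, delta)-DP.
    return rdp_eps + np.log(1.0 / delta) / (alpha - 1.0)

# Example: Gaussian noise with sigma = 2, composed k = 10 times, delta = 1e-5.
# Optimizing over alpha picks the order giving the tightest (eps, delta) guarantee.
sigma, delta, k = 2.0, 1e-5, 10
alphas = np.arange(2, 65)
eps = min(
    rdp_to_dp(compose(renyi_divergence_gaussian(a, sigma), k), a, delta)
    for a in alphas
)
```

In the paper's framework, the Gaussian closed form above would be replaced by the Rényi divergence of the optimized noise distribution (evaluated numerically), while the composition and conversion steps stay the same; this is exactly why tailoring the distribution per $\alpha$ translates into better $(\varepsilon, \delta)$ guarantees at a given composition count.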
Similar Papers
Spectral Graph Clustering under Differential Privacy: Balancing Privacy, Accuracy, and Efficiency
Information Theory
Keeps online connections private while still working.
Infinitely Divisible Noise for Differential Privacy: Nearly Optimal Error in the High $\varepsilon$ Regime
Cryptography and Security
Keeps private data safe when shared.
Nearly Optimal Differentially Private ReLU Regression
Machine Learning (CS)
Protects private data while learning from it.