Differentially Private Learning of Exponential Distributions: Adaptive Algorithms and Tight Bounds
By: Bar Mahpud, Or Sheffet
Potential Business Impact:
Learns patterns from private data without revealing anyone's secrets.
We study the problem of learning exponential distributions under differential privacy. Given $n$ i.i.d. samples from $\mathrm{Exp}(\lambda)$, the goal is to privately estimate $\lambda$ so that the learned distribution is close in total variation distance to the truth. We present two complementary pure DP algorithms: one adapts the classical maximum likelihood estimator via clipping and Laplace noise, while the other leverages the fact that the $(1-1/e)$-quantile equals $1/\lambda$. Each method excels in a different regime, and we combine them into an adaptive best-of-both algorithm achieving near-optimal sample complexity for all $\lambda$. We further extend our approach to Pareto distributions via a logarithmic reduction, prove nearly matching lower bounds using packing and group privacy (Karwa and Vadhan, 2017), and show how approximate $(\epsilon,\delta)$-DP removes the need for externally supplied bounds. Together, these results give the first tight characterization of exponential distribution learning under DP and illustrate the power of adaptive strategies for heavy-tailed laws.
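A minimal Python sketch of the two routes the abstract describes, not the paper's exact algorithms: dp_mle_exponential is a clipped-mean-plus-Laplace-noise estimator, and dp_quantile_exp_mech is a Smith-style exponential-mechanism quantile, used here with the fact that the $(1-1/e)$-quantile of $\mathrm{Exp}(\lambda)$ equals $1/\lambda$. The function names, the data-independent bound upper, and the rank-distance utility are illustrative assumptions, not taken from the paper.

import numpy as np

def dp_mle_exponential(samples, eps, upper, rng=None):
    # Pure eps-DP take on the MLE: clip to [0, upper] so the empirical mean
    # has sensitivity upper/n, then add calibrated Laplace noise.
    rng = np.random.default_rng() if rng is None else rng
    n = len(samples)
    noisy_mean = np.clip(samples, 0.0, upper).mean() + rng.laplace(scale=upper / (n * eps))
    # The MLE for Exp(lambda) is 1/mean; guard against a non-positive noisy mean.
    return 1.0 / max(noisy_mean, 1e-12)

def dp_quantile_exp_mech(samples, q, eps, upper, rng=None):
    # Pure eps-DP q-quantile: exponential mechanism over the intervals between
    # sorted, clipped samples; an interval's utility is minus the distance of
    # its rank from the target rank (sensitivity 1), weighted by its length.
    rng = np.random.default_rng() if rng is None else rng
    n = len(samples)
    x = np.concatenate(([0.0], np.clip(np.sort(samples), 0.0, upper), [upper]))
    utilities = -np.abs(np.arange(n + 1) - int(q * n))
    with np.errstate(divide="ignore"):  # zero-length intervals get weight 0
        logw = np.log(np.diff(x)) + 0.5 * eps * utilities
    probs = np.exp(logw - logw.max())
    probs /= probs.sum()
    i = rng.choice(n + 1, p=probs)
    return rng.uniform(x[i], x[i + 1])

# Toy run (spends 2*eps total, since both estimators see the same data).
rng = np.random.default_rng(0)
lam, eps, upper = 2.0, 1.0, 10.0
data = rng.exponential(scale=1.0 / lam, size=5000)
lam_mle = dp_mle_exponential(data, eps, upper, rng)
lam_quant = 1.0 / dp_quantile_exp_mech(data, 1.0 - 1.0 / np.e, eps, upper, rng)
print(f"MLE route: {lam_mle:.3f}   quantile route: {lam_quant:.3f}")

Heuristically, the MLE route wins when the scale $1/\lambda$ is small next to the clipping bound, while the quantile route tolerates scale uncertainty better; the paper's adaptive best-of-both algorithm privately selects between the two regimes, a step omitted from this sketch.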
Similar Papers
High-Probability Bounds For Heterogeneous Local Differential Privacy
Machine Learning (Stat)
Protects your private info while still getting useful data.
Rényi Differential Privacy for Heavy-Tailed SDEs via Fractional Poincaré Inequalities
Machine Learning (Stat)
Makes private computer learning work better.
Privately Estimating Black-Box Statistics
Cryptography and Security
Protects private data when using unknown computer programs.