Differentially private ratio statistics
By: Tomer Shoham, Katrina Ligett
Potential Business Impact:
Protects private data when analyzing risks.
Ratio statistics--such as relative risk and odds ratios--play a central role in hypothesis testing, model evaluation, and decision-making across many areas of machine learning, including causal inference and fairness analysis. However, despite privacy concerns surrounding many datasets and despite increasing adoption of differential privacy, differentially private ratio statistics have largely been neglected by the literature and have only recently received an initial treatment by Lin et al. [1]. This paper attempts to fill this lacuna, giving results that can guide practice in evaluating ratios when the results must be protected by differential privacy. In particular, we show that even a simple algorithm can provide excellent properties concerning privacy, sample accuracy, and bias, not just asymptotically but also at quite small sample sizes. Additionally, we analyze a differentially private estimator for relative risk, prove its consistency, and develop a method for constructing valid confidence intervals. Our approach bridges a gap in the differential privacy literature and provides a practical solution for ratio estimation in private machine learning pipelines.
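The paper's exact mechanism is not spelled out in this abstract, but the kind of "simple algorithm" it describes can be illustrated with the standard Laplace mechanism: perturb the four underlying counts, then form the ratio from the noisy counts. The function name, the budget split of 2/epsilon per count (assuming the two exposure groups are disjoint, so a record affects at most two counts), and the clamping constant below are all illustrative assumptions, not the authors' method.

```python
import numpy as np

def dp_relative_risk(exposed_pos, exposed_n, unexposed_pos, unexposed_n,
                     epsilon, rng=None):
    """Illustrative differentially private relative-risk estimate.

    Each of the four counts gets independent Laplace noise with scale
    2/epsilon -- a simple budget split assuming the exposure groups are
    disjoint (parallel composition across groups; within a group, one
    record changes at most two counts by 1 each). Noisy counts are
    clamped to at least 0.5 so the ratio stays well defined.
    """
    rng = rng or np.random.default_rng()
    scale = 2.0 / epsilon
    noisy = [max(c + rng.laplace(0.0, scale), 0.5)
             for c in (exposed_pos, exposed_n, unexposed_pos, unexposed_n)]
    ep, en, up, un = noisy
    return (ep / en) / (up / un)
```

As epsilon grows, the estimate converges to the non-private relative risk (e.g., 30/100 exposed vs. 10/100 unexposed positives gives a true relative risk of 3.0); the abstract's consistency and confidence-interval results concern exactly this kind of noisy-ratio estimator.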
Similar Papers
Optimal Differentially Private Ranking from Pairwise Comparisons
Statistics Theory
Keeps your private choices secret when ranking things.
Statistical Privacy
Cryptography and Security
Protects your data even if hackers know how it's made.
Model Agnostic Differentially Private Causal Inference
Machine Learning (CS)
Lets us learn from private health data safely.