Auditing Differential Privacy in the Black-Box Setting
By: Kaining Shi, Cong Ma
Potential Business Impact:
Lets organizations verify that systems claiming to protect private data actually do so.
This paper introduces a novel theoretical framework for auditing differential privacy (DP) in the black-box setting. Leveraging the concept of $f$-differential privacy, we explicitly define the type I and type II errors of an audit and propose an auditing mechanism based on conformal inference. Our approach robustly controls the type I error rate under minimal assumptions. Furthermore, we establish a fundamental impossibility result: without additional assumptions, no black-box auditor can simultaneously control both type I and type II errors. Nevertheless, under a monotone likelihood ratio (MLR) assumption, our auditing mechanism effectively controls both errors. We also extend our method to construct valid confidence bands for the trade-off function in the finite-sample regime.
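To make the idea concrete, below is a minimal sketch, not the authors' implementation, of a conformal-inference audit under an assumed black box: a noisy-sum Gaussian mechanism evaluated on two neighboring datasets. Conformal p-values computed against calibration draws from $M(D)$ are super-uniform under the null, which gives distribution-free type I error control; if the empirical type II error on draws from $M(D')$ falls below the claimed trade-off function, here $f(\alpha)=\Phi(\Phi^{-1}(1-\alpha)-\mu)$ for $\mu$-Gaussian DP, the claimed guarantee is implausible. All function names and parameters are hypothetical.

```python
# Illustrative sketch of a black-box f-DP audit via conformal inference.
# The mechanism, scores, and decision rule are assumptions for the example.
import numpy as np
from scipy.stats import norm

def mechanism(true_sum, sigma, rng, n):
    """Black box under audit: releases a noisy sum (sensitivity 1)."""
    return true_sum + rng.normal(0.0, sigma, size=n)

def gdp_trade_off(alpha, mu):
    """Trade-off function of mu-Gaussian DP: f(alpha) = Phi(Phi^{-1}(1 - alpha) - mu)."""
    return norm.cdf(norm.ppf(1.0 - alpha) - mu)

def conformal_pvalues(calibration, test):
    """Rank each test score among calibration scores. Under the null
    (test draw ~ calibration distribution) these p-values are super-uniform,
    which is what yields assumption-light type I error control."""
    cal = np.sort(calibration)
    n = cal.size
    # p = (1 + #{calibration scores >= t}) / (n + 1)
    return (1 + n - np.searchsorted(cal, test, side="left")) / (n + 1)

def audit_gdp(claimed_mu, sigma=1.0, alpha=0.05, n_cal=5000, n_test=5000, seed=0):
    """Audit a claimed mu-GDP guarantee on neighboring datasets with sums 0 and 1."""
    rng = np.random.default_rng(seed)
    cal = mechanism(0.0, sigma, rng, n_cal)    # draws from M(D)
    test = mechanism(1.0, sigma, rng, n_test)  # draws from M(D')
    # With the raw output as the score, this Gaussian location family
    # satisfies a monotone likelihood ratio (MLR) condition.
    pvals = conformal_pvalues(cal, test)
    beta_hat = np.mean(pvals > alpha)          # empirical type II error at level alpha
    bound = gdp_trade_off(alpha, claimed_mu)   # f-DP lower bound on type II error
    violated = beta_hat < bound                # a real audit would add a sampling margin
    return beta_hat, bound, violated

if __name__ == "__main__":
    # The true privacy level here is mu = 1/sigma = 1, so claiming mu = 0.5
    # overstates privacy and the audit should flag it.
    print(audit_gdp(claimed_mu=0.5))
```

In this toy setup the empirical type II error concentrates near $f_1(\alpha)\approx 0.74$ at $\alpha=0.05$, below the bound $f_{0.5}(\alpha)\approx 0.87$ implied by the overstated claim, so the violation is detected.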
Similar Papers
Sequentially Auditing Differential Privacy
Cryptography and Security
Checks faster whether private data stays secret.
Monitoring Violations of Differential Privacy over Time
Cryptography and Security
Keeps private information safe as apps update.
Enhancing One-run Privacy Auditing with Quantile Regression-Based Membership Inference
Machine Learning (CS)
Checks a system's privacy without needing many runs.