Sequentially Auditing Differential Privacy
By: Tomás González, Mateo Dulce-Rubio, Aaditya Ramdas, and more
Potential Business Impact:
Checks whether private data stays secret, using far fewer samples.
We propose a practical sequential test for auditing the differential privacy guarantees of black-box mechanisms. The test processes a stream of a mechanism's outputs, providing anytime-valid inference while controlling the Type I error, and thereby overcomes the fixed-sample-size limitation of previous batch auditing methods. Experiments show the test detects violations with sample sizes that are orders of magnitude smaller than existing methods require, reducing this number from 50K to a few hundred examples across diverse, realistic mechanisms. Notably, it identifies DP-SGD privacy violations in under one training run, unlike prior methods that need full model training.
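To illustrate the anytime-valid idea behind such a sequential audit, the sketch below implements a generic betting-style test on a stream of binary attack outcomes (e.g., membership-inference successes). Under ε-DP, no attack can succeed with probability above e^ε/(1+e^ε); the wealth process is a nonnegative supermartingale under that null, so rejecting when wealth reaches 1/α controls the Type I error at α by Ville's inequality. The function name, the fixed bet size, and the Bernoulli simulation are illustrative assumptions, not the paper's actual procedure.

```python
import math
import random

def sequential_dp_audit(outcomes, epsilon, alpha=0.05, bet=0.5):
    """Anytime-valid test of H0: attack success rate <= e^eps / (1 + e^eps).

    `outcomes` is an iterable of 0/1 attack results observed one at a time.
    The wealth process is a nonnegative supermartingale under H0 (bet must
    lie in [0, 1/p0) to keep wealth nonnegative), so stopping the first time
    wealth >= 1/alpha gives a level-alpha sequential test (Ville's inequality).
    Returns the stopping time (sample index) on rejection, else None.
    """
    p0 = math.exp(epsilon) / (1.0 + math.exp(epsilon))  # null success rate
    wealth = 1.0
    for t, x in enumerate(outcomes, start=1):
        wealth *= 1.0 + bet * (x - p0)  # bet on the success rate exceeding p0
        if wealth >= 1.0 / alpha:
            return t  # reject H0: evidence of a privacy violation
    return None  # never rejected: no violation detected

# Hypothetical stream: an attack that succeeds 90% of the time against a
# mechanism claiming eps = 1 (null success rate ~0.731) -- a violation.
rng = random.Random(0)
stream = [1 if rng.random() < 0.9 else 0 for _ in range(2000)]
stop = sequential_dp_audit(stream, epsilon=1.0)
```

A violating mechanism like this is typically caught after a few dozen samples, while the test can simply keep monitoring indefinitely (without any α correction) when no violation is present.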
Similar Papers
Auditing Differential Privacy in the Black-Box Setting
Methodology
Protects private data when computers check it.
Monitoring Violations of Differential Privacy over Time
Cryptography and Security
Keeps private information safe as apps update.
Observational Auditing of Label Privacy
Machine Learning (CS)
Checks computer privacy without changing data.