Tight Privacy Audit in One Run
By: Zihang Xiang, Tianhao Wang, Hanshen Xiao, and more
Potential Business Impact:
Checks if private data stays private.
In this paper, we study the problem of privacy audit in one run and show that our method achieves tight audit results for various differentially private protocols. In particular, we obtain tight results for auditing $(\varepsilon,\delta)$-DP algorithms, which no previous work achieves under any parameter setup. We first formulate a refined framework for privacy audit \textit{in one run}, improving on previous work. Then, modeling privacy via the $f$-DP formulation, we study the implications of our framework and derive a theoretically justified lower bound for privacy audit. In experiments, we compare against previous work and show that our audit method outperforms the alternatives in auditing various differentially private algorithms. We also present experiments whose conclusions about the parameter settings for privacy audit in one run contrast with those of previous work.
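To make the one-run audit idea concrete, here is a minimal, hypothetical sketch (not the paper's method): insert a number of canary examples into a single training run, count how many membership guesses come out correct, and convert that count into a lower bound on $\varepsilon$. It uses the standard $(\varepsilon,\delta)$-DP bound on per-guess accuracy, $(e^{\varepsilon}+\delta)/(1+e^{\varepsilon})$, and heuristically treats guesses as independent; handling the dependence between canaries in one run is exactly what careful audit analyses (including this paper's) must address. The function name and grid range are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.stats import binom


def audit_epsilon_lower_bound(correct, total, delta=0.0, confidence=0.95):
    """Hypothetical one-run audit sketch.

    Given `correct` membership guesses out of `total` canaries from a single
    training run, return the largest epsilon on a grid that is rejected by the
    observed guessing accuracy at the given confidence level.

    Under (epsilon, delta)-DP, per-guess accuracy is at most
    (exp(eps) + delta) / (1 + exp(eps)); assuming (heuristically) independent
    guesses, the number of correct guesses is stochastically dominated by
    Binomial(total, p_max).
    """
    eps_grid = np.linspace(0.0, 10.0, 2001)
    lower_bound = 0.0
    for eps in eps_grid:
        p_max = (np.exp(eps) + delta) / (1.0 + np.exp(eps))
        # P(at least `correct` successes) if the true privacy level were eps;
        # if this tail probability is below 1 - confidence, eps is rejected.
        tail = binom.sf(correct - 1, total, p_max)
        if tail < 1.0 - confidence:
            lower_bound = eps
    return lower_bound


# Example: 900 correct guesses over 1000 canaries in one training run.
print(audit_epsilon_lower_bound(correct=900, total=1000, delta=1e-5))
```

The empirical lower bound produced this way can then be compared against the theoretical $\varepsilon$ claimed by the mechanism; a large gap indicates either a loose analysis or a weak attack, while a bound exceeding the claim signals a privacy violation.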
Similar Papers
Privacy Audit as Bits Transmission: (Im)possibilities for Audit by One Run
Cryptography and Security
Checks if computer programs keep secrets safe.
How Well Can Differential Privacy Be Audited in One Run?
Machine Learning (CS)
Makes computer privacy checks faster and more accurate.
Sequentially Auditing Differential Privacy
Cryptography and Security
Checks if private data stays secret faster.