Context-aware Privacy Bounds for Linear Queries
By: Heng Zhao, Sara Saeidian, Tobias J. Oechtering
Potential Business Impact:
Makes private data sharing safer while adding less noise.
Linear queries, which underpin a broad range of analysis tasks, are often released through privacy mechanisms based on differential privacy (DP), the most popular framework for privacy protection. However, DP adopts a context-free definition that operates independently of the data-generating distribution. In this paper, we revisit the privacy analysis of the Laplace mechanism through the lens of pointwise maximal leakage (PML). We show that the distribution-agnostic definition of DP often mandates excessive noise. To address this, we incorporate an assumption about the prior distribution by lower-bounding the probability that any single record belongs to any specific class. Under this assumption, we derive a tight, context-aware leakage bound for general linear queries, prove that it is strictly tighter than the standard DP guarantee, and show that it converges to the DP guarantee as the probability lower bound approaches zero. Numerical evaluations demonstrate that exploiting this prior knowledge reduces the required noise scale while maintaining the same privacy guarantee.
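To make the baseline concrete, below is a minimal sketch of the standard, context-free Laplace mechanism for a linear query q(x) = Ax, the mechanism whose noise calibration the paper's PML analysis tightens. This is not the authors' code and does not reproduce their context-aware bound; the function name, the add/remove-one-record neighboring assumption, and the example query matrix are illustrative assumptions.

import numpy as np

def laplace_mechanism(x: np.ndarray, A: np.ndarray, epsilon: float,
                      rng: np.random.Generator) -> np.ndarray:
    """Release A @ x with Laplace noise calibrated for epsilon-DP.

    Assumes neighboring databases differ by adding/removing one record,
    so one coordinate of x changes by 1 and the l1-sensitivity of the
    query is the largest column l1-norm of A.
    """
    sensitivity = np.abs(A).sum(axis=0).max()   # max column l1-norm of A
    scale = sensitivity / epsilon               # context-free DP calibration
    return A @ x + rng.laplace(0.0, scale, size=A.shape[0])

# Example: a histogram x over 4 classes and two linear queries
# (total count, and a contrast between the first two classes).
rng = np.random.default_rng(0)
x = np.array([120., 80., 40., 10.])
A = np.array([[1., 1., 1., 1.],
              [1., -1., 0., 0.]])
print(laplace_mechanism(x, A, epsilon=1.0, rng=rng))

In the paper's setting, a lower bound on the probability of any record belonging to any class would justify a smaller noise scale than the sensitivity/epsilon calibration above while meeting the same leakage target; the exact reduced scale comes from the derived PML bound and is not reproduced in this sketch.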
Similar Papers
A Tight Context-aware Privacy Bound for Histogram Publication
Cryptography and Security
Protects your data better when it's shared.
Privacy Mechanism Design based on Empirical Distributions
Cryptography and Security
Protects private data even when its source is unknown.
Privacy protection under the exposure of systems' prior information
Systems and Control
Keeps secret information safe from spies.