A Tight Context-aware Privacy Bound for Histogram Publication
By: Sara Saeidian, Ata Yavuzyılmaz, Leonhard Grosse, and more
Potential Business Impact:
Protects your data better when it's shared.
We analyze the privacy guarantees of the Laplace mechanism for releasing the histogram of a dataset through the lens of pointwise maximal leakage (PML). While differential privacy is commonly used to quantify privacy loss, it is a context-free definition that does not depend on the data distribution. In contrast, PML enables a more refined analysis by incorporating assumptions about the data distribution. We show that when the probability of each histogram bin is bounded away from zero, stronger privacy protection can be achieved for a fixed level of noise. Our results demonstrate the advantage of context-aware privacy measures and show that incorporating assumptions about the data can improve privacy-utility tradeoffs.
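To make the setting concrete, below is a minimal Python sketch of the mechanism the paper analyzes: Laplace noise added independently to each histogram bin. The function name, the unit-sensitivity assumption, and the example counts are illustrative choices, not taken from the paper; the paper's contribution, the PML bound under a lower bound on bin probabilities, is a property of this mechanism's output distribution and is not implemented here.

# Minimal sketch (not the authors' code) of the Laplace mechanism for
# histogram publication, assuming each bin count has sensitivity 1.
import numpy as np

def laplace_histogram(counts, epsilon, sensitivity=1.0, rng=None):
    """Release a noisy histogram by adding i.i.d. Laplace(sensitivity/epsilon)
    noise to every bin count."""
    rng = np.random.default_rng() if rng is None else rng
    counts = np.asarray(counts, dtype=float)
    scale = sensitivity / epsilon
    return counts + rng.laplace(loc=0.0, scale=scale, size=counts.shape)

# Example: a 4-bin histogram released with epsilon = 1 (illustrative values).
true_counts = np.array([120, 45, 30, 5])
noisy_counts = laplace_histogram(true_counts, epsilon=1.0)
print(noisy_counts)

For a fixed noise scale, the paper shows that the effective privacy guarantee measured by PML improves when every bin probability is assumed to be bounded away from zero, which is exactly the kind of distributional assumption a context-free analysis cannot exploit.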
Similar Papers
Context-aware Privacy Bounds for Linear Queries
Information Theory
Makes private data sharing safer with less guessing.
Privacy Mechanism Design based on Empirical Distributions
Cryptography and Security
Protects private data even when its source is unknown.
Privacy protection under the exposure of systems' prior information
Systems and Control
Keeps secret information safe from spies.