Context-aware Privacy Bounds for Linear Queries

Published: January 6, 2026 | arXiv ID: 2601.02855v1

By: Heng Zhao, Sara Saeidian, Tobias J. Oechtering

Potential Business Impact:

Lets organizations release private statistics with less added noise by using what is known about the data distribution.

Business Areas:
Privacy and Security

Linear queries, which underpin a broad range of analysis tasks, are commonly released through mechanisms based on differential privacy (DP), the most widely used framework for privacy protection. However, DP is a context-free definition that operates independently of the data-generating distribution. In this paper, we revisit the privacy analysis of the Laplace mechanism through the lens of pointwise maximal leakage (PML) and show that the distribution-agnostic DP definition often mandates excessive noise. To address this, we incorporate an assumption on the prior distribution by lower-bounding the probability that any single record belongs to any specific class. Under this assumption, we derive a tight, context-aware leakage bound for general linear queries and prove that it is strictly tighter than the standard DP guarantee, converging to the DP guarantee as the probability lower bound approaches zero. Numerical evaluations show that exploiting this prior knowledge reduces the required noise scale while maintaining the privacy guarantee.
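To make the setting concrete, below is a minimal Python sketch of the baseline the paper starts from: releasing a linear (counting) query with the Laplace mechanism calibrated to standard epsilon-DP. The function names (`linear_query`, `laplace_mechanism`) and the toy data are illustrative; the paper's tighter, PML-based noise calibration under a prior lower bound is specific to the paper and is not reproduced here.

```python
# Minimal sketch (not the paper's method): a standard epsilon-DP Laplace release
# of a linear query. The paper's PML analysis would permit a smaller noise scale
# once a lower bound on each record's class probability is assumed; that tighter
# calibration comes from the paper's derived bound and is not reproduced here.
import numpy as np

rng = np.random.default_rng(0)

def linear_query(records: np.ndarray, weights: np.ndarray) -> float:
    """Linear query q(x) = sum_i w_i * x_i over per-record values."""
    return float(weights @ records)

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Standard epsilon-DP release: add Laplace noise with scale sensitivity / epsilon."""
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

# Toy data: n binary records with unit weights, i.e. a counting query with sensitivity 1.
n = 1000
records = rng.integers(0, 2, size=n).astype(float)
weights = np.ones(n)

epsilon = 1.0
sensitivity = float(np.max(np.abs(weights)))  # worst-case change from altering one record

q = linear_query(records, weights)
release = laplace_mechanism(q, sensitivity, epsilon)
print(f"true value: {q:.0f}, DP release: {release:.1f}, noise scale: {sensitivity / epsilon:.2f}")
```

Per the abstract, the paper's context-aware leakage bound would allow the `scale` above to be reduced below `sensitivity / epsilon` once a lower bound on the per-record class probabilities is assumed, with the saving vanishing as that lower bound approaches zero.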

Country of Origin
πŸ‡ΈπŸ‡ͺ Sweden

Page Count
8 pages

Category
Computer Science:
Information Theory