
Setting $\varepsilon$ is not the Issue in Differential Privacy

Published: November 9, 2025 | arXiv ID: 2511.06305v1

By: Edwige Cyffers

Potential Business Impact:

Helps decision-makers interpret privacy budgets, supporting safer adoption of differential privacy over less rigorous data-protection methods.

Business Areas:
Privacy; Privacy and Security

This position paper argues that setting the privacy budget should not be viewed as an important limitation of differential privacy compared to alternative methods for privacy-preserving machine learning. The so-called problem of interpreting the privacy budget is often presented as a major hindrance to the wider adoption of differential privacy in real-world deployments and is sometimes used to promote alternative mitigation techniques for data protection. We believe this misleads decision-makers into choosing unsafe methods. We argue that the difficulty of interpreting privacy budgets does not stem from the definition of differential privacy itself, but from the intrinsic difficulty of estimating privacy risks in context, a challenge that any rigorous method for privacy risk assessment faces. Moreover, we claim that any sound method for estimating privacy risks should, given the current state of research, be expressible within the differential privacy framework or justify why it cannot be.
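For readers unfamiliar with where the privacy budget enters in practice, the minimal sketch below (not taken from the paper) shows the standard Laplace mechanism for a sensitivity-1 counting query: noise of scale $1/\varepsilon$ is added to the true count, so a smaller budget $\varepsilon$ means noisier answers and stronger protection. The data and query are illustrative assumptions.

```python
import numpy as np

def laplace_count(data, predicate, epsilon):
    """Release a counting query under epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one record
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon suffices for epsilon-DP.
    """
    true_count = sum(1 for x in data if predicate(x))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Illustrative data: smaller epsilon gives noisier output and stronger
# privacy. The paper's point is that choosing this number is a
# contextual risk-assessment question, not a flaw of the definition.
ages = [34, 29, 52, 41, 67, 23]
print(laplace_count(ages, lambda a: a >= 40, epsilon=0.5))
print(laplace_count(ages, lambda a: a >= 40, epsilon=5.0))
```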

Country of Origin
🇦🇹 Austria

Page Count
13 pages

Category
Computer Science:
Cryptography and Security