High-Probability Bounds For Heterogeneous Local Differential Privacy
By: Maryam Aliakbarpour, Alireza Fallah, Swaha Roy, and more
Potential Business Impact:
Protects your private info while still getting useful data.
We study statistical estimation under local differential privacy (LDP) when users may hold heterogeneous privacy levels and accuracy must be guaranteed with high probability. Departing from the common in-expectation analyses, we develop, for one-dimensional and multi-dimensional mean estimation problems, finite-sample upper bounds in $\ell_2$-norm that hold with probability at least $1-\beta$. We complement these results with matching minimax lower bounds, establishing the optimality (up to constants) of our guarantees in the heterogeneous LDP regime. We further study distribution learning in $\ell_\infty$-distance, designing an algorithm with high-probability guarantees under heterogeneous privacy demands. Our techniques offer principled guidance for designing mechanisms in settings with user-specific privacy levels.
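To make the heterogeneous setting concrete, here is a minimal sketch of one-dimensional LDP mean estimation where each user reports through the classical Laplace mechanism at their own privacy level $\varepsilon_i$, and reports are combined with inverse-noise-variance weights. This is an illustrative baseline, not the paper's mechanism: the function names are hypothetical, data are assumed bounded in $[0,1]$, and the weighting ignores the data variance term that an optimal combiner (and a high-probability analysis of heavy-tailed Laplace noise) would account for.

```python
import numpy as np

def ldp_release(x, eps, rng):
    """Privatize a value x in [0, 1] with the Laplace mechanism.

    The identity query on [0, 1] has sensitivity 1, so Laplace noise
    with scale 1/eps yields eps-LDP. (Hypothetical building block,
    not the mechanism from the paper.)
    """
    return x + rng.laplace(scale=1.0 / eps)

def heterogeneous_mean(xs, epsilons, seed=0):
    """Combine reports from users with heterogeneous privacy levels.

    Report y_i carries Laplace noise of variance 2 / eps_i^2, so
    weighting each report by eps_i^2 / 2 (inverse noise variance)
    downweights the noisier, more private users.
    """
    rng = np.random.default_rng(seed)
    ys = np.array([ldp_release(x, e, rng) for x, e in zip(xs, epsilons)])
    w = np.array([e**2 / 2.0 for e in epsilons])  # 1 / Var(noise)
    return float(np.sum(w * ys) / np.sum(w))

# Example: half the users demand strong privacy (eps = 0.5),
# half accept weaker privacy (eps = 4.0).
xs = np.random.default_rng(1).uniform(0.3, 0.7, size=1000)
epsilons = [0.5] * 500 + [4.0] * 500
print(heterogeneous_mean(xs, epsilons))
```

The sketch shows why heterogeneity matters: uniform weighting would let the high-privacy (high-noise) reports dominate the error, whereas variance-aware weighting exploits the less private users. Obtaining bounds that hold with probability $1-\beta$, rather than in expectation, requires additional care (e.g., clipping or truncation of the heavy-tailed noise), which is where the paper's analysis goes beyond this baseline.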
Similar Papers
Fundamental Limit of Discrete Distribution Estimation under Utility-Optimized Local Differential Privacy
Cryptography and Security
Keeps private info safe while learning from data.
Differential privacy with dependent data
Machine Learning (Stat)
Protects private data when studying groups of people.