Privacy Guarantee for Nash Equilibrium Computation of Aggregative Games Based on Pointwise Maximal Leakage
By: Zhaoyang Cheng, Guanpu Chen, Tobias J. Oechtering, et al.
Potential Business Impact:
Protects players' secrets more tightly than older privacy methods.
Privacy preservation has served as a key metric in designing Nash equilibrium (NE) computation algorithms. Although differential privacy (DP) has been widely employed for privacy guarantees, it does not exploit prior distributional knowledge of datasets and is ineffective in assessing information leakage for correlated datasets. To address these concerns, we establish a pointwise maximal leakage (PML) framework for computing NE in aggregative games. By incorporating prior knowledge of players' cost function datasets, we obtain a precise and computable upper bound on privacy leakage with PML guarantees. From an overall perspective, we show that PML refines DP by offering a tighter privacy guarantee, enabling flexibility in designing NE computation. From an individual perspective, we reveal that the lower bound of PML can exceed the upper bound of DP by constructing specific correlated datasets. These results emphasize that PML is a more suitable privacy measure than DP, since the latter fails to adequately capture privacy leakage in correlated datasets. Moreover, we conduct experiments with adversaries who attempt to infer players' private information to illustrate the effectiveness of our framework.
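To make the PML-vs-DP comparison concrete, here is a minimal sketch (not the paper's algorithm) that computes pointwise maximal leakage for a binary randomized-response mechanism. It uses the standard PML definition for a discrete prior P_X and channel P_{Y|X}, namely log of the maximum likelihood of the outcome divided by its marginal probability; the uniform prior and the parameter `eps` are illustrative assumptions.

```python
import math

def pointwise_maximal_leakage(prior, channel, y):
    """PML of outcome y: log( max_x P(y|x) / P(y) ), maximizing over
    x in the support of the prior.  prior: {x: P_X(x)};
    channel: {x: {y: P_{Y|X}(y|x)}}."""
    p_y = sum(prior[x] * channel[x][y] for x in prior)  # marginal P(y)
    max_lik = max(channel[x][y] for x in prior if prior[x] > 0)
    return math.log(max_lik / p_y)

# Binary randomized response calibrated to eps-DP: report the true bit
# with probability e^eps / (1 + e^eps), flip it otherwise.
eps = 1.0
p = math.exp(eps) / (1.0 + math.exp(eps))
channel = {0: {0: p, 1: 1 - p}, 1: {0: 1 - p, 1: p}}
prior = {0: 0.5, 1: 0.5}  # assumed uniform prior knowledge

leak = pointwise_maximal_leakage(prior, channel, 0)
# For any prior, PML never exceeds the DP parameter eps;
# with a uniform prior it is strictly smaller (a tighter guarantee).
print(f"PML = {leak:.4f} nats  vs  DP bound eps = {eps}")
```

With a uniform prior the marginal is 1/2, so the leakage reduces to log(2p), which is strictly below eps; this is the sense in which PML, by exploiting prior knowledge, refines the worst-case DP bound.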
Similar Papers
Evaluating Differential Privacy on Correlated Datasets Using Pointwise Maximal Leakage
Cryptography and Security
Shows that linked (correlated) data can make private data less safe.
Privacy Mechanism Design based on Empirical Distributions
Cryptography and Security
Protects private data even when its source is unknown.
Context-aware Privacy Bounds for Linear Queries
Information Theory
Makes private data sharing safer by using context about the queries.