Dual-View Inference Attack: Machine Unlearning Amplifies Privacy Exposure
By: Lulu Xue, Shengshan Hu, Linqiang Qian, and more
Potential Business Impact:
Makes AI forget data, but can reveal other secrets.
Machine unlearning is a newly popularized technique for removing specific training data from a trained model, enabling it to comply with data deletion requests. While it protects the rights of users requesting unlearning, it also introduces new privacy risks. Prior works have primarily focused on the privacy of data that has been unlearned, while the risks to retained data remain largely unexplored. To address this gap, we focus on the privacy risks of retained data and, for the first time, reveal the vulnerabilities introduced by machine unlearning under the dual-view setting, where an adversary can query both the original and the unlearned models. From an information-theoretic perspective, we introduce the concept of "privacy knowledge gain" and demonstrate that the dual-view setting allows adversaries to obtain more information than querying either model alone, thereby amplifying privacy leakage. To effectively demonstrate this threat, we propose DVIA, a Dual-View Inference Attack, which extracts membership information on retained data using black-box queries to both models. DVIA eliminates the need to train an attack model and employs a lightweight likelihood ratio inference module for efficient inference. Experiments across different datasets and model architectures validate the effectiveness of DVIA and highlight the privacy risks inherent in the dual-view setting.
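The abstract's core idea, combining black-box confidence scores from the original and unlearned models through a likelihood ratio test, can be sketched as follows. This is a minimal illustrative sketch, not the paper's DVIA implementation: it assumes the attacker summarizes each sample by its logit-transformed confidence under both models and models the two-view score as an independent Gaussian under the member and non-member hypotheses (the `member_stats` / `nonmember_stats` calibration values are hypothetical and would be estimated offline, e.g. from shadow data).

```python
import numpy as np

def logit_confidence(p, eps=1e-6):
    # Stabilized logit of the model's confidence on the true label,
    # a standard membership-inference signal.
    p = np.clip(p, eps, 1 - eps)
    return np.log(p / (1 - p))

def dual_view_score(conf_orig, conf_unlearned, member_stats, nonmember_stats):
    """Illustrative dual-view membership score (not the paper's exact module).

    conf_orig / conf_unlearned: confidences on the query sample from
    black-box queries to the original and unlearned models.
    member_stats / nonmember_stats: (mean, std) arrays for the two-view
    score under each hypothesis, calibrated offline (assumed here).
    Returns a log-likelihood ratio: larger => more likely the sample
    is in the retained set.
    """
    s = np.array([logit_confidence(conf_orig),
                  logit_confidence(conf_unlearned)])
    mu_in, sd_in = member_stats
    mu_out, sd_out = nonmember_stats
    # Independent-Gaussian log-likelihood of the joint (dual-view) score
    # under each hypothesis; the difference is the test statistic.
    ll_in = -0.5 * np.sum(((s - mu_in) / sd_in) ** 2 + 2 * np.log(sd_in))
    ll_out = -0.5 * np.sum(((s - mu_out) / sd_out) ** 2 + 2 * np.log(sd_out))
    return ll_in - ll_out

# A retained sample typically stays high-confidence in both views,
# while a non-member stays low in both; the dual view separates them
# more sharply than either view alone.
score = dual_view_score(
    conf_orig=0.95, conf_unlearned=0.93,
    member_stats=(np.array([3.0, 3.0]), np.array([1.0, 1.0])),
    nonmember_stats=(np.array([-1.0, -1.0]), np.array([1.5, 1.5])),
)
print(score > 0)  # flagged as a likely member of the retained set
```

The point of the two-dimensional score is exactly the "privacy knowledge gain" claim: a sample confidently predicted by both models is more distinguishable from non-members than one scored by a single model, so querying both views leaks strictly more membership information.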
Similar Papers
Evaluating the Defense Potential of Machine Unlearning against Membership Inference Attacks
Cryptography and Security
Makes AI forget data, but still vulnerable.
Auditing Approximate Machine Unlearning for Differentially Private Models
Machine Learning (CS)
Protects secrets in computers even after removing data.