Fine-grained Manipulation Attacks to Local Differential Privacy Protocols for Data Streams
By: Xinyu Li, Xuebin Ren, Shusen Yang, and more
Potential Business Impact:
Shows how private data collection can be tricked, and how to defend it.
Local Differential Privacy (LDP) enables massive data collection and analysis while protecting end users' privacy against untrusted aggregators. It has been applied to various data types (e.g., categorical, numerical, and graph data) and application settings (e.g., static and streaming). Recent findings indicate that LDP protocols can be easily disrupted by poisoning or manipulation attacks, which leverage injected or corrupted fake users to send crafted data conforming to the LDP reports. However, current attacks primarily target static protocols, neglecting the security of LDP protocols in streaming settings. Our research fills this gap by developing novel fine-grained manipulation attacks on LDP protocols for data streams. By reviewing the attack surfaces in existing algorithms, we introduce a unified attack framework with composable modules, which can manipulate the LDP estimated stream toward a target stream. Our attack framework can adapt to state-of-the-art streaming LDP algorithms with different analytic tasks (e.g., frequency and mean) and LDP models (event-level, user-level, and w-event-level). We validate our attacks theoretically and through extensive experiments on real-world datasets, and finally explore a possible defense mechanism for mitigating these attacks.
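To make the attack surface concrete, here is a minimal sketch of a classic static LDP frequency protocol, generalized randomized response (GRR), and an output-poisoning manipulation in which fake users skip the perturbation and submit a target item directly. All names, parameters, and the uniform genuine-user distribution are illustrative assumptions; the paper's fine-grained streaming attacks are considerably more elaborate than this toy example.

```python
import math
import random

def grr_report(value, domain, eps, rng):
    """GRR perturbation: keep the true value with probability
    p = e^eps / (e^eps + d - 1), else report a uniform other value."""
    d = len(domain)
    p = math.exp(eps) / (math.exp(eps) + d - 1)
    if rng.random() < p:
        return value
    return rng.choice([v for v in domain if v != value])

def grr_estimate(reports, domain, eps):
    """Unbiased frequency estimates from GRR reports:
    f_hat(v) = (count_v / n - q) / (p - q)."""
    d, n = len(domain), len(reports)
    e = math.exp(eps)
    p, q = e / (e + d - 1), 1.0 / (e + d - 1)
    counts = {v: 0 for v in domain}
    for r in reports:
        counts[r] += 1
    return {v: (counts[v] / n - q) / (p - q) for v in domain}

rng = random.Random(0)
domain = ["A", "B", "C", "D"]  # illustrative categorical domain
eps = 1.0                      # illustrative privacy budget

# Genuine users hold uniformly distributed values and follow the protocol.
genuine = [grr_report(rng.choice(domain), domain, eps, rng)
           for _ in range(10_000)]
clean_est = grr_estimate(genuine, domain, eps)

# Output-poisoning attack: fake users bypass perturbation and always
# submit the target item "A", inflating the aggregator's estimate.
fake = ["A"] * 1_000
poisoned_est = grr_estimate(genuine + fake, domain, eps)
```

Because each report is a single symbol from the domain, the aggregator cannot tell a crafted report from an honest one, which is exactly the weakness the attacks in this paper exploit across successive timestamps of a stream.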
Similar Papers
Mitigating Data Poisoning Attacks to Local Differential Privacy
Cryptography and Security
Protects private data collection from attackers trying to cheat the system.
Poisoning Attacks to Local Differential Privacy Protocols for Trajectory Data
Cryptography and Security
Shows how fake location data can fool privacy tools.
Data Poisoning Attacks to Locally Differentially Private Range Query Protocols
Cryptography and Security
Makes private data collection easier to trick.