Cooperative Local Differential Privacy: Securing Time Series Data in Distributed Environments
By: Bikash Chandra Singh, Md Jakir Hossain, Rafael Diaz, et al.
Potential Business Impact:
Keeps your personal data private when shared.
The rapid growth of smart devices such as phones, wearables, IoT sensors, and connected vehicles has led to an explosion of continuous time series data that offers valuable insights in healthcare, transportation, and other domains. This surge, however, raises significant privacy concerns, since sensitive patterns in the data can reveal personal details. While traditional differential privacy (DP) relies on a trusted server, local differential privacy (LDP) lets each user perturb their own data before sharing it. Yet conventional LDP methods, which perturb time series by adding user-specific noise, exhibit vulnerabilities: noise applied within fixed time windows can cancel during aggregation (e.g., averaging), allowing adversaries to infer individual statistics over time and eroding the privacy guarantees. To address these issues, we introduce a Cooperative Local Differential Privacy (CLDP) mechanism that strengthens privacy by distributing noise vectors across multiple users. In our approach, noise is collaboratively generated and assigned so that, when all users' perturbed data are aggregated, the noise cancels out, preserving overall statistical properties while protecting individual privacy. This cooperative strategy not only counters the vulnerabilities inherent in time-window-based methods but also scales effectively to large, real-time datasets, striking a better balance between data utility and privacy in multiuser environments.
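The core idea of cooperative noise cancellation can be illustrated with a minimal sketch: each user adds noise to their time series, but the noise vectors are constructed so they sum to zero across users at every time step. The zero-sum construction below (subtracting the column-wise mean of independently drawn Laplace noise) is an illustrative assumption, not the paper's exact assignment protocol; the array shapes and parameter values are likewise hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_steps = 5, 8  # hypothetical number of users and time steps

# Each user's true time series (synthetic example data).
data = rng.normal(50.0, 10.0, size=(n_users, n_steps))

# Draw per-user noise, then subtract the column-wise mean so the
# noise sums to (numerically) zero across users at every time step.
# This is one simple zero-sum construction; CLDP's actual
# collaborative noise-generation scheme may differ.
noise = rng.laplace(0.0, 5.0, size=(n_users, n_steps))
noise -= noise.mean(axis=0)

perturbed = data + noise

# Individual series are perturbed, yet the cross-user average
# at each time step is preserved.
assert not np.allclose(perturbed, data)
assert np.allclose(perturbed.mean(axis=0), data.mean(axis=0))
```

Note the contrast with per-user LDP: here no single user's noise needs to be small for the aggregate to stay accurate, since utility comes from the zero-sum property rather than from noise averaging out statistically.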
Similar Papers
Local Distance Query with Differential Privacy
Cryptography and Security
Keeps your online map private when finding directions.
Practical Implications of Implementing Local Differential Privacy for Smart grids
Cryptography and Security
Keeps your power use private from the electric company.
Local Differential Privacy for Federated Learning with Fixed Memory Usage and Per-Client Privacy
Cryptography and Security
Keeps private data safe while training AI.