Cooperative Local Differential Privacy: Securing Time Series Data in Distributed Environments

Published: November 12, 2025 | arXiv ID: 2511.09696v1

By: Bikash Chandra Singh, Md Jakir Hossain, Rafael Diaz, and more

Potential Business Impact:

Keeps personal time series data (e.g., from wearables and sensors) private when it is shared, while still supporting useful aggregate statistics.

Business Areas:
Cloud Security, Information Technology, Privacy and Security

The rapid growth of smart devices such as phones, wearables, IoT sensors, and connected vehicles has led to an explosion of continuous time series data that offers valuable insights in healthcare, transportation, and other domains. This surge, however, raises significant privacy concerns, as sensitive patterns in the data can reveal personal details. Traditional differential privacy (DP) relies on a trusted server, whereas local differential privacy (LDP) lets users perturb their own data before sharing it. Existing LDP methods perturb time series data by adding user-specific noise, but they exhibit vulnerabilities: noise applied within fixed time windows can be canceled during aggregation (e.g., averaging), enabling adversaries to infer individual statistics over time and eroding the privacy guarantees. To address these issues, we introduce a Cooperative Local Differential Privacy (CLDP) mechanism that enhances privacy by distributing noise vectors across multiple users. In our approach, noise is collaboratively generated and assigned so that when all users' perturbed data is aggregated, the noise cancels out, preserving overall statistical properties while protecting individual privacy. This cooperative strategy not only counters the vulnerabilities inherent in time-window-based methods but also scales effectively to large, real-time datasets, striking a better balance between data utility and privacy in multiuser environments.
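
To make the cancellation idea concrete, below is a minimal NumPy sketch of zero-sum noise sharing across users. The Gaussian noise, the mean-centering trick, and all variable names are illustrative assumptions for this sketch only; the paper's actual CLDP protocol, which the abstract describes as collaboratively generating and assigning noise vectors, is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def zero_sum_noise(num_users: int, num_steps: int, scale: float) -> np.ndarray:
    """Draw one noise vector per user such that, at every time step, the
    noise across users sums to (approximately) zero. This sketch centers
    Gaussian noise by subtracting the per-step mean; the paper's mechanism
    may use a different distribution and a distributed generation protocol."""
    noise = rng.normal(loc=0.0, scale=scale, size=(num_users, num_steps))
    return noise - noise.mean(axis=0, keepdims=True)  # each column now sums to ~0

# Hypothetical toy data: 5 users, each holding a 10-step time series
# (e.g., heart-rate readings from a wearable).
num_users, num_steps = 5, 10
true_series = rng.uniform(60, 100, size=(num_users, num_steps))

noise = zero_sum_noise(num_users, num_steps, scale=15.0)
perturbed = true_series + noise  # each user releases only their own perturbed row

# Individual rows are masked by noise, but the aggregate is preserved:
print(np.allclose(perturbed.mean(axis=0), true_series.mean(axis=0)))  # True
```

Because every user's released row carries nonzero noise, an observer of a single stream cannot recover that user's exact values, yet the per-step average over all participating users matches the unperturbed average, which is the utility-preserving cancellation the abstract describes.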

Country of Origin
🇺🇸 United States

Page Count
8 pages

Category
Computer Science:
Cryptography and Security