Differential Privacy of Network Parameters from a System Identification Perspective
By: Andrew Campbell, Anna Scaglione, Hang Liu, and more
Potential Business Impact:
Keeps secret data safe from spies during computer tests.
This paper addresses the problem of protecting network information from system identification (SI) privacy attacks when sharing cyber-physical system simulations. We model analyst observations of networked states as time-series outputs of a graph filter driven by differentially private (DP) nodal excitations, with the analyst aiming to infer the underlying graph shift operator (GSO). Unlike traditional SI, which estimates system parameters, we study the inverse problem: under what assumptions are adversaries prevented from identifying the GSO while utility is preserved for legitimate analysis. We show that applying DP mechanisms to the inputs provides formal privacy guarantees for the GSO, linking the $(\epsilon,\delta)$-DP bound to the spectral properties of the graph filter and the noise covariance. More precisely, for DP Gaussian signals, the spectral characteristics of both the filter and the noise covariance determine the privacy bound, with smooth filters and low-condition-number covariance yielding greater privacy.
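The setup above can be sketched in a few lines: nodal excitations are perturbed with Gaussian-mechanism noise before passing through a polynomial graph filter $H(S) = \sum_k h_k S^k$ of the GSO $S$, so the analyst only ever observes outputs driven by DP inputs. This is a minimal illustrative sketch, not the paper's method; the noise calibration, filter coefficients, and graph are assumed for the example.

```python
import numpy as np

def gaussian_mechanism(x, sensitivity, epsilon, delta, rng):
    """Add Gaussian noise calibrated for (epsilon, delta)-DP.

    Uses the classical calibration sigma = sensitivity *
    sqrt(2 ln(1.25/delta)) / epsilon (an assumption for this sketch,
    not the bound derived in the paper).
    """
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return x + rng.normal(0.0, sigma, size=x.shape)

def graph_filter(S, coeffs, x):
    """Apply a polynomial graph filter y = sum_k h_k S^k x."""
    y = np.zeros_like(x)
    Sk = np.eye(S.shape[0])
    for h in coeffs:
        y = y + h * (Sk @ x)
        Sk = Sk @ S
    return y

rng = np.random.default_rng(0)
n = 5
# Symmetric adjacency matrix used as the graph shift operator (GSO).
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.triu(A, 1)
A = A + A.T

x = rng.normal(size=n)                       # private nodal excitation
x_dp = gaussian_mechanism(x, sensitivity=1.0,
                          epsilon=1.0, delta=1e-5, rng=rng)
y = graph_filter(A, [1.0, 0.5, 0.25], x_dp)  # what the analyst observes
print(y.shape)
```

An adversary running SI on `y` must work through both the filter's spectral shaping and the injected noise, which is the regime the paper's $(\epsilon,\delta)$ bound characterizes.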
Similar Papers
Spectral Graph Clustering under Differential Privacy: Balancing Privacy, Accuracy, and Efficiency
Information Theory
Keeps online connections private while still working.
Comparing privacy notions for protection against reconstruction attacks in machine learning
Machine Learning (CS)
Compares privacy methods for safer AI learning.
Interpreting Network Differential Privacy
Statistics Theory
Protects online privacy by fixing how data is shared.