Privacy in Federated Learning with Spiking Neural Networks
By: Dogukan Aksu, Jesus Martinez del Rincon, Ihsen Alouani
Potential Business Impact:
Keeps private data safe when computers learn.
Spiking neural networks (SNNs) have emerged as prominent candidates for embedded and edge AI. Their inherent low power consumption makes them far more efficient than conventional ANNs in scenarios where energy budgets are tightly constrained. In parallel, federated learning (FL) has become the prevailing training paradigm in such settings, enabling on-device learning while limiting the exposure of raw data. However, gradient inversion attacks represent a critical privacy threat in FL, where sensitive training data can be reconstructed directly from shared gradients. While this vulnerability has been widely investigated in conventional ANNs, its implications for SNNs remain largely unexplored. In this work, we present the first comprehensive empirical study of gradient leakage in SNNs across diverse data domains. Because the spiking activation is non-differentiable, SNNs are typically trained using surrogate gradients, which we hypothesize are less correlated with the original input and thus less informative from a privacy perspective. To investigate this, we adapt several established gradient leakage attacks to the spike domain. Our experiments reveal a striking contrast with conventional ANNs: whereas ANN gradients reliably expose salient input content, SNN gradients yield noisy, temporally inconsistent reconstructions that fail to recover meaningful spatial or temporal structure. These results indicate that the combination of event-driven dynamics and surrogate-gradient training substantially reduces gradient informativeness. To the best of our knowledge, this work provides the first systematic benchmark of gradient inversion attacks for spiking architectures, highlighting the inherent privacy-preserving potential of neuromorphic computation.
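For context, here is a minimal sketch of the attack family the abstract refers to, in the style of Deep Leakage from Gradients (DLG, Zhu et al. 2019): the attacker jointly optimizes a dummy input and a dummy soft label so that the gradient they induce on the model matches the gradient shared during FL. The function name, shapes, optimizer settings, and step count below are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def gradient_inversion(model, leaked_grads, input_shape, num_classes, steps=300):
    """DLG-style attack sketch: recover a training example from a shared
    gradient by matching the gradient induced by a dummy (input, label)
    pair to the leaked gradient."""
    dummy_x = torch.randn(1, *input_shape, requires_grad=True)
    dummy_y = torch.randn(1, num_classes, requires_grad=True)
    opt = torch.optim.LBFGS([dummy_x, dummy_y])

    def closure():
        opt.zero_grad()
        logits = model(dummy_x)
        # Soft-label cross-entropy, as in the original DLG formulation.
        loss = torch.sum(-F.softmax(dummy_y, dim=-1) * F.log_softmax(logits, dim=-1))
        grads = torch.autograd.grad(loss, model.parameters(), create_graph=True)
        # Objective: squared L2 distance between dummy and leaked gradients.
        grad_diff = sum(((g - lg) ** 2).sum() for g, lg in zip(grads, leaked_grads))
        grad_diff.backward()
        return grad_diff

    for _ in range(steps):
        opt.step(closure)
    return dummy_x.detach()
```

Against a conventional ANN this optimization typically converges to a recognizable reconstruction of the input; the paper's finding is that the same objective, driven by an SNN's surrogate gradients, yields only noisy, temporally inconsistent reconstructions.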
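The surrogate-gradient mechanism the abstract credits with reducing gradient informativeness can likewise be sketched: the forward pass keeps the hard, non-differentiable Heaviside spike, while the backward pass substitutes a smooth pseudo-derivative (here a fast-sigmoid shape in the style of SuperSpike). The threshold, slope, membrane decay, and soft reset below are illustrative choices, not the paper's configuration.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; a smooth fast-sigmoid
    pseudo-derivative in the backward pass."""

    @staticmethod
    def forward(ctx, v, threshold=1.0, slope=10.0):
        ctx.save_for_backward(v)
        ctx.threshold, ctx.slope = threshold, slope
        return (v >= threshold).float()  # non-differentiable spike

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Surrogate derivative: 1 / (1 + slope * |v - threshold|)^2.
        sg = 1.0 / (1.0 + ctx.slope * (v - ctx.threshold).abs()) ** 2
        return grad_output * sg, None, None

def lif_forward(inputs, beta=0.9):
    """Leaky integrate-and-fire neuron unrolled over time.
    inputs: (T, batch, features) tensor of input currents."""
    spike = SurrogateSpike.apply
    v = torch.zeros_like(inputs[0])
    spikes = []
    for x_t in inputs:
        v = beta * v + x_t   # leaky membrane integration
        s = spike(v)         # spikes flow forward; surrogate grads flow back
        v = v - s            # soft reset after a spike
        spikes.append(s)
    return torch.stack(spikes)
```

Because the gradient reaching the input is a product of these surrogate terms accumulated across time steps, rather than the true derivative of the spiking dynamics, it carries a distorted signal about the input; this is the hypothesized source of the weaker leakage the experiments observe.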
Similar Papers
Accuracy-Robustness Trade Off via Spiking Neural Network Gradient Sparsity Trail
Neural and Evolutionary Computing
Makes computer brains tougher against tricks.
Hybrid Layer-Wise ANN-SNN With Surrogate Spike Encoding-Decoding Structure
Neural and Evolutionary Computing
Makes smart computers use less power.
Spiking Neural Networks: The Future of Brain-Inspired Computing
Neural and Evolutionary Computing
Makes computers use less power to think.