Enhancing Gradient Variance and Differential Privacy in Quantum Federated Learning
By: Duc-Thien Phan, Minh-Duong Nguyen, Quoc-Viet Pham, and more
Potential Business Impact:
Makes AI learn better and safer with less noise.
Upon integrating Quantum Neural Networks (QNNs) as local models, Quantum Federated Learning (QFL) faces notable challenges. First, exploration around sharp minima is hindered, degrading learning performance. Second, the steady gradient descent produces stable, predictable model transmissions over wireless channels, making the model more susceptible to attacks from adversarial entities. Additionally, the local QFL model is vulnerable to noise produced by the quantum device's intermediate noise states, since training requires quantum gates and circuits. This local noise becomes intertwined with the learning parameters during training, impairing model precision and convergence rate. To address these issues, we propose a new QFL technique that incorporates differential privacy and introduces a dedicated noise estimation strategy to quantify and mitigate the impact of intermediate quantum noise. Furthermore, we design an adaptive noise generation scheme that alleviates the privacy threats associated with the vanishing gradient variance phenomenon of QNNs and enhances robustness against device noise. Experimental results demonstrate that our algorithm balances convergence, reduces communication costs, and mitigates the adverse effects of intermediate quantum noise while maintaining strong privacy protection. On real-world datasets, we achieve test accuracies of up to 98.47% on MNIST and 83.85% on CIFAR-10 while maintaining fast execution times.
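To make the privacy mechanism concrete, here is a minimal sketch of one plausible reading of the adaptive noise idea: clip a client's gradient to bound sensitivity (as in standard DP-SGD), then scale the Gaussian noise inversely with the empirical gradient variance, so that when the variance vanishes and transmissions would become predictable, the injected noise grows. The function name, the inverse-variance rule, and all constants (`clip_norm`, `base_sigma`, `var_floor`) are illustrative assumptions, not the paper's actual scheme.

```python
import numpy as np

def adaptive_dp_noise(grad, clip_norm=1.0, base_sigma=0.5, var_floor=1e-3):
    """Hypothetical adaptive Gaussian mechanism: clip the gradient, then add
    noise whose scale grows as the empirical gradient variance shrinks."""
    # Clip to bound per-client sensitivity (standard DP-SGD step).
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / max(norm, 1e-12))
    # Empirical variance across gradient components; when it vanishes,
    # inflate the noise scale so transmitted updates stay unpredictable.
    g_var = max(float(np.var(clipped)), var_floor)
    sigma = base_sigma * clip_norm / np.sqrt(g_var)
    noisy = clipped + np.random.normal(0.0, sigma, size=clipped.shape)
    return noisy, sigma
```

A gradient with near-zero variance (the vanishing-variance regime the abstract warns about) thus receives a larger noise scale than a healthy gradient, which is the intended privacy-preserving behavior.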
Similar Papers
Differentially Private Federated Quantum Learning via Quantum Noise
Quantum Physics
Protects secret quantum computer training data.
Scaling Trust in Quantum Federated Learning: A Multi-Protocol Privacy Design
Cryptography and Security
Keeps private data safe during AI training.
Noise-Resilient Quantum Aggregation on NISQ for Federated ADAS Learning
Machine Learning (CS)
Cars learn together safely, even with bad connections.