Towards Heterogeneous Quantum Federated Learning: Challenges and Solutions
By: Ratun Rahman, Dinh C. Nguyen, Christo Kurisummoottil Thomas and more
Potential Business Impact:
Makes quantum computers learn together better.
Quantum federated learning (QFL) combines quantum computing and federated learning to enable decentralized model training while maintaining data privacy. QFL can improve computational efficiency and scalability by taking advantage of quantum properties such as superposition and entanglement. However, existing QFL frameworks largely assume homogeneity among quantum clients and do not account for real-world variances in quantum data distributions, encoding techniques, hardware noise levels, and computational capacity. These differences can create instability during training, slow convergence, and reduce overall model performance. In this paper, we conduct an in-depth examination of heterogeneity in QFL, classifying it into two categories: data heterogeneity and system heterogeneity. We then investigate the influence of heterogeneity on training convergence and model aggregation. We critically evaluate existing mitigation solutions, highlight their limitations, and present a case study that demonstrates the viability of tackling quantum heterogeneity. Finally, we discuss potential future research areas for constructing robust and scalable heterogeneous QFL frameworks.
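To make the aggregation problem concrete, the following is a minimal sketch of heterogeneity-aware federated averaging. It is not the paper's method: the weighting heuristic (scaling each client's contribution by its data size and discounting by its hardware noise level) and all names and values are illustrative assumptions.

```python
# Hedged sketch: server-side aggregation of client model parameters under
# data heterogeneity (unequal data sizes) and system heterogeneity
# (unequal quantum hardware noise). The noise-discount weighting is an
# illustrative heuristic, not the scheme proposed in the paper.

def aggregate(client_params, data_sizes, noise_levels):
    """Average client parameter vectors, weighting each client by its
    data size and discounting by (1 - noise_level) so updates from
    noisier quantum devices count for less."""
    weights = [n * (1.0 - eps) for n, eps in zip(data_sizes, noise_levels)]
    total = sum(weights)
    dim = len(client_params[0])
    global_params = [0.0] * dim
    for params, w in zip(client_params, weights):
        for i in range(dim):
            global_params[i] += (w / total) * params[i]
    return global_params

if __name__ == "__main__":
    # Three hypothetical clients with unequal data sizes and noise levels.
    params = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
    sizes = [100, 50, 50]
    noise = [0.0, 0.5, 0.5]  # fraction in [0, 1); higher = noisier device
    print(aggregate(params, sizes, noise))  # → [2.0, 3.0]
```

With equal weights, plain averaging would give [3.0, 4.0]; the size- and noise-aware weights pull the global model toward the large, low-noise client, which is the basic lever most heterogeneity-mitigation schemes adjust.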
Similar Papers
Quantum Federated Learning: A Comprehensive Survey
Machine Learning (CS)
Lets computers learn secrets without sharing data.
Quantum Federated Learning: Architectural Elements and Future Directions
Quantum Physics
Makes computers learn faster and safer together.
When Federated Learning Meets Quantum Computing: Survey and Research Opportunities
Distributed, Parallel, and Cluster Computing
Makes learning faster and safer for computers.