Communication Efficient Adaptive Model-Driven Quantum Federated Learning
By: Dev Gurung, Shiva Raj Pokhrel
Potential Business Impact:
Makes AI learn faster with less data.
Training on huge datasets with a large number of participating devices creates bottlenecks in federated learning (FL). Heterogeneity across FL clients further degrades overall system performance. In a quantum federated learning (QFL) context, we address three main challenges: i) training bottlenecks from massive datasets, ii) the involvement of a substantial number of devices, and iii) non-IID data distributions. We introduce a model-driven quantum federated learning algorithm (mdQFL) to tackle these challenges. The proposed approach is efficient and adapts to various factors, including the number of participating devices. To the best of our knowledge, it is the first to explore training and update personalization, as well as test generalization, in a QFL setting; the techniques also carry over to other FL scenarios. We evaluated the efficiency of the proposed mdQFL framework through extensive experiments under diverse non-IID data heterogeneity conditions, using various datasets within the Qiskit environment. Our results demonstrate a nearly 50% reduction in total communication cost while matching or, in some cases, exceeding the final model's accuracy and consistently improving local model training compared to the standard QFL baseline. Moreover, our experimental evaluation thoroughly examines the QFL and mdQFL algorithms along with several influencing factors. In addition, we present a theoretical analysis clarifying the complexity of the proposed algorithm. The experimental code is available at [1].
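The abstract does not spell out the mdQFL update rule, but the reported communication savings suggest a model-driven skip criterion: a client uploads its local model only when it has drifted meaningfully from the global one. The sketch below is an illustrative, classical simulation of that idea under our own assumptions (the `local_train` step, the drift threshold, and the FedAvg-style average are all hypothetical placeholders, not the authors' algorithm):

```python
import math

def local_train(w, grad, lr=0.1):
    # Hypothetical local step: one plain gradient-descent update on a
    # client's loss (stands in for local quantum-model training).
    return [wi - lr * gi for wi, gi in zip(w, grad)]

def dist(a, b):
    # Euclidean distance between two parameter vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def mdqfl_round(global_w, client_grads, threshold):
    """One illustrative round: each client uploads its local model only
    when it has drifted past `threshold` from the global model (a
    model-driven skip rule that cuts communication); the server then
    averages the received updates, FedAvg-style."""
    updates = []
    for g in client_grads:
        local_w = local_train(global_w, g)
        if dist(local_w, global_w) > threshold:  # drift large enough?
            updates.append(local_w)              # yes: spend an upload
    if updates:
        global_w = [sum(col) / len(updates) for col in zip(*updates)]
    return global_w, len(updates)

# Three simulated clients: one nearly converged (tiny gradient), two not.
w0 = [0.0, 0.0]
grads = [[0.01, 0.01], [1.0, -0.5], [0.8, 0.9]]
w1, uploads = mdqfl_round(w0, grads, threshold=0.05)
print(uploads)  # → 2 (the near-converged client skips its upload)
```

Only two of the three clients communicate in this round, which is the kind of per-round saving that, compounded over training, could plausibly account for the ~50% reduction in total communication cost reported above.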
Similar Papers
Quantum Federated Learning: Architectural Elements and Future Directions
Quantum Physics
Makes computers learn faster and safer together.
Enhancing Communication Efficiency in FL with Adaptive Gradient Quantization and Communication Frequency Optimization
Distributed, Parallel, and Cluster Computing
Makes phones train AI without sharing private info.
Towards Heterogeneous Quantum Federated Learning: Challenges and Solutions
Quantum Physics
Makes quantum computers learn together better.