Layerwise Federated Learning for Heterogeneous Quantum Clients using Quorus
By: Jason Han, Nicholas S. DiBrita, Daniel Leeds, and others
Potential Business Impact:
Trains AI on different, faulty quantum computers.
Quantum machine learning (QML) promises to solve classically intractable problems, but because critical data can be fragmented across private clients, distributed QML in a quantum federated learning (QFL) setting is needed. However, the quantum computers different clients can access may be error-prone, with heterogeneous error properties that force them to run circuits of different depths. We propose Quorus, a novel solution to this QFL problem that uses a layerwise loss function to effectively train varying-depth quantum models, allowing each client to choose a model for high-fidelity output based on its individual capacity. Quorus also offers several model designs tailored to client needs, optimizing for shot budget, qubit count, mid-circuit measurement, and optimization space. Our simulation and real-hardware results show the promise of Quorus: it increases the gradient magnitudes of higher-depth clients and improves test accuracy by 12.4% on average over the state of the art.
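The abstract does not spell out the layerwise loss, but the idea it describes, training varying-depth models by attaching a loss to each layer's output, can be sketched classically. The following is a minimal illustration under assumed details: each layer (e.g. via mid-circuit measurement) yields an intermediate prediction, the total loss averages per-layer losses, and a shallow client simply truncates to the layers its hardware can run. The function name `layerwise_loss` and the equal layer weighting are illustrative assumptions, not the paper's definition.

```python
import numpy as np

def layerwise_loss(per_layer_outputs, target, weights=None):
    """Combine losses from each layer's intermediate prediction.

    per_layer_outputs: list of arrays, one prediction per circuit layer
    target: ground-truth array
    weights: optional per-layer weights (assumed uniform here)
    """
    n_layers = len(per_layer_outputs)
    if weights is None:
        weights = np.full(n_layers, 1.0 / n_layers)  # assumption: uniform weighting
    # Mean-squared error at each layer's measurement point
    losses = [np.mean((out - target) ** 2) for out in per_layer_outputs]
    return float(np.dot(weights, losses))

# Hypothetical intermediate predictions from a depth-4 model
outputs = [np.array([0.4]), np.array([0.6]), np.array([0.8]), np.array([0.9])]
target = np.array([1.0])

# A noisy, shallow client trains against only the first two layers;
# a higher-depth client uses all four.
shallow_loss = layerwise_loss(outputs[:2], target)
deep_loss = layerwise_loss(outputs, target)
```

Because every prefix of the model carries its own loss term, shallow and deep clients optimize compatible objectives, which is what lets heterogeneous hardware participate in the same federated round.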
Similar Papers
Towards Heterogeneous Quantum Federated Learning: Challenges and Solutions
Quantum Physics
Makes quantum computers learn together better.
Multi-Layer Hierarchical Federated Learning with Quantization
Machine Learning (CS)
Lets many computers learn together better.
Sporadic Federated Learning Approach in Quantum Environment to Tackle Quantum Noise
Quantum Physics
Fixes noisy quantum computers for better AI.