QAGT-MLP: An Attention-Based Graph Transformer for Small and Large-Scale Quantum Error Mitigation
By: Seyed Mohamad Ali Tousi, G. N. DeSouza
Potential Business Impact:
Makes quantum computers give more accurate answers by correcting for hardware noise.
Noisy quantum devices demand error-mitigation techniques that are accurate yet simple and efficient in terms of shot count and processing time. Many established approaches (e.g., extrapolation and quasi-probability cancellation) impose substantial execution or calibration overheads, while existing learning-based methods have difficulty scaling to large and deep circuits. In this research, we introduce QAGT-MLP: an attention-based graph transformer tailored for small- and large-scale quantum error mitigation (QEM). QAGT-MLP encodes each quantum circuit as a graph whose nodes represent gate instances and whose edges capture qubit connectivity and causal adjacency. A dual-path attention module extracts features around measured qubits at two scales: 1) graph-wide global structural context; and 2) fine-grained local lightcone context. These learned representations are concatenated with circuit-level descriptor features and the circuit's noisy expectation values, then passed to a lightweight MLP that predicts the noise-mitigated values. On large-scale 100-qubit Trotterized 1D Transverse-Field Ising Model (TFIM) circuits, the proposed QAGT-MLP outperformed state-of-the-art learning baselines in both mean error and error variability under matched shot budgets, demonstrating strong validity and applicability in real-world QEM scenarios. By using attention to fuse global structure with local lightcone neighborhoods, QAGT-MLP achieves high mitigation quality without the noise-scaling or resource overhead required by classical QEM pipelines, offering a scalable and practical path to QEM in modern and future quantum workloads.
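The pipeline described above — gate-instance graph, a global attention pass and a local lightcone-restricted pass, then an MLP over the fused features plus the noisy expectation value — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the gate list, feature dimensions, masked single-head attention, random weights, and the undirected lightcone traversal are all simplifying assumptions made for illustration.

```python
import numpy as np

def circuit_to_graph(gates):
    """Nodes = gate instances; edges = causal adjacency along shared qubits."""
    n = len(gates)
    adj = np.zeros((n, n))
    last_on_qubit = {}  # qubit index -> last gate that acted on it
    for i, (_name, qubits) in enumerate(gates):
        for q in qubits:
            if q in last_on_qubit:          # edge between consecutive gates on a wire
                j = last_on_qubit[q]
                adj[i, j] = adj[j, i] = 1.0
            last_on_qubit[q] = i
    return adj

def lightcone_mask(adj, measured_nodes):
    """Gates connected (here: undirected reachability, a simplification)
    to the measured gates -- the local context."""
    in_cone = np.zeros(adj.shape[0], dtype=bool)
    stack = list(measured_nodes)
    while stack:
        i = stack.pop()
        if not in_cone[i]:
            in_cone[i] = True
            stack.extend(np.nonzero(adj[i])[0].tolist())
    return in_cone

def attention(Q, K, V, mask=None):
    """Single-head scaled dot-product attention; mask hides non-lightcone keys."""
    scores = Q @ K.T / np.sqrt(K.shape[1])
    if mask is not None:
        scores = np.where(mask[None, :], scores, -1e9)
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ V

rng = np.random.default_rng(0)
# Toy circuit: (gate name, qubits acted on). Purely illustrative.
gates = [("h", (0,)), ("cx", (0, 1)), ("rz", (1,)), ("cx", (1, 2)), ("x", (3,))]
adj = circuit_to_graph(gates)
X = rng.normal(size=(len(gates), 8))            # node features (illustrative)

cone = lightcone_mask(adj, measured_nodes=[3])  # gates feeding the measurement
g_ctx = attention(X, X, X).mean(axis=0)                 # global path
l_ctx = attention(X, X, X, mask=cone).mean(axis=0)      # local lightcone path

noisy_ev = np.array([0.42])                     # noisy expectation value (made up)
feats = np.concatenate([g_ctx, l_ctx, noisy_ev])

# Lightweight MLP head with random (untrained) weights, for shape only.
W1 = rng.normal(size=(feats.size, 16))
W2 = rng.normal(size=(16, 1))
mitigated = np.tanh(feats @ W1) @ W2            # predicted mitigated value
```

In this toy circuit the isolated `x` gate on qubit 3 falls outside the lightcone of the measured gate, so the local path attends only to the four causally connected gates while the global path sees all five — the two contexts the dual-path module is meant to separate.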
Similar Papers
Quantum Graph Attention Network: A Novel Quantum Multi-Head Attention Mechanism for Graph Learning
Machine Learning (CS)
Makes computers learn from messy data faster.