End-to-End Analysis of Charge Stability Diagrams with Transformers
By: Rahul Marchand, Lucas Schorling, Cornelius Carlsson, and more
Potential Business Impact:
Speeds up the tuning and calibration of quantum dot devices, helping spin-based quantum computers scale.
Transformer models and end-to-end learning frameworks are rapidly revolutionizing the field of artificial intelligence. In this work, we apply object detection transformers to analyze charge stability diagrams in semiconductor quantum dot arrays, a key task for achieving scalability with spin-based quantum computing. Specifically, our model identifies triple points and their connectivity, which is crucial for virtual gate calibration, charge state initialization, drift correction, and pulse sequencing. We show that it surpasses convolutional neural networks in performance on three different spin qubit architectures, all without the need for retraining. In contrast to existing approaches, our method significantly reduces complexity and runtime, while enhancing generalizability. The results highlight the potential of transformer-based end-to-end learning frameworks as a foundation for a scalable, device- and architecture-agnostic tool for control and tuning of quantum dot devices.
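As a rough illustration of the kind of pipeline the abstract describes, the sketch below runs an off-the-shelf object detection transformer over a charge stability diagram rendered as an image. This is not the authors' model: the Hugging Face DETR checkpoint `facebook/detr-resnet-50`, the random placeholder diagram, and the 0.7 confidence threshold are all illustrative assumptions. The paper's trained model instead outputs triple points and their connectivity directly.

```python
# Minimal sketch (not the authors' code): DETR-style object detection applied
# to a charge stability diagram treated as an image. Model, data, and
# threshold are illustrative assumptions.
import numpy as np
import torch
from PIL import Image
from transformers import DetrImageProcessor, DetrForObjectDetection

# Placeholder measurement: sensor signal vs. two plunger-gate voltages.
csd = np.random.rand(128, 128)

# Render the diagram as an RGB image the detector can consume.
scaled = 255 * (csd - csd.min()) / (csd.max() - csd.min() + 1e-9)
img = Image.fromarray(scaled.astype(np.uint8)).convert("RGB")

# Generic pretrained DETR backbone as a stand-in for the paper's model.
processor = DetrImageProcessor.from_pretrained("facebook/detr-resnet-50")
model = DetrForObjectDetection.from_pretrained("facebook/detr-resnet-50")

with torch.no_grad():
    inputs = processor(images=img, return_tensors="pt")
    outputs = model(**inputs)

# Keep detections above a confidence threshold; in the paper's setting these
# would be triple points, with connectivity predicted between them.
results = processor.post_process_object_detection(
    outputs, threshold=0.7, target_sizes=torch.tensor([img.size[::-1]])
)[0]
for score, box in zip(results["scores"], results["boxes"]):
    print(f"candidate triple point at {box.tolist()} (score {score:.2f})")
```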
Similar Papers
Towards autonomous time-calibration of large quantum-dot devices: Detection, real-time feedback, and noise spectroscopy
Mesoscale and Nanoscale Physics
Automatically detects and corrects drift to keep large quantum-dot devices calibrated over time.
Benchmarking machine learning models for multi-class state recognition in double quantum dot data
Computer Vision and Pattern Recognition
Teaches computers to read quantum computer "fingerprints."
Topological Order in Deep State
Mesoscale and Nanoscale Physics
Uses AI to identify states of matter with topological order.