Quantum State Preparation via Large-Language-Model-Driven Evolution
By: Qing-Hong Cao, Zong-Yue Hou, Ying-Ying Li, and more
Potential Business Impact:
Automatically designs simpler quantum circuits for simulating physical systems.
We propose an automated framework for quantum circuit design that integrates large language models (LLMs) with evolutionary optimization to overcome the rigidity, limited scalability, and expert dependence of traditional ansatz construction in variational quantum algorithms. Our approach (FunSearch) autonomously discovers hardware-efficient ansätze entirely from scratch, with built-in scalability and a number of variational parameters that is independent of system size. Demonstrations on the Ising and XY spin chains with n = 9 qubits yield circuits containing 4 parameters that achieve near-exact energy extrapolation across system sizes. Implementations on quantum hardware (the Zuchongzhi chip) validate practicality: two-qubit gate noise can be effectively mitigated via zero-noise extrapolation for spin chains as large as 20 sites. This framework bridges algorithmic design and experimental constraints, complementing contemporary quantum architecture search frameworks to advance scalable quantum simulations.
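A key claim of the abstract is a fixed, system-size-independent parameter count combined with near-exact variational energies. Below is a minimal, self-contained Python sketch of that idea: a QAOA-style ansatz for the transverse-field Ising chain in which four angles are shared across all sites, so the same circuit template runs unchanged at any chain length. The specific gate pattern, the small chain length, and the use of COBYLA are illustrative assumptions made here for readability; the actual circuits in the paper are discovered from scratch by the LLM-driven evolutionary search (FunSearch), not written by hand.

```python
# Illustrative sketch only: a 4-parameter, size-independent variational ansatz
# for the transverse-field Ising chain, evaluated by exact statevector math.
# This is NOT the circuit found in the paper; the layer pattern below is a
# generic QAOA-like assumption used to demonstrate parameter sharing.

import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)


def embed(op, site, n):
    """Embed a single-qubit operator acting on `site` into the n-qubit space."""
    out = np.array([[1.0 + 0j]])
    for i in range(n):
        out = np.kron(out, op if i == site else I2)
    return out


def ising_hamiltonian(n, g=1.0):
    """Transverse-field Ising chain: H = -sum_i Z_i Z_{i+1} - g * sum_i X_i."""
    H = np.zeros((2**n, 2**n), dtype=complex)
    for i in range(n - 1):
        H -= embed(Z, i, n) @ embed(Z, i + 1, n)
    for i in range(n):
        H -= g * embed(X, i, n)
    return H


def ansatz_state(params, n):
    """Alternating layers of collective ZZ and X rotations with shared angles.

    len(params) = 4 regardless of n: each layer reuses one angle for all ZZ
    terms and one angle for all X terms, so the parameter count does not grow
    with system size (the property highlighted in the abstract).
    """
    zz = sum(embed(Z, i, n) @ embed(Z, i + 1, n) for i in range(n - 1))
    xs = sum(embed(X, i, n) for i in range(n))
    psi = np.full(2**n, 1 / np.sqrt(2**n), dtype=complex)  # start from |+...+>
    for layer in range(len(params) // 2):
        theta_zz, theta_x = params[2 * layer], params[2 * layer + 1]
        psi = expm(-1j * theta_zz * zz) @ psi
        psi = expm(-1j * theta_x * xs) @ psi
    return psi


def energy(params, H, n):
    """Variational energy <psi(params)| H |psi(params)>."""
    psi = ansatz_state(params, n)
    return float(np.real(psi.conj() @ H @ psi))


if __name__ == "__main__":
    n = 6  # kept small so dense matrix exponentials stay cheap
    H = ising_hamiltonian(n)
    exact = np.linalg.eigvalsh(H)[0]
    result = minimize(energy, x0=np.full(4, 0.1), args=(H, n), method="COBYLA")
    print(f"4-parameter ansatz energy: {result.fun:.4f}   exact: {exact:.4f}")
```

Because the four angles do not depend on n, parameters optimized at a small size can be reused and the energy extrapolated to larger chains, which is the scalability property the abstract emphasizes.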
Similar Papers
Automating quantum feature map design via large language models
Quantum Physics
Uses AI to design quantum feature maps for machine learning.
Agent-Q: Fine-Tuning Large Language Models for Quantum Circuit Generation and Optimization
Quantum Physics
Makes computers design quantum computer programs.
From Understanding to Excelling: Template-Free Algorithm Design through Structural-Functional Co-Evolution
Software Engineering
Evolves algorithm designs automatically, without human-written templates.