LLM-QUBO: An End-to-End Framework for Automated QUBO Transformation from Natural Language Problem Descriptions

Published: August 27, 2025 | arXiv ID: 2509.00099v1

By: Huixiang Zhang, Mahzabeen Emu, Salimur Choudhury

Potential Business Impact:

Automates the translation of plain-language optimization problems into quantum-ready QUBO form, lowering the barrier to using quantum annealers as accelerators for large-scale combinatorial problems.

Business Areas:
Quantum Computing, Science and Engineering

Quantum annealing offers a promising paradigm for solving NP-hard combinatorial optimization problems, but its practical application is severely hindered by two challenges: the complex, manual process of translating problem descriptions into the requisite Quadratic Unconstrained Binary Optimization (QUBO) format and the scalability limitations of current quantum hardware. To address these obstacles, we propose a novel end-to-end framework, LLM-QUBO, that automates the entire formulation-to-solution pipeline. Our system leverages a Large Language Model (LLM) to parse natural language, automatically generating a structured mathematical representation. To overcome hardware limitations, we integrate a hybrid quantum-classical Benders' decomposition method. This approach partitions the problem, compiling the combinatorially complex master problem into a compact QUBO format while delegating linearly structured sub-problems to classical solvers. The correctness of the generated QUBO and the scalability of the hybrid approach are validated using classical solvers, establishing a robust performance baseline and demonstrating the framework's readiness for quantum hardware. Our primary contribution is a synergistic computing paradigm that bridges classical AI and quantum computing, addressing key challenges in the practical application of optimization problems. This automated workflow significantly reduces the barrier to entry, providing a viable pathway to transform quantum devices into accessible accelerators for large-scale, real-world optimization challenges.
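To make the QUBO step concrete, the following minimal Python sketch (not taken from the paper; all names and the toy problem are illustrative assumptions) encodes a small constrained selection problem as a QUBO by folding the constraint into a quadratic penalty, then validates the result by brute-force enumeration, the kind of classical baseline the abstract describes.

```python
# Hypothetical sketch: encode a toy "pick exactly k of n items" problem as a QUBO
# via a quadratic penalty, then validate by brute-force enumeration
# (a stand-in for the classical baseline solver; not the paper's implementation).
import itertools
import numpy as np

def build_qubo(values, k, penalty):
    """Return Q for: maximize sum(v_i * x_i) s.t. sum(x_i) = k, as min x^T Q x."""
    n = len(values)
    Q = np.zeros((n, n))
    # Objective: maximizing values -> minimize their negatives on the diagonal.
    for i in range(n):
        Q[i, i] -= values[i]
    # Constraint sum(x) = k via penalty * (sum(x) - k)^2, expanded using x_i^2 = x_i
    # (the constant penalty * k^2 term is dropped; it does not affect the argmin).
    for i in range(n):
        Q[i, i] += penalty * (1 - 2 * k)
        for j in range(i + 1, n):
            Q[i, j] += 2 * penalty
    return Q

def brute_force_minimum(Q):
    """Exhaustively enumerate all binary vectors (classical validation baseline)."""
    n = Q.shape[0]
    best_x, best_energy = None, float("inf")
    for bits in itertools.product([0, 1], repeat=n):
        x = np.array(bits)
        energy = x @ Q @ x
        if energy < best_energy:
            best_x, best_energy = x, energy
    return best_x, best_energy

values = [5.0, 3.0, 4.0, 1.0]
Q = build_qubo(values, k=2, penalty=10.0)
x, energy = brute_force_minimum(Q)
print("selection:", x, "energy:", energy)  # expect items 0 and 2 selected
```

In the full framework described above, the LLM would produce the structured problem data (values, constraints) from a natural-language description, the penalty-based QUBO would be handed to a quantum annealer rather than enumerated, and Benders' decomposition would keep the QUBO master problem small by offloading the linear sub-problems to classical solvers.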

Country of Origin
🇨🇦 Canada

Page Count
8 pages

Category
Computer Science:
Machine Learning (CS)