Preconditioning Natural and Second Order Gradient Descent in Quantum Optimization: A Performance Benchmark

Published: April 23, 2025 | arXiv ID: 2504.16518v1

By: Théo Lisart-Liebermann, Arcesio Castañeda Medina

Potential Business Impact:

Makes the optimization of quantum algorithms faster and more robust to hardware noise, improving the practicality of quantum-classical applications.

Business Areas:
Quantum Computing Science and Engineering

The optimization of parametric quantum circuits is technically hindered by three major obstacles: the non-convex nature of the objective function, noisy gradient evaluations, and the presence of barren plateaus. As a result, the selection of the classical optimizer becomes a critical factor in assessing and exploiting quantum-classical applications. One promising approach to tackle these challenges involves incorporating curvature information into the parameter update. The most prominent methods in this field are quasi-Newton and quantum natural gradient methods, which can facilitate faster convergence compared to first-order approaches. Second-order methods, however, exhibit a significant trade-off between computational cost and accuracy, as well as heightened sensitivity to noise. This study evaluates the performance of three families of optimizers on synthetically generated MaxCut problems using a shallow QAOA circuit. To address noise sensitivity and iteration cost, we demonstrate that incorporating secant penalization into the BFGS update rule (SP-BFGS) yields improved outcomes for QAOA optimization problems, introducing a novel approach to stabilizing BFGS updates against gradient noise.
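The key idea behind secant-penalized BFGS is that a penalty parameter controls how strongly each noisy secant pair is allowed to reshape the inverse-Hessian approximation. The sketch below illustrates that idea in simplified form; it is not the authors' implementation. It blends the standard BFGS update with "no update" through a hypothetical weight omega in [0, 1), so that beta → ∞ recovers plain BFGS while beta → 0 ignores the (noisy) secant pair entirely; the exact SP-BFGS formulas differ and are given in the paper.

```python
# Minimal sketch of a noise-robust, BFGS-style inverse-Hessian update.
# NOT the authors' exact SP-BFGS method: it only reproduces the limiting
# behaviour described for secant-penalized updates, interpolating between
# standard BFGS (beta -> infinity) and leaving H unchanged (beta -> 0).
# The weight function `omega` below is a hypothetical choice for illustration.
import numpy as np

def sp_bfgs_like_update(H, s, y, beta):
    """Blend the standard BFGS inverse-Hessian update with the identity
    update, down-weighting secant pairs that are unreliable due to noise.

    H    : current inverse-Hessian approximation (n x n)
    s    : parameter step, x_{k+1} - x_k
    y    : gradient difference, g_{k+1} - g_k (noisy in practice)
    beta : penalty strength; large beta trusts the secant pair fully
    """
    sy = float(s @ y)
    if sy <= 1e-12:  # skip pairs that would break positive definiteness
        return H
    rho = 1.0 / sy
    # Hypothetical interpolation weight in [0, 1): grows with beta * s^T y,
    # so noisy, nearly orthogonal (s, y) pairs barely change H.
    omega = beta * sy / (beta * sy + 1.0)
    I = np.eye(H.shape[0])
    V = I - omega * rho * np.outer(s, y)
    return V @ H @ V.T + omega * rho * np.outer(s, s)

# Usage on a toy quadratic with artificially noisy gradients:
rng = np.random.default_rng(0)
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x + 0.05 * rng.normal(size=2)  # noisy gradient oracle

x, H = np.array([5.0, 5.0]), np.eye(2)
g = grad(x)
for _ in range(50):
    x_new = x - 0.5 * (H @ g)
    g_new = grad(x_new)
    H = sp_bfgs_like_update(H, x_new - x, g_new - g, beta=10.0)
    x, g = x_new, g_new
print(x)  # approaches the minimum at the origin despite the gradient noise
```

Skipping pairs with non-positive s^T y, together with omega < 1, keeps the approximation positive definite, which is what makes this family of updates usable with the noisy gradients produced by quantum hardware or shot-based simulators.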

Page Count
43 pages

Category
Computer Science:
Computational Engineering, Finance, and Science