Information-Theoretic Constraints on Variational Quantum Optimization: Efficiency Transitions and the Dynamical Lie Algebra
By: Jun Liang Tan
Potential Business Impact:
Makes quantum computers learn better by fixing information flow.
Variational quantum algorithms are leading candidates for near-term advantage, yet their scalability is fundamentally limited by the ``Barren Plateau'' phenomenon. While traditionally attributed to geometric concentration of measure, I propose an information-theoretic origin: a bandwidth bottleneck in the optimization feedback loop. By modeling the optimizer as a coherent Maxwell's Demon, I derive a thermodynamic constitutive relation, $\Delta E \leq \eta\, I(S{:}A)$, where work extraction is strictly bounded by the mutual information established via entanglement. I demonstrate that systems with polynomial Dynamical Lie Algebra (DLA) dimension exhibit ``Information Superconductivity'' (sustained $\eta > 0$), whereas systems with exponential DLA dimension undergo an efficiency collapse when the rate of information scrambling exceeds the ancilla's channel capacity. These results reframe quantum trainability as a thermodynamic phase transition governed by the stability of information flow.
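The quantity doing the work in the bound $\Delta E \leq \eta\, I(S{:}A)$ is the quantum mutual information between system and ancilla. As a toy numerical illustration (my own sketch, not code from the paper; function names are mine, and $\eta$ and $k_BT$ are implicitly set to 1), the snippet below computes $I(S{:}A) = S(\rho_S) + S(\rho_A) - S(\rho_{SA})$ for a maximally entangled system-ancilla pair, where the bound is loosest:

```python
import numpy as np

def entropy_bits(rho):
    """Von Neumann entropy S(rho) in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

def mutual_information(rho_sa, dim_s, dim_a):
    """I(S:A) = S(rho_S) + S(rho_A) - S(rho_SA) for a bipartite density matrix."""
    rho = rho_sa.reshape(dim_s, dim_a, dim_s, dim_a)
    rho_s = np.trace(rho, axis1=1, axis2=3)   # partial trace over ancilla
    rho_a = np.trace(rho, axis1=0, axis2=2)   # partial trace over system
    return entropy_bits(rho_s) + entropy_bits(rho_a) - entropy_bits(rho_sa)

# Maximally entangled qubit pair: I(S:A) = 2 bits, so extractable work
# per feedback step is capped at 2 * eta * kT * ln 2 under the relation above.
bell = np.zeros(4, dtype=complex)
bell[0] = bell[3] = 1 / np.sqrt(2)
rho_bell = np.outer(bell, bell.conj())
print(mutual_information(rho_bell, 2, 2))   # approx. 2 bits
```

A product state gives $I(S{:}A) = 0$ and hence zero extractable work, which is the information-bottleneck picture of the plateau in miniature.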
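The polynomial-versus-exponential DLA dichotomy can be made concrete numerically. The sketch below (my own illustration under stated assumptions, not the paper's code) computes the DLA dimension of a small generator set by closing $\{iH_k\}$ under commutators and counting linearly independent directions; two-qubit transverse-field Ising generators close on a small subalgebra of $\mathfrak{su}(4)$, while a generic symmetry-breaking term inflates the closure:

```python
import numpy as np
from itertools import product

# Pauli matrices
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def dla_dimension(generators):
    """Dimension of the Lie algebra generated by {i*H_k} under commutators."""
    full = generators[0].shape[0] ** 2     # dim of u(d): hard upper bound
    basis = []                              # flattened, normalized basis vectors

    def try_add(mat):
        v = mat.flatten()
        n = np.linalg.norm(v)
        if n < 1e-9:
            return False
        v = v / n
        if basis:
            stacked = np.vstack(basis + [v])
            if np.linalg.matrix_rank(stacked, tol=1e-8) == len(basis):
                return False                # already in the span
        basis.append(v)
        return True

    mats = []
    for g in generators:
        if try_add(1j * g):
            mats.append(1j * g)
    changed = True
    while changed and len(basis) < full:
        changed = False
        for a, b in product(list(mats), repeat=2):
            c = a @ b - b @ a
            if try_add(c):
                mats.append(c)
                changed = True
    return len(basis)

# Transverse-field Ising generators on 2 qubits (field terms lumped together):
# the closure is a 4-dimensional subalgebra of su(4) -- the "polynomial" regime.
zz = np.kron(Z, Z)
x_sum = np.kron(X, I2) + np.kron(I2, X)
print(dla_dimension([zz, x_sum]))

# A generic traceless Hermitian term breaks the symmetry and pushes the
# closure toward all of su(4) -- the regime where scrambling wins.
rng = np.random.default_rng(0)
h = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
h = (h + h.conj().T) / 2
h = h - np.trace(h) / 4 * np.eye(4)
print(dla_dimension([zz, x_sum, h]))
```

Brute-force closure like this is only feasible for a handful of qubits, since the ambient space grows as $4^n$; it is meant to make the dichotomy tangible, not to scale.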