Inexact Projected Preconditioned Gradient Methods with Variable Metrics: General Convergence Theory via Lyapunov Approach

Published: June 4, 2025 | arXiv ID: 2506.03671v1

By: Ruchi Guo, Jun Zou

Potential Business Impact:

Speeds up the numerical solution of large constrained optimization problems, such as those arising from PDE-based scientific simulations.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Projected gradient methods are widely used for constrained optimization. A key application is to partial differential equations (PDEs), where the objective functional represents a physical energy and linear constraints enforce conservation laws. However, computing projections onto the constraint set generally requires solving large-scale, ill-conditioned linear systems. A common strategy is to relax the projection accuracy and apply preconditioners, which leads to the inexact preconditioned projected gradient descent (IPPGD) methods studied here. The theoretical analysis and dynamic behavior of IPPGD methods, along with an effective construction of the inexact projection operator itself, remain largely unexplored. We propose a strategy for constructing the inexact projection operator and develop a gradient-type flow to model IPPGD methods. Discretizing this flow not only recovers the original IPPGD method but also yields a novel, potentially faster method. Furthermore, we apply Lyapunov analysis, based on a carefully designed Lyapunov function, to prove exponential convergence at the continuous level and linear convergence at the discrete level. We then apply the proposed method to solve nonlinear PDEs and present numerical results.
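To make the setting concrete, here is a minimal illustrative sketch (not the authors' algorithm) of the core IPPGD idea: gradient steps interleaved with an inexact projection onto a linear constraint set {x : Ax = b}, where the projection's normal-equation solve is carried out approximately by a preconditioned conjugate gradient (CG) iteration. The function names (`pcg`, `ippgd`), the Jacobi preconditioner, and the inner tolerances are all assumptions made for this toy example; the paper's actual operator construction and flow model are more refined.

```python
import numpy as np

def pcg(apply_M, rhs, precond, tol=1e-8, max_iter=50):
    """Preconditioned conjugate gradient for M x = rhs, M symmetric
    positive definite. Stopping early makes the outer projection inexact."""
    x = np.zeros_like(rhs)
    r = rhs - apply_M(x)
    z = precond(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Mp = apply_M(p)
        alpha = rz / (p @ Mp)
        x += alpha * p
        r -= alpha * Mp
        z = precond(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

def ippgd(grad_f, x0, A, b, step, n_steps, inner_tol=1e-6, inner_iter=20):
    """Toy inexact projected gradient method for min f(x) s.t. A x = b.
    The exact projection of y is y - A^T (A A^T)^{-1} (A y - b); here the
    (A A^T)-solve is replaced by a few preconditioned CG iterations."""
    AAT = A @ A.T
    jacobi = 1.0 / np.diag(AAT)          # Jacobi preconditioner (assumed choice)

    def inexact_proj(y):
        lam = pcg(lambda v: AAT @ v, A @ y - b,
                  lambda r: jacobi * r, tol=inner_tol, max_iter=inner_iter)
        return y - A.T @ lam

    x = inexact_proj(x0)
    for _ in range(n_steps):
        x = inexact_proj(x - step * grad_f(x))
    return x
```

For example, minimizing f(x) = ||x - c||^2 / 2 subject to sum(x) = 1 converges geometrically to the Euclidean projection of c onto that hyperplane; in the paper's PDE setting, A would instead encode discretized conservation-law constraints and the inner solve would be genuinely large-scale.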

Country of Origin
🇨🇳 China

Page Count
24 pages

Category
Mathematics:
Optimization and Control