A preconditioned difference of convex functions algorithm with extrapolation and line search
By: Ran Zhang, Hongpeng Sun
Potential Business Impact:
Solves hard non-convex optimization problems faster and more accurately.
This paper proposes a novel proximal difference-of-convex (DC) algorithm enhanced with extrapolation and an aggressive non-monotone line search for solving non-convex optimization problems. We introduce an adaptive, conservative update strategy for the extrapolation parameter, determined by a computationally efficient non-monotone line search. The core of our algorithm is to couple the update of the extrapolation parameter interactively with the step size of the non-monotone line search. The global convergence of the two proposed algorithms is established through the Kurdyka-Łojasiewicz property, ensuring convergence within a preconditioned framework for the involved linear equations. Numerical experiments on two general non-convex problems, SCAD-penalized binary classification and graph-based Ginzburg-Landau image segmentation, demonstrate the proposed method's high efficiency compared to existing DC algorithms in both convergence rate and solution accuracy.
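To make the interplay the abstract describes concrete, below is a minimal Python sketch of a proximal DC step with extrapolation, where a non-monotone backtracking search conservatively shrinks the extrapolation parameter. It uses the toy DC model F(x) = 0.5||Ax-b||^2 + lam*||x||_1 - 0.5*mu*||x||^2, not the paper's SCAD or Ginzburg-Landau problems; the window size, shrink factor, and acceptance condition are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def dca_extrapolation_ls(A, b, lam, mu, max_iter=500, tol=1e-8):
    """Sketch: proximal DC algorithm with extrapolation and a non-monotone
    line search on the extrapolation parameter, for the toy DC objective
    F(x) = 0.5||Ax-b||^2 + lam*||x||_1 - 0.5*mu*||x||^2 (g - h with
    h(x) = 0.5*mu*||x||^2). Hypothetical parameter choices throughout."""
    m, n = A.shape
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of grad of smooth part
    t = 1.0 / L                              # proximal step size
    x_prev = x = np.zeros(n)
    beta = 0.5                               # initial extrapolation parameter
    F = lambda z: (0.5 * np.sum((A @ z - b) ** 2)
                   + lam * np.abs(z).sum() - 0.5 * mu * np.sum(z ** 2))
    hist = [F(x)]
    window = 5                               # non-monotone reference window (assumed)
    for k in range(max_iter):
        xi = mu * x                          # subgradient of h at x
        ref = max(hist[-window:])            # non-monotone reference value
        beta_k = beta
        # Non-monotone line search: conservatively shrink beta_k until the
        # new objective stays below the max over the last `window` values.
        while True:
            y = x + beta_k * (x - x_prev)    # extrapolated point
            grad = A.T @ (A @ y - b)         # gradient of the smooth convex part
            x_new = soft_threshold(y - t * (grad - xi), t * lam)
            if F(x_new) <= ref - 1e-4 * np.sum((x_new - y) ** 2) or beta_k < 1e-12:
                break
            beta_k *= 0.5                    # conservative shrink of extrapolation
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            x = x_new
            break
        x_prev, x = x, x_new
        hist.append(F(x))
        beta = min(beta_k * 1.2, 0.99)       # cautiously re-enlarge beta after success
    return x
```

A toy call might look like `x = dca_extrapolation_ls(A, b, lam=0.1, mu=0.05)` on a random least-squares instance; the cautious re-enlargement of `beta` after an accepted step is meant to mirror, in spirit, the adaptive conservative update of the extrapolation parameter described in the abstract.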
Similar Papers
A preconditioned third-order implicit-explicit algorithm with a difference of varying convex functions and extrapolation
Optimization and Control
Solves hard math problems much faster.
Tight Convergence Rates in Gradient Mapping for the Difference-of-Convex Algorithm
Optimization and Control
Improves how computers solve tricky math problems.