Randomized coordinate gradient descent almost surely escapes strict saddle points

Published: August 11, 2025 | arXiv ID: 2508.07535v1

By: Ziang Chen, Yingzhou Li, Zihao Li

Potential Business Impact:

Shows that a cheap, widely used optimization method reliably avoids getting stuck at saddle points ("dead ends") in nonconvex problems.

We analyze the behavior of randomized coordinate gradient descent for nonconvex optimization, proving that under standard assumptions, the iterates almost surely escape strict saddle points. By formulating the method as a nonlinear random dynamical system and characterizing neighborhoods of critical points, we establish this result through the center-stable manifold theorem.
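To make the setting concrete, here is a minimal illustrative sketch of randomized coordinate gradient descent (not the paper's exact algorithm or assumptions) on a toy nonconvex objective with a strict saddle at the origin: f(x, y) = x⁴/4 − x²/2 + y²/2, whose minima lie at (±1, 0). The objective, step size, and iteration count are chosen here purely for illustration.

```python
import numpy as np

def f(z):
    # Toy nonconvex objective with a strict saddle at (0, 0)
    # and minima at (+1, 0) and (-1, 0).
    x, y = z
    return 0.25 * x**4 - 0.5 * x**2 + 0.5 * y**2

def grad(z):
    x, y = z
    return np.array([x**3 - x, y])

def randomized_cgd(z0, step=0.1, iters=5000, seed=0):
    # Randomized coordinate gradient descent: at each iteration,
    # pick one coordinate uniformly at random and take a gradient
    # step in that coordinate only.
    rng = np.random.default_rng(seed)
    z = np.array(z0, dtype=float)
    for _ in range(iters):
        i = rng.integers(len(z))
        z[i] -= step * grad(z)[i]
    return z

# Starting from a small perturbation of the strict saddle (0, 0),
# the iterates move away from it and approach one of the minima.
z_final = randomized_cgd([1e-6, 1e-6])
```

In this sketch the random coordinate choices, together with the repelling direction of the saddle, push the iterates toward a minimizer; the paper's contribution is a proof that such escape happens almost surely under standard assumptions.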

Page Count
23 pages

Category
Mathematics:
Optimization and Control