Score: 1

Global Convergence Analysis of Vanilla Gradient Descent for Asymmetric Matrix Completion

Published: August 13, 2025 | arXiv ID: 2508.09685v1

By: Xu Zhang, Shuo Chen, Jinsheng Li, and more

BigTech Affiliations: Huawei

Potential Business Impact:

Fills in missing entries of large data matrices more quickly, using a simpler, cheaper gradient descent algorithm.

This paper investigates the asymmetric low-rank matrix completion problem, which can be formulated as an unconstrained non-convex optimization problem with a nonlinear least-squares objective and solved via gradient descent. Previous gradient descent approaches typically incorporate regularization terms into the objective function to guarantee convergence. However, both numerical experiments and a theoretical analysis of the gradient flow demonstrate that eliminating the regularization terms from gradient descent algorithms does not adversely affect convergence. Using the leave-one-out technique, we inductively prove that vanilla gradient descent with spectral initialization achieves a linear convergence rate with high probability. Moreover, we show that the balancing regularization term retains a small norm throughout the iterations, revealing the implicit regularization property of gradient descent. Empirical results show that our algorithm has a lower computational cost than other gradient descent algorithms while maintaining comparable completion performance.
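
For concreteness, here is a minimal NumPy sketch of the setup the abstract describes: gradient descent on the plain nonlinear least-squares objective with spectral initialization and no balancing regularization term. The step-size rule, iteration count, and all function names are illustrative assumptions for this sketch, not constants or APIs from the paper.

```python
import numpy as np

def spectral_init(M_obs, mask, rank):
    """Spectral initialization: top-r SVD of the rescaled observed matrix."""
    p = mask.mean()  # empirical sampling rate
    U, s, Vt = np.linalg.svd(M_obs / p, full_matrices=False)
    d = np.sqrt(s[:rank])
    return U[:, :rank] * d, Vt[:rank, :].T * d

def vanilla_gd_completion(M_obs, mask, rank, eta=0.3, iters=1000):
    """Unregularized (vanilla) gradient descent on
    f(L, R) = (1 / 2p) * ||P_Omega(L @ R.T - M)||_F^2,
    with no balancing term ||L.T @ L - R.T @ R||_F^2 in the objective."""
    p = mask.mean()
    L, R = spectral_init(M_obs, mask, rank)
    # Heuristic step size eta / sigma_1, with sigma_1 estimated from the data.
    step = eta / np.linalg.norm(M_obs / p, 2)
    for _ in range(iters):
        residual = mask * (L @ R.T - M_obs) / p  # rescaled P_Omega residual
        # Simultaneous update of both factors from the same residual.
        L, R = L - step * (residual @ R), R - step * (residual.T @ L)
    return L, R

# Toy check: random rank-3 matrix with ~30% of entries observed.
rng = np.random.default_rng(0)
n1, n2, r = 100, 120, 3
M = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))
mask = (rng.random((n1, n2)) < 0.3).astype(float)
L, R = vanilla_gd_completion(mask * M, mask, r)
print(np.linalg.norm(L @ R.T - M) / np.linalg.norm(M))  # relative error
```

The implicit-regularization claim in the abstract is that the balancing quantity ||LᵀL − RᵀR||_F stays small along these iterates even though it never appears in the update, which is why the explicit regularization term can be dropped without hurting convergence.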

Country of Origin
🇨🇳 China

Page Count
19 pages

Category
Computer Science:
Machine Learning (CS)