Global Convergence Analysis of Vanilla Gradient Descent for Asymmetric Matrix Completion
By: Xu Zhang, Shuo Chen, Jinsheng Li, and more
Potential Business Impact:
Makes computers fill in missing data faster.
This paper investigates the asymmetric low-rank matrix completion problem, formulated as an unconstrained non-convex optimization problem with a nonlinear least-squares objective and solved via gradient descent. Previous gradient descent approaches typically add regularization terms to the objective function to guarantee convergence. However, both numerical experiments and a theoretical analysis of the gradient flow show that removing these regularization terms does not harm convergence. Using the leave-one-out technique, we prove by induction that vanilla gradient descent with spectral initialization achieves a linear convergence rate with high probability. In addition, we show that the balancing regularization term remains small in norm throughout the iterations, which reveals the implicit regularization property of gradient descent. Empirical results show that our algorithm has a lower computational cost than other gradient descent algorithms while achieving comparable completion performance.
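To make the setup concrete, below is a minimal Python/NumPy sketch (not the authors' code) of vanilla gradient descent with spectral initialization on the standard factorized least-squares objective f(X, Y) = 0.5 * ||P_Omega(X Y^T - M)||_F^2. The synthetic low-rank matrix, uniform sampling mask, rank r, sampling rate p, and step size eta are all illustrative assumptions, not values from the paper; the loop also tracks the balancing term ||X^T X - Y^T Y||_F mentioned in the abstract.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic rank-r ground truth M = A B^T and a random observation mask Omega (assumed setup).
n1, n2, r, p = 200, 150, 5, 0.3
A, B = rng.standard_normal((n1, r)), rng.standard_normal((n2, r))
M = A @ B.T
Omega = rng.random((n1, n2)) < p            # observed entries

# Spectral initialization: top-r SVD of the rescaled observed matrix (1/p) * P_Omega(M).
U, s, Vt = np.linalg.svd((Omega * M) / p, full_matrices=False)
X = U[:, :r] * np.sqrt(s[:r])
Y = Vt[:r, :].T * np.sqrt(s[:r])

eta = 0.2 / s[0]                            # illustrative step size scaled by the top singular value
for t in range(500):
    R = Omega * (X @ Y.T - M)               # residual on the observed entries only
    # Vanilla (unregularized) simultaneous gradient updates for X and Y.
    X, Y = X - eta * R @ Y, Y - eta * R.T @ X
    if t % 100 == 0:
        rel_err = np.linalg.norm(X @ Y.T - M) / np.linalg.norm(M)
        balance = np.linalg.norm(X.T @ X - Y.T @ Y)   # balancing term stays small along the iterates
        print(f"iter {t:4d}  rel_err {rel_err:.3e}  balance {balance:.3e}")

In this sketch the only difference from regularized variants is that no balancing penalty such as ||X^T X - Y^T Y||_F^2 is added to the objective; the printed balance value illustrates the implicit regularization behavior the abstract describes.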
Similar Papers
Stochastic Gradient Descent for Incomplete Tensor Linear Systems
Numerical Analysis
Fixes computer problems with missing information.
Global Convergence of Four-Layer Matrix Factorization under Random Initialization
Optimization and Control
Makes deep computer learning work better.