Stabilization of the Gradient Method for Solving Linear Algebraic Systems -- A Method Related to the Normal Equation
By: Ibrahima Dione
Potential Business Impact:
Makes computers solve systems of equations much faster.
Although it is relatively easy to apply, the gradient method often exhibits a disappointingly slow rate of convergence. Its convergence depends strongly on the structure of the matrix of the linear algebraic system and on the choice of the stepsize defining each new iterate. We propose here a simple and robust stabilization of the gradient method that no longer requires any structure on the matrix (neither symmetry nor positive definiteness) to converge, and that no longer requires a strategy for choosing the stepsize. We establish the global convergence of the proposed stabilized algorithm under the sole assumption that the matrix is nonsingular. Several numerical examples illustrating its performance are presented, covering small- and large-scale linear systems, structured and unstructured matrices, and well- and ill-conditioned matrices.
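The paper's exact stabilization scheme is not given in the abstract, but the title points to the normal equation. A minimal sketch of the classical idea, assuming the standard steepest-descent iteration applied to the normal equations $A^\top A x = A^\top b$ (whose matrix is symmetric positive definite whenever $A$ is nonsingular, so no structure is assumed on $A$ itself, and the stepsize comes from an exact line search rather than a tuning choice):

```python
import numpy as np

def gradient_normal_equations(A, b, tol=1e-10, max_iter=100_000):
    """Steepest descent applied to the normal equations A^T A x = A^T b.

    A^T A is symmetric positive definite for any nonsingular A, so the
    iteration converges without assuming symmetry or definiteness of A.
    This is an illustrative sketch, not the paper's stabilized algorithm.
    """
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        r = A.T @ (b - A @ x)        # negative gradient of 0.5 * ||Ax - b||^2
        if np.linalg.norm(r) < tol:
            break
        Ar = A @ r
        alpha = (r @ r) / (Ar @ Ar)  # exact line-search stepsize, no tuning
        x = x + alpha * r
    return x

# Example on a small nonsymmetric system (hypothetical test data)
A = np.array([[3.0, 1.0], [-1.0, 2.0]])
b = np.array([5.0, 3.0])
x = gradient_normal_equations(A, b)  # approaches the solution [1.0, 2.0]
```

Note that forming the normal equations squares the condition number of the system, which is precisely why a slow, conditioning-sensitive method benefits from the kind of stabilization the paper investigates.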
Similar Papers
A stochastic column-block gradient descent method for solving nonlinear systems of equations
Numerical Analysis
Solves hard math problems faster than before.
On the convergence of two-step modified Newton method for nonsymmetric algebraic Riccati equations from transport theory
Numerical Analysis
Speeds up computer calculations for nuclear reactors.
A Smoothing Newton Method for Rank-one Matrix Recovery
Machine Learning (Stat)
Fixes a computer math trick for better results.