First and Second Order Approximations to Stochastic Gradient Descent Methods with Momentum Terms

Published: April 18, 2025 | arXiv ID: 2504.13992v1

By: Eric Lu

Potential Business Impact:

Could make machine learning training faster and better understood by allowing step sizes and momentum to change over time.

Business Areas:
A/B Testing, Data and Analytics

Stochastic Gradient Descent (SGD) methods see many uses in optimization problems. Modifications to the algorithm, such as momentum-based SGD methods, have been known to produce better results in certain cases. Much of this, however, rests on empirical evidence rather than rigorous proof. While the dynamics of gradient descent methods can be studied through continuous approximations, existing works only cover scenarios with constant learning rates or SGD without momentum terms. We present approximation results under weak assumptions for SGD that allow learning rates and momentum parameters to vary with respect to time.
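To make the setting concrete, the sketch below shows the kind of iteration the abstract refers to: SGD with a momentum (heavy-ball) term in which both the learning rate and the momentum parameter are allowed to depend on the iteration count. The function names, the schedules, and the toy objective are illustrative assumptions, not the paper's construction; the paper's contribution is the continuous-time approximation of such iterations, which is not reproduced here.

```python
import numpy as np

def momentum_sgd(grad, x0, steps, lr_schedule, momentum_schedule, rng=None):
    """Minimal sketch of momentum SGD with time-varying hyperparameters.

    grad(x, rng) returns a stochastic gradient estimate at x.
    lr_schedule(k) and momentum_schedule(k) are hypothetical callables
    giving the step size eta_k and momentum parameter mu_k at iteration k.
    """
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)  # momentum (velocity) buffer
    for k in range(steps):
        g = grad(x, rng)                                    # stochastic gradient sample
        v = momentum_schedule(k) * v - lr_schedule(k) * g   # heavy-ball update
        x = x + v
    return x

# Toy usage: noisy gradients of f(x) = 0.5 * ||x||^2 with a decaying step size.
if __name__ == "__main__":
    noisy_grad = lambda x, rng: x + 0.1 * rng.standard_normal(x.shape)
    x_final = momentum_sgd(
        noisy_grad,
        x0=np.ones(5),
        steps=1000,
        lr_schedule=lambda k: 0.1 / (1 + 0.01 * k),   # time-varying learning rate
        momentum_schedule=lambda k: 0.9,              # momentum could also vary with k
    )
    print(x_final)
```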

Page Count
28 pages

Category
Computer Science:
Machine Learning (CS)