Score: 1

Gradient Methods with Online Scaling Part II. Practical Aspects

Published: September 13, 2025 | arXiv ID: 2509.11007v1

By: Ya-Chi Chu, Wenzhi Gao, Yinyu Ye, and more

BigTech Affiliations: Stanford University

Potential Business Impact:

Makes machine learning faster while using less memory.

Business Areas:
A/B Testing, Data and Analytics

Part I of this work [Gao25] establishes online scaled gradient methods (OSGM), a framework that uses online convex optimization to adapt stepsizes in gradient methods. This paper focuses on the practical aspects of OSGM. We leverage the OSGM framework to design new adaptive first-order methods and provide insights into their empirical behavior. The resulting method, OSGM-Best, matches the performance of quasi-Newton variants while using less memory and performing cheaper iterations. We also extend OSGM to nonconvex optimization and outline directions that connect OSGM to existing branches of optimization theory and practice.
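The abstract gives no pseudocode, so the following is a minimal Python sketch of the general idea, adapting a stepsize online from observed gradients, using a generic hypergradient-style rule. The function name, the update rule, and all parameter values are illustrative assumptions; this is not the paper's OSGM-Best algorithm.

```python
import numpy as np

def gd_online_stepsize(grad, x0, eta0=1e-3, beta=1e-3, iters=500):
    """Gradient descent with a scalar stepsize adapted online.

    A hypergradient-style sketch: after each step, nudge the stepsize
    in the direction that would have decreased the latest objective
    value. Illustrates the general idea of learning stepsizes online;
    it is not the OSGM-Best method from the paper.
    """
    x, eta = np.asarray(x0, dtype=float), eta0
    g_prev = grad(x)
    for _ in range(iters):
        x = x - eta * g_prev                 # step with current stepsize
        g = grad(x)
        # d/d(eta) of f(x - eta * g_prev) at the new point is -<g, g_prev>;
        # descend on it, normalized so each update stays within +/- beta.
        corr = np.dot(g, g_prev) / (
            np.linalg.norm(g) * np.linalg.norm(g_prev) + 1e-12
        )
        eta = max(eta + beta * corr, 1e-12)  # keep the stepsize positive
        g_prev = g
    return x, eta

# Demo on a badly scaled quadratic f(x) = 0.5 * sum(d_i * x_i^2),
# where a single hand-tuned fixed stepsize is hard to get right.
d = np.array([1.0, 10.0, 100.0])
x, eta = gd_online_stepsize(lambda x: d * x, np.ones(3))
print("solution:", x, "learned stepsize:", eta)
```

The normalization keeps each stepsize update bounded, which is one simple way to stabilize the online adjustment; OSGM's actual guarantees come from an online convex optimization analysis, per the abstract.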

Country of Origin
🇺🇸 United States

Page Count
40 pages

Category
Mathematics: Optimization and Control