Comparing BFGS and OGR for Second-Order Optimization

Published: December 7, 2025 | arXiv ID: 2512.06969v1

By: Adrian Przybysz, Mikołaj Kołek, Franciszek Sobota, and more

Potential Business Impact:

Helps computers learn faster, even with tricky problems.

Business Areas:
A/B Testing, Data and Analytics

Estimating the Hessian matrix, especially for neural network training, is a challenging problem due to high dimensionality and cost. In this work, we compare the classical Sherman-Morrison update used in the popular BFGS method (Broyden-Fletcher-Goldfarb-Shanno), which maintains a positive definite Hessian approximation under a convexity assumption, with a novel approach called Online Gradient Regression (OGR). OGR performs regression of gradients against positions using an exponential moving average to estimate second derivatives online, without requiring Hessian inversion. Unlike BFGS, OGR allows estimation of a general (not necessarily positive definite) Hessian and can thus handle non-convex structures. We evaluate both methods across standard test functions and demonstrate that OGR achieves faster convergence and a lower final loss, particularly in non-convex settings.
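
The abstract only sketches the two update rules, so the Python snippet below contrasts them. The BFGS inverse-Hessian update is the standard textbook formula; the EMAGradientRegression class is a minimal sketch of the OGR idea as described above (regressing gradient changes on position changes under an exponential moving average), and its names, the beta parameter, and the exact regression form are assumptions rather than the authors' implementation.

```python
import numpy as np

def bfgs_inverse_update(H_inv, s, y):
    """Standard BFGS update of the inverse Hessian approximation.

    s = x_{k+1} - x_k (step), y = g_{k+1} - g_k (gradient change).
    The update is skipped when the curvature condition y^T s > 0 fails,
    which is how BFGS keeps its approximation positive definite.
    """
    sy = float(y @ s)
    if sy <= 1e-12:  # curvature condition violated (e.g., non-convex region)
        return H_inv
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H_inv @ V.T + rho * np.outer(s, s)

class EMAGradientRegression:
    """Illustrative online estimator in the spirit of OGR (a sketch under
    assumptions; the paper's exact update may differ). Gradient changes y
    are regressed on position changes s with exponential-moving-average
    weighting, yielding a Hessian estimate that need not be positive
    definite and requires no explicit Hessian inversion during updates."""

    def __init__(self, dim, beta=0.9, eps=1e-8):
        self.beta = beta
        self.eps = eps
        self.A = np.zeros((dim, dim))  # EMA of y s^T
        self.B = np.zeros((dim, dim))  # EMA of s s^T

    def update(self, s, y):
        # Accumulate the sufficient statistics of the regression y ~ H s.
        self.A = self.beta * self.A + (1 - self.beta) * np.outer(y, s)
        self.B = self.beta * self.B + (1 - self.beta) * np.outer(s, s)

    def hessian_estimate(self):
        # Least-squares solution of y ~ H s over the EMA-weighted history.
        dim = self.B.shape[0]
        return self.A @ np.linalg.inv(self.B + self.eps * np.eye(dim))
```

The key contrast is visible in the code: BFGS discards steps that break the convexity (curvature) condition, whereas the regression-style estimator accepts every observed (s, y) pair and so can represent indefinite curvature in non-convex regions.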

Page Count
8 pages

Category
Computer Science:
Machine Learning (CS)