Nonlinear discretizations and Newton's method: characterizing stationary points of regression objectives

Published: October 13, 2025 | arXiv ID: 2510.11987v1

By: Conor Rowan

Potential Business Impact:

Shows when exact second-order (Newton) optimizers fail in neural network training, guiding the design of faster, more reliable training methods.

Business Areas:
Artificial Intelligence, Science and Engineering

Second-order methods are emerging as promising alternatives to standard first-order optimizers such as gradient descent and Adam for training neural networks. Though the advantages of including curvature information in computing optimization steps have been celebrated in the scientific machine learning literature, the only second-order methods that have been studied are quasi-Newton, meaning that the Hessian matrix of the objective function is approximated. Though one would expect only to gain from using the true Hessian in place of its approximation, we show that neural network training reliably fails when relying on exact curvature information. The failure modes provide insight both into the geometry of nonlinear discretizations and into the distribution of stationary points in the loss landscape, leading us to question the conventional wisdom that the loss landscape is replete with local minima.
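To make the contrast concrete, the following is a minimal sketch (not the paper's code) of a single exact-Newton step on a small regression problem. The one-hidden-layer tanh network, sine-wave target, and damping value are illustrative assumptions; the point is that JAX can form the true dense Hessian the abstract refers to, rather than a quasi-Newton approximation.

```python
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
# Tiny one-hidden-layer network: scalar input -> scalar output (illustrative).
params = {"w1": jax.random.normal(k1, (1, 8)), "b1": jnp.zeros(8),
          "w2": jax.random.normal(k2, (8, 1)), "b2": jnp.zeros(1)}
flat, unravel = ravel_pytree(params)

x = jnp.linspace(-1.0, 1.0, 32).reshape(-1, 1)
y = jnp.sin(3.0 * x)  # toy regression target (assumption, not from the paper)

def loss(theta):
    p = unravel(theta)
    h = jnp.tanh(x @ p["w1"] + p["b1"])
    pred = h @ p["w2"] + p["b2"]
    return jnp.mean((pred - y) ** 2)

g = jax.grad(loss)(flat)     # exact gradient
H = jax.hessian(loss)(flat)  # exact (dense) Hessian -- no approximation

# Newton step: solve H dp = -g. Away from a minimum the exact Hessian is
# typically indefinite, so the undamped step can be attracted to saddle
# points or maxima -- the kind of failure the abstract describes. A small
# Levenberg-Marquardt-style damping is a standard safeguard.
damping = 1e-4
dp = jnp.linalg.solve(H + damping * jnp.eye(H.shape[0]), -g)
flat = flat + dp
```

A quasi-Newton method such as BFGS would instead build a positive-definite approximation of H from gradient differences, which sidesteps the indefiniteness issue at the cost of discarding the exact curvature this paper investigates.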

Country of Origin
🇺🇸 United States

Page Count
23 pages

Category
Computer Science:
Machine Learning (CS)