A hybrid stochastic Newton algorithm for logistic regression
By: Bernard Bercu, Luis Fredes, Eméric Gbaguidi
Potential Business Impact:
Teaches computers to learn from data faster.
In this paper, we investigate a second-order stochastic algorithm for solving large-scale binary classification problems. We propose a new hybrid stochastic Newton algorithm whose Hessian matrix estimate combines two weighted components: the first coming from the natural Hessian estimate and the second from the stochastic gradient information. Our motivation comes from the fact that both components, evaluated at the true parameter of the logistic regression model, are equal to the Hessian matrix. This new formulation has several advantages; in particular, it enables us to prove the almost sure convergence of our stochastic algorithm to the true parameter. Moreover, we significantly improve the almost sure rate of convergence of the estimate to the Hessian matrix. Furthermore, we establish a central limit theorem for our hybrid stochastic Newton algorithm. Finally, we show a surprising result on the almost sure convergence of the cumulative excess risk.
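The idea described in the abstract can be sketched in code. For logistic regression with probability p = sigmoid(x'θ) and label y ∈ {0,1}, the per-sample Hessian term is p(1-p)xx', while the outer product of the stochastic gradient contributes (y-p)²xx'; at the true parameter both have the same expectation, so a weighted mix of the two is a valid Hessian estimate. The sketch below is a minimal illustration under assumed details not given in the abstract (mixing weight `lam`, a ridge term for invertibility, and a plain sum-of-rank-one Hessian recursion); it is not the authors' exact algorithm.

```python
import numpy as np

def sigmoid(z):
    # numerically safe logistic function
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def hybrid_stochastic_newton(X, y, lam=0.5, n_epochs=5, ridge=1.0):
    """Illustrative hybrid stochastic Newton iteration for logistic regression.

    The running Hessian estimate H accumulates, for each sample, a weighted
    mix of the natural Hessian term p(1-p) x x' and the squared-gradient
    term (y - p)^2 x x'; both share the same expectation at the true
    parameter. `lam` and `ridge` are assumed hyperparameters, not taken
    from the paper.
    """
    n, d = X.shape
    theta = np.zeros(d)
    H = ridge * np.eye(d)  # regularized running sum of rank-one terms
    for _ in range(n_epochs):
        for i in range(n):
            x = X[i]
            p = sigmoid(x @ theta)
            g = (p - y[i]) * x  # stochastic gradient of the log-loss
            # hybrid scalar: weighted mix of the two Hessian components
            h_scale = lam * p * (1.0 - p) + (1.0 - lam) * (y[i] - p) ** 2
            H += h_scale * np.outer(x, x)
            # Newton-type step; H grows like the step count, so the
            # effective step size decays automatically
            theta -= np.linalg.solve(H, g)
    return theta
```

On synthetic data drawn from a logistic model, the iterate drifts toward the generating parameter, illustrating the almost sure convergence the abstract establishes rigorously.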
Similar Papers
Stochastic Gradients under Nuisances
Machine Learning (Stat)
Teaches computers to learn even with tricky, hidden info.
Non-Asymptotic Optimization and Generalization Bounds for Stochastic Gauss-Newton in Overparameterized Models
Machine Learning (CS)
Makes AI learn better by understanding its mistakes.
Nonlinear discretizations and Newton's method: characterizing stationary points of regression objectives
Machine Learning (CS)
Makes AI learn faster by using better math.