The Root Finding Problem Revisited: Beyond the Robbins-Monro procedure
By: Yue Yu, Moulinath Banerjee, Ya'acov Ritov
Potential Business Impact:
Finds the root of a noisy function faster, even in hard regimes where classical methods slow down.
We introduce Sequential Probability Ratio Bisection (SPRB), a novel stochastic approximation algorithm that adapts to the local behavior of the (regression) function of interest around its root. We establish theoretical guarantees for SPRB's asymptotic performance, showing that it achieves the optimal convergence rate and minimal asymptotic variance even when the target function's derivative at the root is small (at most half the step size), a regime where the classical Robbins-Monro procedure typically suffers reduced convergence rates. Further, we show that if the regression function is discontinuous at the root, Robbins-Monro converges at a rate of $1/n$ whilst SPRB attains exponential convergence. If the regression function has vanishing first-order derivative, SPRB attains a faster rate of convergence compared to stochastic approximation. As part of our analysis, we derive a nonasymptotic bound on the expected sample size and establish a generalized Central Limit Theorem under random stopping times. Remarkably, SPRB automatically provides nonasymptotic time-uniform confidence sequences that do not explicitly require knowledge of the convergence rate. We demonstrate the practical effectiveness of SPRB through simulation results.
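The abstract contrasts classical Robbins-Monro stochastic approximation with a bisection scheme driven by sequential probability ratio tests. The sketch below is purely illustrative and is not the authors' SPRB algorithm: it pairs a textbook Robbins-Monro iteration with a simplified SPRT-style bisection that, at each midpoint, accumulates noisy sign observations until a random-walk statistic crosses a likelihood-ratio threshold, then keeps the half-interval believed to contain the root. Function names, the noise model, and all parameters (`noise`, `alpha`, `depth`) are assumptions for the example.

```python
import math
import random

def noisy_obs(f, x, noise=0.5):
    # one noisy observation of f(x); its sign hints at which side of the root x is on
    return f(x) + random.gauss(0.0, noise)

def robbins_monro(f, x0, n=10000, a=1.0):
    # classical Robbins-Monro: x_{k+1} = x_k - (a/k) * Y_k, with Y_k a noisy f(x_k)
    x = x0
    for k in range(1, n + 1):
        x -= (a / k) * noisy_obs(f, x)
    return x

def sprt_bisection(f, lo, hi, depth=30, alpha=0.01):
    # illustrative SPRT-style bisection (NOT the paper's exact SPRB procedure):
    # at each midpoint, a +/-1 random walk of observed signs must cross a
    # threshold of log((1 - alpha)/alpha) before we commit to a half-interval.
    # Assumes f is increasing on [lo, hi] with a single root inside.
    thresh = math.log((1.0 - alpha) / alpha)
    for _ in range(depth):
        mid = 0.5 * (lo + hi)
        s = 0.0
        while abs(s) < thresh:
            s += 1.0 if noisy_obs(f, mid) > 0 else -1.0
        if s > 0:        # evidence that f(mid) > 0: root lies to the left
            hi = mid
        else:            # evidence that f(mid) < 0: root lies to the right
            lo = mid
    return 0.5 * (lo + hi)
```

The point of the contrast: Robbins-Monro spends a fixed step budget per iteration, while the sequential test spends more samples only at midpoints where the sign of `f` is genuinely ambiguous, which is the intuition behind SPRB's adaptivity to local behavior around the root.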