Improved Central Limit Theorem and Bootstrap Approximations for Linear Stochastic Approximation
By: Bogdan Butyrin, Eric Moulines, Alexey Naumov, and more
Potential Business Impact:
Gives machine-learning algorithms sharper accuracy guarantees and more reliable uncertainty estimates from the same amount of data.
In this paper, we refine the Berry-Esseen bounds for the multivariate normal approximation of the Polyak-Ruppert averaged iterates of the linear stochastic approximation (LSA) algorithm with decreasing step size. We consider approximation by the Gaussian distribution with the covariance matrix predicted by the Polyak-Juditsky central limit theorem and establish a rate of order up to $n^{-1/3}$ in convex distance, where $n$ is the number of samples used in the algorithm. We also prove the non-asymptotic validity of the multiplier bootstrap procedure for approximating the distribution of the rescaled error of the averaged LSA estimator, establishing approximation rates of order up to $n^{-1/2}$ for the latter distribution, which significantly improves upon the previous results obtained by Samsonov et al. (2024).
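To make the two objects in the abstract concrete, below is a minimal numerical sketch of Polyak-Ruppert averaged LSA with a decreasing step size, together with a multiplier bootstrap built around it. Everything in the sketch is an illustrative assumption rather than the paper's construction: the toy observation model $(A_k, b_k)$, the step schedule $\gamma_k = \gamma_0 k^{-\alpha}$, the exponential choice of multipliers (mean 1, variance 1), all constants, and the function name `lsa_polyak_ruppert`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (illustrative assumption): recover theta_star solving
# A_bar @ theta = b_bar from noisy observations (A_k, b_k) of (A_bar, b_bar).
d, n = 3, 5_000
M = rng.standard_normal((d, d))
A_bar = M @ M.T + d * np.eye(d)          # positive definite, so the LSA recursion is stable
theta_star = rng.standard_normal(d)
b_bar = A_bar @ theta_star

def lsa_polyak_ruppert(weights=None, gamma0=0.05, alpha=2/3, data_seed=1):
    """LSA with decreasing step gamma_k = gamma0 * k**(-alpha), returning the
    Polyak-Ruppert average of the iterates. `weights` holds the bootstrap
    multipliers; None gives the plain (unweighted) run. The data stream is
    fixed by `data_seed`, so bootstrap runs reuse the same observations."""
    data_rng = np.random.default_rng(data_seed)
    theta = np.zeros(d)
    avg = np.zeros(d)
    for k in range(1, n + 1):
        A_k = A_bar + 0.1 * data_rng.standard_normal((d, d))  # noisy view of A_bar
        b_k = b_bar + 0.1 * data_rng.standard_normal(d)       # noisy view of b_bar
        w_k = 1.0 if weights is None else weights[k - 1]
        theta -= gamma0 * k ** (-alpha) * w_k * (A_k @ theta - b_k)
        avg += (theta - avg) / k                              # running average
    return avg

theta_hat = lsa_polyak_ruppert()

# Multiplier bootstrap: rerun the same recursion on the same data with i.i.d.
# multipliers of mean 1 and variance 1 (exponential(1) is one admissible choice).
# Conditionally on the data, sqrt(n) * (theta_boot - theta_hat) mimics the law
# of sqrt(n) * (theta_hat - theta_star).
B = 100
boot = np.array([lsa_polyak_ruppert(weights=rng.exponential(1.0, n)) for _ in range(B)])
root = np.sqrt(n) * (boot - theta_hat)                        # bootstrap roots, shape (B, d)
lo = theta_hat - np.quantile(root, 0.975, axis=0) / np.sqrt(n)
hi = theta_hat - np.quantile(root, 0.025, axis=0) / np.sqrt(n)
print("coordinatewise 95% CIs cover theta_star:", np.all((lo <= theta_star) & (theta_star <= hi)))
```

Exponential multipliers keep every perturbed step nonnegative-weighted, which makes the small-scale sketch numerically tame; Gaussian $\mathcal{N}(1,1)$ multipliers also satisfy the mean-1, variance-1 requirement but can flip the sign of individual steps.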
Similar Papers
Gaussian Approximation for Two-Timescale Linear Stochastic Approximation
Machine Learning (Stat)
Makes computer learning more accurate and faster.
High-Order Error Bounds for Markovian LSA with Richardson-Romberg Extrapolation
Machine Learning (Stat)
Makes computer learning more accurate and faster.
Statistical Inference for Linear Functionals of Online Least-squares SGD when $t \gtrsim d^{1+\delta}$
Machine Learning (CS)
Makes computer learning more accurate with less data.