Improved Central Limit Theorem and Bootstrap Approximations for Linear Stochastic Approximation

Published: October 14, 2025 | arXiv ID: 2510.12375v1

By: Bogdan Butyrin, Eric Moulines, Alexey Naumov, and more

Potential Business Impact:

Provides tighter finite-sample guarantees and a validated bootstrap procedure for averaged stochastic approximation estimators, improving the reliability of uncertainty quantification in online learning and A/B testing.

Business Areas:
A/B Testing Data and Analytics

In this paper, we refine the Berry-Esseen bounds for the multivariate normal approximation of Polyak-Ruppert averaged iterates arising from the linear stochastic approximation (LSA) algorithm with decreasing step size. We consider the normal approximation by the Gaussian distribution with the covariance matrix predicted by the Polyak-Juditsky central limit theorem and establish a rate of order up to $n^{-1/3}$ in convex distance, where $n$ is the number of samples used in the algorithm. We also prove the non-asymptotic validity of the multiplier bootstrap procedure for approximating the distribution of the rescaled error of the averaged LSA estimator. We establish approximation rates of order up to $1/\sqrt{n}$ for the latter distribution, which significantly improves upon the previous results obtained by Samsonov et al. (2024).
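The setting above can be illustrated with a minimal numerical sketch: run the LSA recursion $\theta_k = \theta_{k-1} - \gamma_k (A_k \theta_{k-1} - b_k)$ with a decreasing step size, form the Polyak-Ruppert average, and then approximate the distribution of the rescaled error with a multiplier bootstrap that reruns the same recursion on the same data with random weights. This is not the paper's construction or its constants; the system matrix, noise level, step-size exponent, and Exp(1) multipliers are all illustrative assumptions.

```python
import numpy as np

d = 2
A_bar = np.array([[2.0, 0.2], [0.2, 1.5]])   # mean system matrix (positive definite); an assumption
theta_star = np.array([1.0, -0.5])           # target: solution of A_bar @ theta = b_bar
b_bar = A_bar @ theta_star

def lsa_polyak_ruppert(n, weights=None, data_seed=0):
    """LSA recursion theta_k = theta_{k-1} - gamma_k * w_k * (A_k @ theta_{k-1} - b_k)
    with polynomially decreasing step gamma_k; returns the Polyak-Ruppert average."""
    rng = np.random.default_rng(data_seed)   # fixed seed: the bootstrap reuses the same data
    theta = np.zeros(d)
    avg = np.zeros(d)
    for k in range(1, n + 1):
        A_k = A_bar + 0.1 * rng.standard_normal((d, d))  # noisy observation of A_bar
        b_k = b_bar + 0.1 * rng.standard_normal(d)       # noisy observation of b_bar
        gamma_k = 0.5 / (k + 10) ** 0.7                  # decreasing step size (illustrative)
        w_k = 1.0 if weights is None else weights[k - 1] # bootstrap multiplier weight
        theta = theta - gamma_k * w_k * (A_k @ theta - b_k)
        avg += (theta - avg) / k                         # running average of the iterates
    return avg

n = 5000
theta_bar = lsa_polyak_ruppert(n)

# Multiplier bootstrap: keep the data fixed, rerun the recursion with i.i.d.
# Exp(1) weights (mean 1, variance 1), and use sqrt(n)*(theta_bar^b - theta_bar)
# as a proxy for the law of the rescaled error sqrt(n)*(theta_bar - theta_star).
w_rng = np.random.default_rng(1)
boot = np.array([
    np.sqrt(n) * (lsa_polyak_ruppert(n, weights=w_rng.exponential(1.0, n)) - theta_bar)
    for _ in range(50)
])
# Bootstrap half-width of a 95% confidence interval for the first coordinate.
ci_half_width = np.quantile(np.abs(boot[:, 0]), 0.95) / np.sqrt(n)
```

The key design point mirrored from the paper's setting is that the bootstrap perturbs only the update weights, not the observed data, so the bootstrap distribution can be computed from a single data stream.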

Country of Origin
🇷🇺 🇫🇷 Russian Federation, France

Page Count
50 pages

Category
Statistics: Machine Learning (stat.ML)