Early-Stopped Mirror Descent for Linear Regression over Convex Bodies
By: Tobias Wegel, Gil Kur, Patrick Rebeschini
Potential Business Impact:
Shows that stopping a simple iterative training method early can match the accuracy of explicitly constrained least squares, which is often far more expensive to compute.
Early-stopped iterative optimization methods are widely used as alternatives to explicit regularization, and direct comparisons between early stopping and explicit regularization have been established for many optimization geometries. However, most analyses depend heavily on specific properties of the optimization geometry or on strong convexity of the empirical objective, and it remains unclear whether early stopping could ever be less statistically efficient than explicit regularization for some particular shape constraint, especially in the overparameterized regime. To address this question, we study high-dimensional linear regression under additive Gaussian noise, where the ground truth is assumed to lie in a known convex body and the task is to minimize the in-sample mean squared error. Our main result shows that, for any convex body and any design matrix, the worst-case risk of unconstrained early-stopped mirror descent with an appropriate potential is, up to an absolute constant factor, at most that of the least squares estimator constrained to the convex body. We achieve this by constructing algorithmic regularizers based on the Minkowski functional of the convex body.
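For intuition, the sketch below runs unconstrained mirror descent on the least squares objective and stops early, using the squared l_p norm as the potential. This is one concrete instance of the squared-gauge idea: when the convex body K is the unit l_p ball, its Minkowski functional gamma_K(x) = inf{t > 0 : x in tK} is exactly the l_p norm, so the potential phi(w) = (1/2) * gamma_K(w)^2 reduces to the classical p-norm mirror map. Everything here (the function name, the choice p = 1.5, the synthetic data, and the oracle stopping rule) is illustrative and is not the paper's general construction for arbitrary convex bodies.

```python
import numpy as np

def mirror_descent_lp(X, y, p=1.5, step=0.01, n_iters=500):
    """Unconstrained mirror descent for least squares with the potential
    phi(w) = (1/2) * ||w||_p^2, i.e. the squared Minkowski functional of
    the unit l_p ball (an illustrative special case, not the paper's
    general method). Returns the whole iterate path so a stopping time
    can be chosen afterwards.
    """
    q = p / (p - 1.0)                 # dual exponent, 1/p + 1/q = 1
    n, d = X.shape
    w = np.zeros(d)                   # primal iterate
    theta = np.zeros(d)               # dual iterate, theta = grad phi(w)
    path = []
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y) / n  # gradient of the in-sample MSE / 2
        theta -= step * grad          # gradient step taken in dual space
        # mirror back: w = grad phi*(theta), with phi*(t) = (1/2)||t||_q^2
        norm_q = np.linalg.norm(theta, q)
        if norm_q > 0:
            w = norm_q ** (2 - q) * np.sign(theta) * np.abs(theta) ** (q - 1)
        else:
            w = np.zeros(d)
        path.append(w.copy())
    return path

# Usage sketch on synthetic data (all values hypothetical).
rng = np.random.default_rng(0)
n, d = 50, 200                        # overparameterized: d > n
w_star = np.zeros(d)
w_star[:5] = 1.0                      # sparse ground truth with a small gauge
X = rng.standard_normal((n, d))
y = X @ w_star + 0.1 * rng.standard_normal(n)

path = mirror_descent_lp(X, y, p=1.5)
# In-sample prediction risk along the path; an oracle stopping time is used
# purely for illustration (in practice one would stop via a holdout set).
risks = [np.mean((X @ (w - w_star)) ** 2) for w in path]
t_stop = int(np.argmin(risks))
print(f"oracle stopping time: {t_stop}, risk: {risks[t_stop]:.4f}")
```

The design choice mirrored from the abstract is that the potential is built from the gauge of the constraint set, while the iterates themselves are never projected onto K: any regularization comes only from the geometry of the mirror map and the choice of stopping time.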
Similar Papers
The Hidden Cost of Approximation in Online Mirror Descent
Machine Learning (CS)
Analyzes how approximation errors in the updates affect online mirror descent.
Convex Regression with a Penalty
Machine Learning (Stat)
Studies penalized estimation for convex (shape-constrained) regression.
Convergence Rates for Gradient Descent on the Edge of Stability in Overparametrised Least Squares
Machine Learning (CS)
Analyzes how fast gradient descent converges in the edge-of-stability regime for overparameterized least squares.