Admissibility Breakdown in High-Dimensional Sparse Regression with L1 Regularization
By: Guo Liu
The choice of the tuning parameter in the Lasso is central to its statistical performance in high-dimensional linear regression. Classical consistency theory identifies the rate at which the Lasso tuning parameter should scale, and numerous studies have established non-asymptotic guarantees. Nevertheless, the question of optimal tuning within a non-asymptotic framework has not yet been fully resolved. We establish tuning criteria above which the Lasso becomes inadmissible under mean squared prediction error. More specifically, we derive thresholds showing that certain classical tuning choices yield Lasso estimators that are strictly dominated by a simple Lasso-Ridge refinement. We also examine how the structure of the design matrix and the noise vector influences this inadmissibility phenomenon.
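The abstract does not spell out the Lasso-Ridge refinement; the sketch below shows one common two-stage variant of such a refinement (Lasso for support selection at a classical-scale tuning parameter, then a ridge refit restricted to the selected columns). The data, tuning constants, and the ridge penalty are illustrative assumptions, not the paper's construction.

```python
# Hypothetical two-stage Lasso-Ridge refinement: Lasso selects a support,
# then ridge refits on the selected columns to reduce shrinkage bias.
# All constants below (signal size, noise level, penalties) are assumptions.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
n, p, s = 100, 200, 5                      # high-dimensional: p > n, sparse truth
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 3.0                             # s active coefficients
y = X @ beta + 0.5 * rng.standard_normal(n)

# Stage 1: Lasso with a classical-scale tuning parameter ~ sqrt(log(p)/n).
lam = 0.5 * np.sqrt(np.log(p) / n)
lasso = Lasso(alpha=lam, fit_intercept=False).fit(X, y)
support = np.flatnonzero(lasso.coef_)

# Stage 2: lightly penalized ridge refit on the Lasso-selected support.
ridge = Ridge(alpha=1e-2, fit_intercept=False).fit(X[:, support], y)
beta_refit = np.zeros(p)
beta_refit[support] = ridge.coef_

# Compare in-sample prediction error: the refit undoes part of the
# Lasso's shrinkage bias on the selected coordinates.
mse_lasso = np.mean((y - lasso.predict(X)) ** 2)
mse_refit = np.mean((y - X @ beta_refit) ** 2)
```

On this synthetic design the refit step typically lowers the in-sample prediction error relative to the plain Lasso, which is the kind of dominance behavior the tuning thresholds in the abstract formalize.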