Improved $\ell_{p}$ Regression via Iteratively Reweighted Least Squares
By: Alina Ene, Ta Duy Nguyen, Adrian Vladu
Potential Business Impact:
Solves a common type of computer math problem much faster.
We introduce fast algorithms for solving $\ell_{p}$ regression problems using the iteratively reweighted least squares (IRLS) method. Our approach achieves state-of-the-art iteration complexity, outperforming the IRLS algorithm by Adil-Peng-Sachdeva (NeurIPS 2019) and matching the theoretical bounds established by the complex algorithm of Adil-Kyng-Peng-Sachdeva (SODA 2019, J. ACM 2024) via a simpler, lightweight iterative scheme. This bridges the gap between theoretical and practical algorithms for $\ell_{p}$ regression. Our algorithms depart from prior approaches: they use a primal-dual framework in which the update rule is derived naturally from an invariant maintained for the dual objective. Empirically, we show that our algorithms significantly outperform both the IRLS algorithm by Adil-Peng-Sachdeva and MATLAB/CVX implementations.
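For readers unfamiliar with IRLS, the sketch below shows the textbook form of the method for $\min_x \|Ax - b\|_p$ with $p \ge 2$: each iteration solves a weighted least-squares problem whose weights come from the current residuals. This is a generic illustration only, not the primal-dual scheme of the paper; the function name `irls_lp_regression` and the regularization parameter `eps` are hypothetical choices for this example.

```python
import numpy as np

def irls_lp_regression(A, b, p=4, num_iters=50, eps=1e-8):
    """Textbook IRLS sketch for min_x ||Ax - b||_p with p >= 2.

    NOTE: illustrative only; this is NOT the primal-dual algorithm
    from the paper. Each step reweights by |r_i|^(p-2) and solves a
    weighted least-squares problem; eps keeps weights bounded away
    from zero for numerical stability.
    """
    # Start from the ordinary least-squares solution (the p = 2 case).
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(num_iters):
        r = A @ x - b
        # Reweighting: w_i = |r_i|^(p-2), clipped away from zero.
        w = np.maximum(np.abs(r), eps) ** (p - 2)
        # Solve the weighted normal equations A^T W A x = A^T W b.
        WA = A * w[:, None]
        x = np.linalg.solve(A.T @ WA, WA.T @ b)
    return x

# Usage: approximate ell_4 regression on random data.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 10))
b = rng.standard_normal(200)
x4 = irls_lp_regression(A, b, p=4)
print(np.linalg.norm(A @ x4 - b, ord=4))
```

Plain IRLS of this form is known to be fast in practice but can lack convergence guarantees; the contribution of the paper is an IRLS-style scheme with provable state-of-the-art iteration complexity.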
Similar Papers
A Polynomial-time Algorithm for Online Sparse Linear Regression with Improved Regret Bound under Weaker Conditions
Machine Learning (CS)
Helps computers learn with less information.
Scalable approximation of the transformation-free linear simplicial-simplicial regression via constrained iterative reweighted least squares
Methodology
Speeds up math for analyzing parts of a whole.
Faster estimation of the transformation-free linear simplicial-simplicial regression via constrained iterative reweighted least squares
Methodology
Makes computer math faster and uses less memory.