Low-Rank Matrix Regression via Least-Angle Regression
By: Mingzhou Yin, Matthias A. Müller
Potential Business Impact:
Finds hidden patterns in data faster.
Low-rank matrix regression is a fundamental problem in data science with wide-ranging applications in systems and control. Nuclear norm regularization is widely used to solve this problem because of its convexity, but it suffers from high computational complexity and cannot directly specify the rank of the solution. This work introduces a novel framework for low-rank matrix regression that handles both unstructured and Hankel matrices. By decomposing the low-rank matrix into rank-1 bases, the problem is reformulated as an infinite-dimensional sparse learning problem, which is then solved efficiently with the least-angle regression (LAR) algorithm. For unstructured matrices, a closed-form LAR solution is derived and shown to be equivalent to a normalized nuclear norm regularization problem. For Hankel matrices, a reformulation in a real-valued polynomial basis enables an effective LAR implementation. Two numerical examples, in network modeling and system realization, demonstrate that the proposed approach significantly outperforms the nuclear norm method in both estimation accuracy and computational efficiency.
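To make the sparse-learning reformulation concrete, here is a minimal Python sketch of the core idea: a low-rank estimate is written as a sparse combination of rank-1 atoms u v^T, and LAR selects a few atoms. This is not the authors' algorithm, which works with the continuous (infinite-dimensional) atom set and a closed-form LAR step; the finite dictionary of randomly sampled atoms, the atom count, and the sparsity level below are all illustrative assumptions, and scikit-learn's `Lars` stands in for the paper's solver.

```python
# Sketch (assumed setup, not the paper's method): low-rank matrix denoising
# recast as sparse learning over a finite dictionary of random rank-1 atoms,
# solved with least-angle regression (LAR) from scikit-learn.
import numpy as np
from sklearn.linear_model import Lars

rng = np.random.default_rng(0)
m, n, true_rank = 20, 15, 2

# Ground-truth low-rank matrix observed with additive noise.
X_true = rng.standard_normal((m, true_rank)) @ rng.standard_normal((true_rank, n))
Y = X_true + 0.1 * rng.standard_normal((m, n))

# Finite dictionary of unit-norm rank-1 atoms u_k v_k^T (the paper's basis
# set is continuous; random sampling here is purely for illustration).
n_atoms = 2000
U = rng.standard_normal((m, n_atoms))
V = rng.standard_normal((n, n_atoms))
U /= np.linalg.norm(U, axis=0)
V /= np.linalg.norm(V, axis=0)
# Column k of D is vec(u_k v_k^T), matching row-major vectorization of Y.
D = np.einsum("ik,jk->ijk", U, V).reshape(m * n, n_atoms)

# LAR picks a sparse set of atoms; the rank of the estimate is at most
# the number of selected atoms.
lar = Lars(n_nonzero_coefs=10, fit_intercept=False).fit(D, Y.ravel())
X_hat = (D @ lar.coef_).reshape(m, n)

print("relative error:", np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true))
print("numerical rank:", np.linalg.matrix_rank(X_hat, tol=1e-6))
```

Because LAR adds one atom per step, the sparsity budget (here `n_nonzero_coefs=10`) directly caps the rank of the estimate, in contrast to nuclear norm regularization, where the rank can only be tuned indirectly through the regularization weight.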
Similar Papers
A Probabilistic Basis for Low-Rank Matrix Learning
Machine Learning (Stat)
Improves computer guessing for missing data.
Norm-Bounded Low-Rank Adaptation
Machine Learning (CS)
Makes AI smarter and learn new things better.
Noisy Low-Rank Matrix Completion via Transformed $L_1$ Regularization and its Theoretical Properties
Statistics Theory
Fixes broken data by guessing missing pieces.