Low-Rank Matrix Regression via Least-Angle Regression

Published: March 13, 2025 | arXiv ID: 2503.10569v2

By: Mingzhou Yin, Matthias A. Müller

Potential Business Impact:

Recovers hidden low-rank patterns in data faster and more accurately than standard methods.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Low-rank matrix regression is a fundamental problem in data science with various applications in systems and control. Nuclear norm regularization has been widely applied to solve this problem due to its convexity. However, it suffers from high computational complexity and the inability to directly specify the rank. This work introduces a novel framework for low-rank matrix regression that addresses both unstructured and Hankel matrices. By decomposing the low-rank matrix into rank-1 bases, the problem is reformulated as an infinite-dimensional sparse learning problem. The least-angle regression (LAR) algorithm is then employed to solve this problem efficiently. For unstructured matrices, a closed-form LAR solution is derived and shown to be equivalent to a normalized nuclear norm regularization problem. For Hankel matrices, a real-valued polynomial basis reformulation enables an effective LAR implementation. Two numerical examples in network modeling and system realization demonstrate that the proposed approach significantly outperforms the nuclear norm method in both estimation accuracy and computational efficiency.
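
To make the contrast concrete, below is a minimal sketch (not the paper's algorithm) of the two ideas the abstract compares, on the simplest measurement model where the low-rank matrix is observed directly under noise. The nuclear norm estimate is computed by singular value soft-thresholding, its proximal operator for this identity measurement case; the rank-1 basis viewpoint is illustrated by a greedy truncation to a user-specified rank, which only stands in for the paper's LAR selection over an infinite dictionary of rank-1 bases. The regularization weight lam, the target rank r, and the synthetic data are illustrative choices, not values from the paper.

```python
# Minimal sketch, assuming an identity measurement operator: Y = X_true + noise.
# (1) Nuclear norm regularization via singular value soft-thresholding.
# (2) Rank-1 basis decomposition with the rank specified directly (stand-in for LAR).
import numpy as np

rng = np.random.default_rng(0)
n, r = 30, 3
X_true = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank-3 ground truth
Y = X_true + 0.1 * rng.standard_normal((n, n))                      # noisy observation

# (1) Nuclear norm estimate: soft-threshold the singular values of Y.
lam = 1.0  # illustrative regularization weight
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
X_nuc = U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt

# (2) Rank-1 basis viewpoint: build the estimate from r rank-1 terms,
#     here simply the r leading singular triplets of Y.
X_r1 = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(r))

for name, X in [("nuclear norm", X_nuc), ("rank-1 bases", X_r1)]:
    err = np.linalg.norm(X - X_true) / np.linalg.norm(X_true)
    print(f"{name:13s} relative error: {err:.3f}")
```

The sketch highlights the practical difference the abstract points to: the nuclear norm route controls rank only indirectly through lam, whereas the rank-1 decomposition lets the rank be specified directly.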

Country of Origin
🇩🇪 Germany

Page Count
7 pages

Category
Electrical Engineering and Systems Science: Systems and Control