Higher Order Reduced Rank Regression
By: Leia Greenberg, Haim Avron
Potential Business Impact:
Finds hidden patterns in complex data.
Reduced Rank Regression (RRR) is a widely used method for multi-response regression. However, RRR assumes a linear relationship between features and responses. While linear models are useful and often provide a good approximation, many real-world problems involve more complex relationships that cannot be adequately captured by simple linear interactions. One way to model such relationships is via multilinear transformations. This paper introduces Higher Order Reduced Rank Regression (HORRR), an extension of RRR that leverages multilinear transformations and is thus capable of capturing nonlinear interactions in multi-response regression. HORRR employs tensor representations for the coefficients and a Tucker decomposition to impose multilinear rank constraints as regularization, akin to the rank constraints in RRR. Encoding these constraints as a manifold allows us to use Riemannian optimization to solve HORRR problems. We theoretically and empirically analyze the use of Riemannian optimization for solving HORRR problems.
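For context, the classical RRR baseline that HORRR extends has a well-known closed-form solution: fit ordinary least squares, then project the fitted responses onto their top-r right singular vectors. The sketch below illustrates that baseline only (not the paper's HORRR algorithm); the dimensions and noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q, r = 200, 10, 8, 2  # samples, features, responses, target rank

# Synthetic data with a true rank-r coefficient matrix
B_true = rng.standard_normal((p, r)) @ rng.standard_normal((r, q))
X = rng.standard_normal((n, p))
Y = X @ B_true + 0.1 * rng.standard_normal((n, q))

# Classical RRR closed form: OLS fit, then restrict to the span of the
# top-r right singular vectors of the fitted values X @ B_ols
B_ols = np.linalg.pinv(X) @ Y
_, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
V_r = Vt[:r].T                 # (q, r) leading right singular vectors
B_rrr = B_ols @ V_r @ V_r.T    # rank <= r by construction

print(np.linalg.matrix_rank(B_rrr))
```

HORRR replaces the coefficient matrix with a coefficient tensor and the rank constraint with a Tucker multilinear-rank constraint, which no longer admits this closed form and motivates the paper's Riemannian optimization approach.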
Similar Papers
Simultaneous Heterogeneity and Reduced-rank Learning for Multivariate Response Regression
Methodology
Finds hidden groups in mixed data.
Regularized Reduced Rank Regression for mixed predictor and response variables
Methodology
Finds important patterns in messy, big data.
Low-Rank Matrix Regression via Least-Angle Regression
Systems and Control
Finds hidden patterns in data faster.