High-dimensional low-rank matrix regression with unknown latent structures
By: Di Wang, Xiaoyu Zhang, Guodong Li, and more
Potential Business Impact:
Finds patterns in data from many people.
We study low-rank matrix regression in settings where matrix-valued predictors and scalar responses are observed across multiple individuals. Rather than assuming fully homogeneous coefficient matrices across individuals, we accommodate shared low-dimensional structure alongside individual-specific deviations. To this end, we introduce a tensor-structured homogeneity pursuit framework, wherein each coefficient matrix is represented as a product of shared low-rank subspaces and individualized low-rank loadings. We propose a scalable estimation procedure based on scaled gradient descent, and establish non-asymptotic bounds demonstrating that the proposed estimator attains improved convergence rates by leveraging shared information while preserving individual-specific signals. The framework is further extended to incorporate scaled hard thresholding for recovering sparse latent structures, with theoretical guarantees in both linear and generalized linear model settings. Our approach provides a principled middle ground between fully pooled and fully separate analyses, achieving strong theoretical performance, computational tractability, and interpretability in high-dimensional multi-individual matrix regression problems.
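To make the abstract's structure concrete, the sketch below simulates one plausible instance of the model and fits it with a scaled-gradient-descent-style update. It is an illustrative assumption, not the authors' implementation: the factorization B_i = U G_i V^T (shared subspaces U, V and individual-specific cores G_i), the preconditioner built from the complementary factors' Gram matrices, the random initialization, and the step size are all choices made here for demonstration and may differ from the paper's exact parameterization and procedure.

```python
# Minimal illustrative sketch (not the authors' code). Assumed model:
#   y_{i,j} = <B_i, X_{i,j}> + noise,   B_i = U @ G_i @ V.T,
# with shared U (p x r1), V (q x r2) and individual cores G_i (r1 x r2).
import numpy as np

rng = np.random.default_rng(0)
m, n_i, p, q, r1, r2 = 5, 300, 20, 15, 3, 3   # individuals, samples each, dims, ranks

# Synthetic ground truth and Gaussian design.
U_true = np.linalg.qr(rng.standard_normal((p, r1)))[0]
V_true = np.linalg.qr(rng.standard_normal((q, r2)))[0]
G_true = rng.standard_normal((m, r1, r2))
B_true = np.einsum('pa,iab,qb->ipq', U_true, G_true, V_true)
X = rng.standard_normal((m, n_i, p, q))
y = np.einsum('ijpq,ipq->ij', X, B_true) + 0.1 * rng.standard_normal((m, n_i))

# Simple random initialization (a spectral-type initialization may be preferable).
U = np.linalg.qr(rng.standard_normal((p, r1)))[0]
V = np.linalg.qr(rng.standard_normal((q, r2)))[0]
G = rng.standard_normal((m, r1, r2))

eta, ridge = 0.2, 1e-8   # step size and small ridge for numerical stability (assumptions)
for it in range(500):
    B = np.einsum('pa,iab,qb->ipq', U, G, V)          # current coefficient matrices
    resid = np.einsum('ijpq,ipq->ij', X, B) - y       # residuals, shape (m, n_i)
    D = np.einsum('ij,ijpq->ipq', resid, X) / n_i     # per-individual gradients w.r.t. B_i

    grad_U = np.einsum('ipq,qb,iab->pa', D, V, G)     # sum_i D_i V G_i^T
    grad_V = np.einsum('ipq,pa,iab->qb', D, U, G)     # sum_i D_i^T U G_i
    grad_G = np.einsum('pa,ipq,qb->iab', U, D, V)     # U^T D_i V

    # Scaled-gradient-style preconditioners from the complementary factors' Grams.
    UtU, VtV = U.T @ U, V.T @ V
    P_U = np.einsum('iab,bc,idc->ad', G, VtV, G) + ridge * np.eye(r1)
    P_V = np.einsum('iab,ac,icd->bd', G, UtU, G) + ridge * np.eye(r2)

    U -= eta * grad_U @ np.linalg.inv(P_U)
    V -= eta * grad_V @ np.linalg.inv(P_V)
    G -= eta * np.einsum('ab,ibc,cd->iad',
                         np.linalg.inv(UtU + ridge * np.eye(r1)),
                         grad_G,
                         np.linalg.inv(VtV + ridge * np.eye(r2)))

B_hat = np.einsum('pa,iab,qb->ipq', U, G, V)
print('relative error:', np.linalg.norm(B_hat - B_true) / np.linalg.norm(B_true))
```

The point of the sketch is the pooling effect described in the abstract: the shared factors U and V are updated using gradients aggregated over all individuals, while each small core G_i is fitted from that individual's data alone, so individual-specific deviations are retained without giving up the shared low-dimensional structure.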
Similar Papers
Simultaneous Heterogeneity and Reduced-rank Learning for Multivariate Response Regression
Methodology
Finds hidden groups in mixed data.
Compressed Bayesian Tensor Regression
Methodology
Makes complex data analysis faster and more accurate.
Identifiability and Estimation in High-Dimensional Nonparametric Latent Structure Models
Statistics Theory
Finds hidden patterns in complex data more effectively.