Regularized Parameter Estimation in Mixed Model Trace Regression
By: Ian Hultman, Sanvesh Srivastava
Potential Business Impact:
Finds patterns in complex data, like images.
We introduce mixed model trace regression (MMTR), an extension of mixed model linear regression to scalar responses and high-dimensional matrix-valued covariates. MMTR's fixed effects component is equivalent to trace regression, with an element-wise lasso penalty imposed on the regression coefficient matrix to facilitate the estimation of a sparse mean parameter. MMTR's key innovation lies in modeling the covariance structure of matrix-variate random effects as a Kronecker product of low-rank row and column covariance matrices, enabling sparse estimation of the covariance parameter through low-rank constraints. We establish identifiability conditions for the estimation of row and column covariance matrices and use them for rank selection by applying group lasso regularization on the columns of their respective Cholesky factors. We develop an Expectation-Maximization (EM) algorithm extension for numerically stable parameter estimation in high-dimensional applications. MMTR achieves estimation accuracy comparable to leading regularized quasi-likelihood competitors across diverse simulation studies and attains the lowest mean squared prediction error among its competitors on a publicly available image dataset.
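To make the model structure described in the abstract concrete, the following is a minimal simulation sketch of an MMTR-style data-generating process. The notation (B, L_r, L_c, Gamma_i) and all dimensions are illustrative assumptions for this sketch, not the authors' code or notation.

import numpy as np

# Illustrative simulation from a mixed model trace regression (MMTR)-style model.
# All names and dimensions below are assumptions made for this sketch.
rng = np.random.default_rng(0)

p, q = 8, 8            # dimensions of each matrix-valued covariate
r1, r2 = 2, 2          # assumed ranks of the row / column covariance factors
n_groups, n_per = 20, 15

B = np.zeros((p, q))   # sparse fixed-effect coefficient matrix (element-wise lasso target)
B[:2, :2] = 1.0

L_r = rng.normal(size=(p, r1))   # low-rank row covariance factor (group-lasso target)
L_c = rng.normal(size=(q, r2))   # low-rank column covariance factor (group-lasso target)
sigma = 0.5                      # residual standard deviation

y, X, group = [], [], []
for i in range(n_groups):
    # Matrix-variate random effect: Gamma_i = L_r U_i L_c^T implies
    # Cov(vec(Gamma_i)) = (L_c L_c^T) kron (L_r L_r^T), a Kronecker product
    # of low-rank column and row covariance matrices.
    U = rng.normal(size=(r1, r2))
    Gamma = L_r @ U @ L_c.T
    for _ in range(n_per):
        Xij = rng.normal(size=(p, q))
        # Scalar response: trace-regression mean plus matrix-variate random effect and noise.
        yij = np.trace(B.T @ Xij) + np.trace(Gamma.T @ Xij) + sigma * rng.normal()
        y.append(yij); X.append(Xij); group.append(i)

y = np.array(y)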
Similar Papers
Regularized Reduced Rank Regression for mixed predictor and response variables
Methodology
Finds important patterns in messy, big data.
Multivariate MM-estimators with Auxiliary Scale for Linear Models with Structured Covariance Matrices
Statistics Theory
Makes computer models ignore bad data points.
Generalized Tree-Informed Mixed Model Regression
Methodology
Helps predict country-level economic data.