Bayesian Markov-Switching Partial Reduced-Rank Regression
By: Maria F. Pintado, Matteo Iacopini, Luca Rossini, and more
Reduced-Rank (RR) regression is a powerful dimensionality reduction technique, but it overlooks any possible group configuration among the responses by assuming a low-rank structure on the entire coefficient matrix. Moreover, the temporal change of the relations between predictors and responses in time series induces a possibly time-varying grouping structure in the responses. To address these limitations, a Bayesian Markov-switching partial RR (MS-PRR) model is proposed, where the response vector is partitioned into two groups to reflect different degrees of complexity in the relationship. A \textit{simple} group assumes a low-rank linear regression, while a \textit{complex} group exploits nonparametric regression via a Gaussian Process. Unlike traditional approaches, group assignments and rank are treated as unknown parameters to be estimated. Temporal persistence in the regression function is then accounted for by a Markov-switching process that drives the changes in the grouping structure and model parameters over time. Full Bayesian inference is performed via a partially collapsed Gibbs sampler, which allows uncertainty quantification without the need for trans-dimensional moves. Applications to two real-world macroeconomic and commodity datasets demonstrate evidence of time-varying grouping and different degrees of complexity both across states and within each state.
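To fix ideas, the classical reduced-rank regression that the MS-PRR model generalizes can be sketched in a few lines. This is a minimal illustration of the baseline technique only (not the authors' Bayesian Markov-switching sampler), using simulated data and the standard estimator that projects the OLS fit onto its leading singular directions; all variable names and dimensions below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a multivariate regression Y = X B + E with a low-rank B
# (dimensions chosen arbitrarily for illustration).
n, p, q, r = 200, 6, 5, 2          # samples, predictors, responses, true rank
B_true = rng.normal(size=(p, r)) @ rng.normal(size=(r, q))  # rank-r matrix
X = rng.normal(size=(n, p))
Y = X @ B_true + 0.1 * rng.normal(size=(n, q))

# Classical reduced-rank estimator: take the OLS solution and project
# the fitted values onto their r leading right singular directions.
B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
U, s, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
P = Vt[:r].T @ Vt[:r]              # rank-r projector in response space
B_rr = B_ols @ P                   # reduced-rank coefficient estimate

print(np.linalg.matrix_rank(B_rr))
```

The key limitation noted in the abstract is visible here: the rank constraint is imposed on the entire coefficient matrix at once, with no mechanism for different response groups (or different time regimes) to carry different complexities.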
Similar Papers
Simultaneous Heterogeneity and Reduced-rank Learning for Multivariate Response Regression
Methodology
Finds hidden groups in mixed data.
Regularized Reduced Rank Regression for mixed predictor and response variables
Methodology
Finds important patterns in messy, big data.
Higher Order Reduced Rank Regression
Machine Learning (Stat)
Finds hidden patterns in complex data.