Compressed Bayesian Tensor Regression
By: Roberto Casarin, Radu Craiu, Qing Wang
Potential Business Impact:
Makes complex data analysis faster and more accurate.
To address the common problem of high dimensionality in tensor regression, we introduce a generalized tensor random projection method that embeds high-dimensional tensor-valued covariates into low-dimensional subspaces with minimal loss of information about the responses. The method is flexible, admitting tensor-wise, mode-wise, and combined random projections as special cases. A Bayesian inference framework is provided, featuring a hierarchical prior distribution and a low-rank representation of the parameter tensor. Strong theoretical support is given for the concentration properties of the random projection and the posterior consistency of the Bayesian inference. An efficient Gibbs sampler is developed to perform inference on the compressed data. To mitigate the sensitivity introduced by random projections, Bayesian model averaging is employed, with normalising constants estimated by reverse logistic regression. An extensive simulation study examines the effects of the tuning parameters. The simulations indicate, and a real-data application confirms, that compressed Bayesian tensor regression can achieve better out-of-sample prediction while significantly reducing computational cost compared to standard Bayesian tensor regression.
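The mode-wise special case of the projection can be illustrated with a short sketch. This is not the authors' implementation; the function name, the Gaussian choice of projection matrices, and the 1/sqrt(k) scaling are illustrative assumptions, shown only to convey how a high-dimensional tensor covariate is compressed mode by mode.

```python
import numpy as np

def mode_wise_projection(X, target_dims, rng):
    """Illustrative mode-wise random projection: map a tensor of shape
    (d1, ..., dK) to shape (k1, ..., kK) by contracting each mode with an
    independent Gaussian matrix (an assumption; other sketches also work)."""
    Y = X
    for mode, (d, k) in enumerate(zip(X.shape, target_dims)):
        # Random projection matrix for this mode; the 1/sqrt(k) scaling keeps
        # squared norms approximately unchanged in expectation.
        P = rng.standard_normal((k, d)) / np.sqrt(k)
        # Contract axis 1 of P (length d) with axis `mode` of Y, then move the
        # new length-k axis back into position `mode`.
        Y = np.moveaxis(np.tensordot(P, Y, axes=(1, mode)), 0, mode)
    return Y

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 30, 40))     # high-dimensional tensor covariate
Z = mode_wise_projection(X, (4, 5, 6), rng)
print(Z.shape)                            # (4, 5, 6)
```

A tensor-wise projection would instead vectorise `X` and apply a single random matrix, while the combined variant mixes the two; the regression is then fit on the compressed `Z`, and averaging over independent draws of the projection matrices tempers the sensitivity to any single draw.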
Similar Papers
Variational Bayesian Logistic Tensor Regression with Application to Image Recognition
Methodology
Helps computers recognize pictures with less data.
High-dimensional low-rank matrix regression with unknown latent structures
Methodology
Finds patterns in data from many people.
Scalable Variable Selection and Model Averaging for Latent Regression Models Using Approximate Variational Bayes
Methodology
Finds best patterns in complex data faster.