Scalable Fitting Methods for Multivariate Gaussian Additive Models with Covariate-dependent Covariance Matrices
By: Vincenzo Gioia, Matteo Fasiolo, Ruggero Bellio, and others
Potential Business Impact:
Speeds up fitting of statistical models in which both the mean and the covariance of a multivariate response change with covariates.
We propose efficient computational methods for fitting multivariate Gaussian additive models, in which both the mean vector and the covariance matrix are allowed to vary with covariates, in an empirical Bayes framework. To guarantee the positive-definiteness of the covariance matrix, we model the elements of an unconstrained parametrisation matrix, focussing particularly on the modified Cholesky decomposition and the matrix logarithm. A key computational challenge arises from the fact that, for the model class considered here, the number of parameters increases quadratically with the dimension of the response vector. Hence, we discuss how to achieve fast computation and a low memory footprint in moderately high dimensions by exploiting parsimonious model structures and sparse derivative systems, and by employing block-oriented computational methods. Methods for building and fitting multivariate Gaussian additive models are provided by the SCM R package, available at https://github.com/VinGioia90/SCM, while the code for reproducing the results in this paper is available at https://github.com/VinGioia90/SACM.
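The core idea of the unconstrained parametrisation can be illustrated with a minimal sketch. The snippet below (in Python rather than R, and not the SCM package's API) shows one common convention for the modified Cholesky decomposition, Sigma = L D L^T with L unit lower triangular and D diagonal with log-parametrised entries: any real-valued parameter vector then maps to a valid positive-definite covariance matrix. The function name and argument layout are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def cov_from_mcd(theta_logd, theta_l, d):
    """Build a d x d covariance matrix from an unconstrained
    modified-Cholesky parametrisation Sigma = L @ D @ L.T.

    theta_logd : d unconstrained reals, log of the diagonal of D
    theta_l    : d*(d-1)/2 unconstrained reals, strict lower
                 triangle of the unit lower-triangular factor L

    Because exp(.) > 0, D is positive, so Sigma is positive
    definite for ANY real-valued inputs -- no constraint needed.
    """
    L = np.eye(d)
    L[np.tril_indices(d, k=-1)] = theta_l   # fill strict lower triangle
    D = np.diag(np.exp(theta_logd))          # positive diagonal
    return L @ D @ L.T

# Arbitrary unconstrained parameters still give a valid covariance:
Sigma = cov_from_mcd(np.array([0.1, -0.2, 0.3]),
                     np.array([0.5, -0.4, 0.2]), d=3)
assert np.all(np.linalg.eigvalsh(Sigma) > 0)  # positive definite
```

In a covariate-dependent model, each entry of `theta_logd` and `theta_l` would itself be an additive function of covariates; the matrix-logarithm parametrisation plays the same role, mapping an unconstrained symmetric matrix to a positive-definite one via the matrix exponential.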
Similar Papers
Scalable Computations for Generalized Mixed Effects Models with Crossed Random Effects Using Krylov Subspace Methods
Methodology
Makes complex math problems solve much faster.
Robust Variable Selection in High-dimensional Nonparametric Additive Model
Methodology
Finds important patterns even with messy data.
Gaussian Mixture Model with unknown diagonal covariances via continuous sparse regularization
Statistics Theory
Finds hidden groups in data, even with messy details.