Structured covariance estimation via tensor-train decomposition
By: Artsiom Patarusau, Nikita Puchkin, Maxim Rakhuba, and more
Potential Business Impact:
Finds patterns in complex data faster.
We consider the problem of covariance estimation from a sample of i.i.d. high-dimensional random vectors. To avoid the curse of dimensionality, we impose an additional structural assumption on the covariance matrix $\Sigma$. More precisely, we study the case when $\Sigma$ can be approximated by a sum of double Kronecker products of smaller matrices in the tensor-train (TT) format. Our setup naturally extends the widely known Kronecker sum and CANDECOMP/PARAFAC models but admits richer interactions across modes. We propose an iterative polynomial-time algorithm based on TT-SVD and higher-order orthogonal iteration (HOOI), adapted to the hybrid Tucker-2 structure. We derive non-asymptotic dimension-free bounds on the accuracy of covariance estimation that take into account the hidden Kronecker product and tensor-train structures. The efficiency of our approach is illustrated with numerical experiments.
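To make the structural assumption concrete, the sketch below illustrates the classical building block of such models: approximating a covariance matrix by a sum of Kronecker products, $\Sigma \approx \sum_{k=1}^{K} A_k \otimes B_k$, via the well-known Van Loan–Pitsianis rearrangement trick (SVD of a reshuffled matrix). This is a hedged, minimal NumPy illustration of the Kronecker-sum ingredient only, not the paper's full TT/HOOI algorithm; the function names `rearrange` and `kron_sum_approx` are our own.

```python
import numpy as np

def rearrange(S, p, q):
    # Van Loan–Pitsianis rearrangement: for S = sum_k A_k kron B_k
    # (A_k is p x p, B_k is q x q), the rearranged matrix R equals
    # sum_k vec(A_k) vec(B_k)^T, so its rank is the number of terms K.
    R = np.empty((p * p, q * q))
    for i in range(p):
        for j in range(p):
            block = S[i * q:(i + 1) * q, j * q:(j + 1) * q]
            R[i * p + j] = block.reshape(-1)
    return R

def kron_sum_approx(S, p, q, K):
    # Best rank-K sum-of-Kronecker-products approximation of S
    # (in Frobenius norm), obtained from the truncated SVD of R.
    R = rearrange(S, p, q)
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    approx = np.zeros_like(S)
    for k in range(K):
        A = (np.sqrt(s[k]) * U[:, k]).reshape(p, p)
        B = (np.sqrt(s[k]) * Vt[k]).reshape(q, q)
        approx += np.kron(A, B)
    return approx
```

In practice one would apply such a truncation to the sample covariance of the data; the paper's contribution is to replace the plain sum above with a richer TT-structured interaction across modes and to analyze the resulting estimator non-asymptotically.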
Statistics Theory
Similar Papers
Tensor Stochastic Regression for High-dimensional Time Series via CP Decomposition
Methodology
Finds patterns in complex data over time.
An Efficient and Interpretable Autoregressive Model for High-Dimensional Tensor-Valued Time Series
Methodology
Finds patterns in weather to predict future changes.