Matrix Rosenthal and Concentration Inequalities for Markov Chains with Applications in Statistical Learning
By: Yang Peng, Yuchen Xin, Zhihua Zhang
Potential Business Impact:
Improves machine-learning guarantees for dependent (Markovian) data.
In this paper, we study moment and concentration inequalities of the spectral norm for sums of dependent random matrices. We establish novel Rosenthal-Burkholder inequalities for matrix martingales, as well as matrix Rosenthal, Hoeffding, and Bernstein inequalities for ergodic Markov chains. Compared with previous work on matrix concentration inequalities for Markov chains, our results require neither a non-zero absolute spectral gap nor bounded matrix functions. Furthermore, our results have leading terms that match the Markov chain central limit theorem, rather than relying on variance proxies. We also give dimension-free versions of the inequalities, which are independent of the ambient dimension $d$ and instead depend on the effective rank of a certain matrix. This enables the generalization of our results to linear operators in infinite-dimensional Hilbert spaces. Our results have extensive applications in statistics and machine learning; in particular, we obtain improved bounds for covariance estimation and principal component analysis on Markovian data.
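For orientation, it may help to recall the classical matrix Bernstein inequality for the independent case (background only; this is the standard statement, not the paper's result). For independent, mean-zero, self-adjoint $d \times d$ random matrices $X_1, \dots, X_n$ with $\|X_i\| \le L$ almost surely,

$$\mathbb{P}\Big(\Big\|\sum_{i=1}^n X_i\Big\| \ge t\Big) \le 2d \exp\!\Big(\frac{-t^2/2}{\sigma^2 + Lt/3}\Big), \qquad \sigma^2 = \Big\|\sum_{i=1}^n \mathbb{E}[X_i^2]\Big\|.$$

The paper's Markov-chain versions relax independence to ergodicity and replace the variance proxy $\sigma^2$ with a leading term matching the Markov chain central limit theorem; in the dimension-free versions, the dimension factor $d$ is replaced by an effective rank, presumably in the standard sense $r(\Sigma) = \operatorname{tr}(\Sigma)/\|\Sigma\|$ for an appropriate matrix $\Sigma$, which allows the ambient dimension to be infinite.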
Similar Papers
Matrix concentration inequalities for dependent binary random variables
Probability
Bounds how sums of random matrices behave when the underlying binary variables are dependent.
Vector-valued self-normalized concentration inequalities beyond sub-Gaussianity
Machine Learning (Stat)
Provides confidence bounds for learning that hold beyond light-tailed data.
A note on concentration inequalities for the overlapped batch mean variance estimators for Markov chains
Probability
Quantifies the accuracy of variance estimates for Markov chain simulations.