Matrix Rosenthal and Concentration Inequalities for Markov Chains with Applications in Statistical Learning

Published: August 6, 2025 | arXiv ID: 2508.04327v1

By: Yang Peng, Yuchen Xin, Zhihua Zhang

Potential Business Impact:

Improves machine-learning guarantees for dependent data, such as observations generated by Markov chains.

In this paper, we study moment and concentration inequalities of the spectral norm for sums of dependent random matrices. We establish novel Rosenthal-Burkholder inequalities for matrix martingales, as well as matrix Rosenthal, Hoeffding, and Bernstein inequalities for ergodic Markov chains. Compared with previous work on matrix concentration inequalities for Markov chains, our results do not require the assumptions of a non-zero absolute spectral gap and bounded matrix functions. Furthermore, our results have leading terms that match the Markov chain central limit theorem, rather than relying on variance proxies. We also give dimension-free versions of the inequalities, which are independent of the ambient dimension $d$ and rely on the effective rank of some matrix instead. This enables the generalization of our results to linear operators in infinite-dimensional Hilbert spaces. Our results have extensive applications in statistics and machine learning; in particular, we obtain improved bounds in covariance estimation and principal component analysis on Markovian data.
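
For background only (standard definitions and the classical independent-case bound, not results taken from the paper): for a positive semidefinite matrix $\Sigma$, the effective rank is commonly defined as
$$ r(\Sigma) = \frac{\operatorname{tr}(\Sigma)}{\lVert \Sigma \rVert}, $$
which satisfies $1 \le r(\Sigma) \le \operatorname{rank}(\Sigma) \le d$ and can stay small even when the ambient dimension $d$ is large or infinite. For comparison, the classical matrix Bernstein inequality for independent, mean-zero, symmetric $d \times d$ random matrices $X_1, \dots, X_n$ with $\lVert X_i \rVert \le L$ almost surely reads
$$ \mathbb{P}\left( \Big\lVert \sum_{i=1}^{n} X_i \Big\rVert \ge t \right) \le 2d \, \exp\left( \frac{-t^2/2}{\sigma^2 + Lt/3} \right), \qquad \sigma^2 = \Big\lVert \sum_{i=1}^{n} \mathbb{E}[X_i^2] \Big\rVert. $$
The explicit factor $d$ and the boundedness assumption $\lVert X_i \rVert \le L$ are exactly the kinds of restrictions that, per the abstract, are relaxed here for Markovian sums.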

Country of Origin
🇨🇳 China

Page Count
40 pages

Category
Mathematics:
Probability