Robust and Well-conditioned Sparse Estimation for High-dimensional Covariance Matrices
By: Shaoxin Wang, Ziyun Ma
Estimating covariance matrices with high-dimensional complex data presents significant challenges, particularly concerning positive definiteness, sparsity, and numerical stability. Existing robust sparse estimators often fail to guarantee positive definiteness in finite samples, while subsequent positive-definite correction can degrade sparsity and lacks explicit control over the condition number. To address these limitations, we propose a novel robust and well-conditioned sparse covariance matrix estimator. Our key innovation is the direct incorporation of a condition number constraint within a robust adaptive thresholding framework. This constraint simultaneously ensures positive definiteness, enforces a controllable level of numerical stability, and preserves the desired sparse structure without resorting to post-hoc modifications that compromise sparsity. We formulate the estimation as a convex optimization problem and develop an efficient alternating direction algorithm with guaranteed convergence. Theoretically, we establish that the proposed estimator achieves the minimax optimal convergence rate under the Frobenius norm. Comprehensive simulations and real-data applications demonstrate that our method consistently produces positive definite, well-conditioned, and sparse estimates, and achieves comparable or superior numerical stability to eigenvalue-bound methods while requiring fewer tuning parameters.
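To make the abstract's critique concrete, here is a minimal sketch (not the paper's algorithm, and with hypothetical parameter names) of the common two-step approach it contrasts with: soft-threshold the sample covariance for sparsity, then clip eigenvalues post hoc to enforce positive definiteness and a condition number bound. The eigenvalue-clipping step generally destroys the exact zeros produced by thresholding, which is precisely the drawback that motivates embedding the condition number constraint directly in the optimization.

```python
import numpy as np

def threshold_then_clip(S, tau, kappa_max):
    """Illustrative two-step estimator (threshold, then eigenvalue clip).

    S         : symmetric sample covariance matrix
    tau       : soft-thresholding level for off-diagonal entries
    kappa_max : target upper bound on the condition number
    """
    # Step 1: soft-threshold off-diagonal entries to induce sparsity.
    T = np.sign(S) * np.maximum(np.abs(S) - tau, 0.0)
    np.fill_diagonal(T, np.diag(S))  # keep diagonal (variances) untouched

    # Step 2: post-hoc correction. Raise small/negative eigenvalues to a
    # floor mu chosen so that lambda_max / mu <= kappa_max, which forces
    # positive definiteness and bounds the condition number -- but the
    # reconstructed matrix is generally dense again.
    vals, vecs = np.linalg.eigh(T)
    mu = max(vals.max() / kappa_max, 1e-8)
    vals = np.clip(vals, mu, None)
    return vecs @ np.diag(vals) @ vecs.T
```

Running this on a thresholded matrix with negative eigenvalues illustrates the trade-off: the output is positive definite with condition number at most `kappa_max`, yet entries that were exactly zero after thresholding typically become nonzero after the spectral correction.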