From Graphical Lasso to Atomic Norms: High-Dimensional Pattern Recovery
By: Piotr Graczyk, Bartosz Kołodziejek, Hideto Nakashima, and more
Potential Business Impact:
Finds hidden patterns in complex data.
Estimating high-dimensional precision matrices is a fundamental problem in modern statistics, with the graphical lasso and its $\ell_1$-penalty being a standard approach for recovering sparsity patterns. However, many statistical models, e.g., colored graphical models, exhibit richer structures such as symmetry or equality constraints, which the $\ell_1$-norm cannot adequately capture. This paper addresses the gap by extending the high-dimensional analysis of pattern recovery to a general class of atomic norm penalties, particularly those whose unit balls are polytopes, where patterns correspond to the polytope's facial structure. We establish theoretical guarantees for recovering the true pattern induced by these general atomic norms in precision matrix estimation. Our framework builds upon and refines the primal-dual witness methodology of Ravikumar et al. (2011). Our analysis provides conditions on the deviation between sample and true covariance matrices for successful pattern recovery, given a novel, generalized irrepresentability condition applicable to any atomic norm. When specialized to the $\ell_1$-penalty, our results offer improved conditions -- including weaker deviation requirements and a less restrictive irrepresentability condition -- leading to tighter bounds and better asymptotic performance than prior work. The proposed general irrepresentability condition, based on a new thresholding concept, provides a unified perspective on model selection consistency. Numerical examples demonstrate the tightness of the derived theoretical bounds.
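For context, here is a minimal sketch of the $\ell_1$-penalized baseline the paper generalizes: graphical lasso estimation of a sparse precision matrix, with the recovered pattern read off from the nonzero entries of the estimate. The true precision matrix, sample size, regularization level `alpha`, and the thresholding of small entries are illustrative assumptions, not the paper's procedure or its atomic norm penalties.

```python
# Sketch: l1-penalized precision matrix estimation (graphical lasso) and
# pattern recovery. All model choices below are illustrative, not from the paper.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# True sparse precision matrix K (tridiagonal), so the true pattern is a chain graph.
p = 5
K_true = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
Sigma_true = np.linalg.inv(K_true)

# Draw n samples from N(0, Sigma_true) and fit the graphical lasso.
n = 2000
X = rng.multivariate_normal(np.zeros(p), Sigma_true, size=n)
model = GraphicalLasso(alpha=0.05).fit(X)

# Recovered pattern: support of the estimated precision matrix (small entries thresholded).
K_hat = model.precision_
pattern_hat = np.abs(K_hat) > 1e-6
pattern_true = np.abs(K_true) > 1e-6
print("estimated support:\n", pattern_hat.astype(int))
print("pattern recovered:", np.array_equal(pattern_hat, pattern_true))
```

A colored graphical model would additionally impose equality constraints among precision-matrix entries; capturing such patterns is what motivates the general atomic norm penalties analyzed in the paper.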
Similar Papers
Asymptotic Distribution of Low-Dimensional Patterns Induced by Non-Differentiable Regularizers under General Loss Functions
Statistics Theory
Finds hidden patterns in data, even complex ones.
Universal Architectures for the Learning of Polyhedral Norms and Convex Regularizers
Machine Learning (Stat)
Cleans up blurry pictures better than old ways.
A Graphical Global Optimization Framework for Parameter Estimation of Statistical Models with Nonconvex Regularization Functions
Optimization and Control
Solves hard math puzzles for computers faster.