Highly robust factored principal component analysis for matrix-valued outlier accommodation and explainable detection via matrix minimum covariance determinant

Published: September 30, 2025 | arXiv ID: 2509.25957v1

By: Wenhui Wu, Changchun Shang, Jianhua Zhao, and more

Potential Business Impact:

Detects anomalous samples in image-like (matrix-valued) data.

Business Areas:
Big Data; Data and Analytics

Principal component analysis (PCA) is a classical and widely used method for dimensionality reduction, with applications in data compression, computer vision, pattern recognition, and signal processing. However, PCA is designed for vector-valued data and encounters two major challenges when applied to matrix-valued data with heavy-tailed distributions or outliers: (1) vectorization disrupts the intrinsic matrix structure, leading to information loss and the curse of dimensionality, and (2) PCA is highly sensitive to outliers. Factored PCA (FPCA) addresses the first issue through probabilistic modeling, using a matrix normal distribution that explicitly represents row and column covariances via a separable covariance structure, thereby preserving the two-way dependency and matrix form of the data. Building on FPCA, we propose highly robust FPCA (HRFPCA), a robust extension that replaces maximum likelihood estimators with the matrix minimum covariance determinant (MMCD) estimators. This modification enables HRFPCA to retain FPCA's ability to model matrix-valued data while achieving a breakdown point close to 50%, substantially improving resistance to outliers. Furthermore, HRFPCA produces the score-orthogonal distance analysis (SODA) plot, which effectively visualizes and classifies matrix-valued outliers. Extensive simulations and real-data analyses demonstrate that HRFPCA consistently outperforms competing methods in robustness and outlier detection, underscoring its effectiveness and broad applicability.
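To make the diagnostic idea behind a score-orthogonal distance plot concrete, here is a minimal sketch, assuming a classical (non-robust) PCA on vectorized data rather than the paper's MMCD-based HRFPCA: each observation gets a score distance (Mahalanobis distance inside the retained PC subspace) and an orthogonal distance (residual norm off that subspace), and points extreme in either coordinate are flagged. The data, seed, and threshold choices below are illustrative, not from the paper.

```python
import numpy as np

# Illustrative sketch (not the paper's MMCD/HRFPCA algorithm): classical PCA
# outlier diagnostics via score distance (SD) and orthogonal distance (OD),
# the two axes of a score-orthogonal distance plot. Synthetic data, our seed.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))   # 200 samples, 10 features (inliers)
X[:5] += 8.0                     # plant 5 gross outliers in the first rows

mu = X.mean(axis=0)
Xc = X - mu
_, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 3                                   # number of retained components
scores = Xc @ Vt[:k].T                  # projections onto the PC subspace
eigvals = (s[:k] ** 2) / (len(X) - 1)   # sample variances along the PCs

# Score distance: Mahalanobis distance within the PC subspace.
sd = np.sqrt(((scores ** 2) / eigvals).sum(axis=1))
# Orthogonal distance: norm of the residual off the PC subspace.
od = np.linalg.norm(Xc - scores @ Vt[:k], axis=1)

# Flag the samples with the largest score distances; the planted
# outliers stand far out along the shift direction.
flagged = np.argsort(sd)[-5:]
```

Note that because this sketch uses non-robust estimates, the planted outliers themselves pull the mean and the leading principal component; the point of MMCD-style estimators in the paper is precisely to compute the center, covariances, and hence these distances from an outlier-free core of the data.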

Page Count
18 pages

Category
Statistics: Methodology