Outlier-aware Tensor Robust Principal Component Analysis with Self-guided Data Augmentation
By: Yangyang Xu, Kexin Li, Li Yang, and more
Potential Business Impact:
Cleans messy data better for clearer pictures.
Tensor Robust Principal Component Analysis (TRPCA) is a fundamental technique for decomposing multi-dimensional data into a low-rank tensor and an outlier tensor, yet existing methods relying on sparse outlier assumptions often fail under structured corruptions. In this paper, we propose a self-guided data augmentation approach that employs adaptive weighting to suppress outlier influence, reformulating the original TRPCA problem into a standard Tensor Principal Component Analysis (TPCA) problem. The proposed model involves an optimization-driven weighting scheme that dynamically identifies and downweights outlier contributions during tensor augmentation. We develop an efficient proximal block coordinate descent algorithm with closed-form updates to solve the resulting optimization problem, ensuring computational efficiency. Theoretical convergence is guaranteed through a framework combining block coordinate descent with majorization-minimization principles. Numerical experiments on synthetic and real-world datasets, including face recovery, background subtraction, and hyperspectral denoising, demonstrate that our method effectively handles various corruption patterns. The results show improvements in both accuracy and computational efficiency compared to state-of-the-art methods.
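The core idea of the adaptive weighting scheme can be illustrated with a minimal sketch. The code below is NOT the paper's algorithm: it is a hedged toy example on a matrix (one tensor slice), where a low-rank fit and per-entry weights are updated alternately in a majorization-minimization style, so that entries with large residuals (likely outliers) are progressively downweighted. The function name `adaptive_weighted_lowrank`, the rank-truncated SVD update, and the weight rule `c / (c + |residual|)` are illustrative assumptions, not the paper's closed-form updates.

```python
import numpy as np

def adaptive_weighted_lowrank(X, rank=2, n_iter=50, c=0.5):
    """Toy adaptive-weighting robust low-rank fit (illustrative only).

    Alternates between:
      1. forming an augmented data matrix where downweighted (suspected
         outlier) entries are blended with the current low-rank estimate,
      2. a rank-truncated SVD fit to that augmented matrix,
      3. updating per-entry weights so large residuals get small weight.
    """
    W = np.ones_like(X)   # start trusting every entry equally
    L = X.copy()
    for _ in range(n_iter):
        # Self-guided augmentation: replace distrusted entries by the estimate.
        Y = W * X + (1.0 - W) * L
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # Smaller weight for entries that deviate strongly from the fit.
        W = c / (c + np.abs(X - L))
    return L, W

# Usage: recover a rank-2 matrix corrupted by a few large outliers.
rng = np.random.default_rng(0)
L_true = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
X = L_true.copy()
idx = rng.choice(X.size, size=15, replace=False)
X.flat[idx] += 10.0   # structured, large-magnitude corruptions

L_hat, W = adaptive_weighted_lowrank(X, rank=2)
err_weighted = np.linalg.norm(L_hat - L_true) / np.linalg.norm(L_true)

# Naive rank-2 SVD of the corrupted data, for comparison.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
L_naive = (U[:, :2] * s[:2]) @ Vt[:2]
err_naive = np.linalg.norm(L_naive - L_true) / np.linalg.norm(L_true)
```

The blend step `Y = W * X + (1 - W) * L` is what turns the robust problem into a standard (weighted) PCA subproblem at each iteration; the paper's actual method applies this idea at the tensor level with provable convergence, which this matrix sketch does not claim.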
Similar Papers
Robust Multilinear Principal Component Analysis
Methodology
Fixes messy data so computers can understand it.
Tensor robust principal component analysis via the tensor nuclear over Frobenius norm
Numerical Analysis
Cleans messy data by finding important patterns.
A Fast Iterative Robust Principal Component Analysis Method
Computational Engineering, Finance, and Science
Cleans messy data to find true patterns.