Enhancing Distributional Robustness in Principal Component Analysis by Wasserstein Distances

Published: March 4, 2025 | arXiv ID: 2503.02494v2

By: Lei Wang, Xin Liu, Xiaojun Chen

Potential Business Impact:

Finds reliable patterns in data even when the data's true distribution is uncertain or noisy.

Business Areas:
A/B Testing; Data and Analytics

We consider a distributionally robust optimization (DRO) model of principal component analysis (PCA) that accounts for uncertainty in the underlying probability distribution. The resulting formulation is a nonsmooth constrained min-max optimization problem in which the ambiguity set captures distributional uncertainty via the type-$2$ Wasserstein distance. We prove that the inner maximization problem admits a closed-form optimal value. This explicit characterization lets us equivalently reformulate the original DRO model as a minimization problem on the Stiefel manifold with intricate nonsmooth terms, a challenging formulation beyond the reach of existing algorithms. To address this, we devise an efficient smoothing manifold proximal gradient algorithm. Our analysis establishes Riemannian gradient consistency and global convergence of the algorithm to a stationary point of the nonsmooth minimization problem. We also establish an iteration complexity of $O(\epsilon^{-3})$ for reaching an $\epsilon$-approximate stationary point. Finally, numerical experiments validate the effectiveness and scalability of our algorithm and highlight the necessity and rationale of adopting the DRO model for PCA.
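The reformulated problem lives on the Stiefel manifold $\mathrm{St}(d,k)=\{X\in\mathbb{R}^{d\times k}: X^\top X = I_k\}$. As a minimal, self-contained sketch of that manifold setting, the toy code below runs Riemannian gradient ascent for *classical* PCA (maximizing $\mathrm{tr}(X^\top C X)$) with a QR retraction. This is not the authors' smoothing manifold proximal gradient algorithm; the function names, step size, and retraction choice are illustrative assumptions.

```python
import numpy as np

# Toy sketch of optimization on the Stiefel manifold
# St(d, k) = {X in R^{d x k} : X^T X = I_k}, the feasible set of the
# paper's reformulated DRO-PCA problem. Classical PCA is the objective
# here; this is NOT the paper's smoothing manifold proximal gradient
# method, only an illustration of the manifold machinery.

def qr_retraction(Y):
    # QR-based retraction: map a full-rank d x k matrix back onto the
    # Stiefel manifold. The objective below is invariant under right-
    # multiplication by orthogonal matrices, so the Q factor suffices.
    Q, _ = np.linalg.qr(Y)
    return Q

def pca_stiefel(C, k, step=0.05, iters=300, seed=0):
    # Maximize trace(X^T C X) over X in St(d, k) by Riemannian gradient
    # ascent; the maximizer spans the top-k eigenspace of C.
    d = C.shape[0]
    rng = np.random.default_rng(seed)
    X = qr_retraction(rng.standard_normal((d, k)))
    for _ in range(iters):
        G = 2.0 * C @ X                              # Euclidean gradient
        # Project G onto the tangent space at X (embedded metric):
        # P_X(G) = G - X * sym(X^T G).
        G_tan = G - X @ ((X.T @ G + G.T @ X) / 2.0)
        X = qr_retraction(X + step * G_tan)
    return X

# Usage: a covariance with a clear spectral gap below the top two
# eigenvalues, so the recovered subspace is well defined.
rng = np.random.default_rng(1)
Q0, _ = np.linalg.qr(rng.standard_normal((6, 6)))
C = Q0 @ np.diag([5.0, 4.0, 1.0, 0.5, 0.2, 0.1]) @ Q0.T
X = pca_stiefel(C, k=2)
obj = float(np.trace(X.T @ C @ X))
top2 = float(np.sort(np.linalg.eigvalsh(C))[-2:].sum())
print(obj, top2)
```

The tangent-space projection and QR retraction are standard ingredients of Riemannian first-order methods; the paper's contribution is handling the additional nonsmooth terms produced by the Wasserstein DRO reformulation, which this plain-gradient sketch does not attempt.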

Country of Origin
🇭🇰 Hong Kong

Page Count
24 pages

Category
Mathematics:
Optimization and Control