A Saddle Point Algorithm for Robust Data-Driven Factor Model Problems
By: Shabnam Khodakaramzadeh, Soroosh Shafiee, Gabriel de Albuquerque Gleizer, and more
Potential Business Impact:
Find hidden patterns in big data faster.
We study the factor model problem, which aims to uncover low-dimensional structures in high-dimensional datasets. Adopting a robust data-driven approach, we formulate the problem as a saddle-point optimization. Our primary contribution is a general first-order algorithm that solves this reformulation by leveraging a linear minimization oracle (LMO). We further develop semi-closed form solutions (up to a scalar) for three specific LMOs, corresponding to the Frobenius norm, Kullback-Leibler divergence, and Gelbrich (aka Wasserstein) distance. The analysis includes explicit quantification of these LMOs' regularity conditions, notably the Lipschitz constants of the dual function, which govern the algorithm's convergence performance. Numerical experiments confirm our method's effectiveness in high-dimensional settings, outperforming standard off-the-shelf optimization solvers.
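To make the LMO idea concrete, here is a minimal, hypothetical sketch (not the paper's algorithm or objective): a closed-form LMO over a Frobenius-norm ball, used inside a standard Frank-Wolfe loop on a simple stand-in quadratic objective. The objective, radius, and step-size rule are illustrative assumptions only.

```python
import numpy as np

def lmo_frobenius(G, radius=1.0):
    """Linear minimization oracle over a Frobenius-norm ball.

    Solves argmin_{||S||_F <= radius} <G, S>; the closed form is
    -radius * G / ||G||_F (the zero matrix when G is zero).
    """
    nrm = np.linalg.norm(G, "fro")
    if nrm == 0.0:
        return np.zeros_like(G)
    return -radius * G / nrm

# Illustrative Frank-Wolfe loop minimizing f(X) = 0.5 * ||X - A||_F^2
# over the unit Frobenius ball (a stand-in objective, not the paper's).
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
X = np.zeros_like(A)
for t in range(200):
    grad = X - A                          # gradient of the quadratic objective
    S = lmo_frobenius(grad, radius=1.0)   # cheap closed-form LMO call
    gamma = 2.0 / (t + 2)                 # standard Frank-Wolfe step size
    X = (1 - gamma) * X + gamma * S       # convex combination keeps X feasible
```

The appeal of LMO-based methods is that each iteration needs only this cheap linear subproblem instead of a full projection, which is what makes such first-order schemes scale to high dimensions.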
Similar Papers
Exactly or Approximately Wasserstein Distributionally Robust Estimation According to Wasserstein Radii Being Small or Large
Signal Processing
Makes computer guesses more accurate with noisy data.
A Saddle Point Remedy: Power of Variable Elimination in Non-convex Optimization
Machine Learning (CS)
Simplifies hard math problems for smarter computers.
Geometrically robust least squares through manifold optimization
Optimization and Control
Fixes messy data for computers to use.