Robust Least-Squares Optimization for Data-Driven Predictive Control: A Geometric Approach
By: Shreyas Bharadwaj, Bamdev Mishra, Cyrus Mostajeran, and others
Potential Business Impact:
Helps robots and controllers learn reliable behavior from less, noisier data.
We study a geometrically robust least-squares problem that extends classical and norm-based robust formulations. Rather than minimizing residual error for fixed or perturbed data, we interpret least-squares as enforcing approximate subspace inclusion between the measured and true data spaces. The uncertainty in this geometric relation is modeled as a metric ball on the Grassmannian manifold, leading to a min-max problem over Euclidean and manifold variables. The inner maximization admits a closed-form solution, enabling an efficient algorithm with a transparent geometric interpretation. Applied to robust finite-horizon linear-quadratic tracking in data-enabled predictive control, the method improves upon existing robust least-squares formulations, achieving stronger robustness and favorable scaling under small uncertainty.
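As background for the norm-based robust formulations the abstract says this work extends, a minimal sketch (not the paper's Grassmannian method) of the classical worst-case least-squares identity: for a spectral-norm-bounded perturbation of the data matrix, the inner maximization has the closed form max over ||dA|| <= rho of ||(A + dA)x - b|| = ||Ax - b|| + rho*||x||, attained by a rank-one perturbation aligned with the residual. All variable names and the test data below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 3))   # illustrative data matrix
b = rng.standard_normal(8)        # illustrative measurement vector
x = rng.standard_normal(3)        # an arbitrary candidate solution
rho = 0.1                         # radius of the spectral-norm uncertainty ball

# Closed-form worst case over ||dA||_2 <= rho:
#   max ||(A + dA)x - b|| = ||Ax - b|| + rho * ||x||
r = A @ x - b
worst = np.linalg.norm(r) + rho * np.linalg.norm(x)

# The bound is attained by a rank-one perturbation pushing the
# residual further along its own direction.
dA = rho * np.outer(r / np.linalg.norm(r), x / np.linalg.norm(x))
attained = np.linalg.norm((A + dA) @ x - b)

print(np.isclose(attained, worst))  # True: the closed form is tight
```

Because the inner maximum is available in closed form, the outer minimization over x reduces to a regularized least-squares problem; the paper's contribution is an analogous closed-form inner solution when the uncertainty set is a metric ball on the Grassmannian instead of a norm ball on the data matrix.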
Similar Papers
Geometrically robust least squares through manifold optimization
Optimization and Control
Fixes messy data for computers to use.
Data-driven learning of feedback maps for explicit robust predictive control: an approximation theoretic view
Optimization and Control
Teaches robots to make smart choices safely.
A Geometric Approach to Problems in Optimization and Data Science
Optimization and Control
Makes computers learn better from messy or tricky information.