Residual subspace evolution strategies for nonlinear inverse problems
By: Francesco Alemanno
Potential Business Impact:
Solves hard math problems without needing exact derivatives.
Nonlinear inverse problems often feature noisy, non-differentiable, or expensive residual evaluations that make Jacobian-based solvers unreliable. Popular derivative-free optimizers such as natural evolution strategies (NES) or Powell's NEWUOA still assume smoothness or expend many evaluations to maintain stability. Ensemble Kalman inversion (EKI) relies on empirical covariances that require preconditioning and scale poorly with residual dimension. We introduce residual subspace evolution strategies (RSES), a derivative-free solver that samples Gaussian probes around the current iterate, builds a residual-only surrogate from their differences, and recombines the probes through a least-squares solve, yielding an optimal update without forming Jacobians or covariances. Each iteration costs $k+1$ residual evaluations, where $k \ll n$ for $n$-dimensional problems, with $O(k^3)$ linear algebra overhead. Benchmarks on calibration, regression, and deconvolution problems demonstrate consistent misfit reduction in both deterministic and stochastic settings. RSES matches or surpasses xNES and NEWUOA while staying competitive with EKI under matched evaluation budgets, particularly when smoothness or covariance assumptions fail.
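To make the recipe in the abstract concrete, the following is a minimal NumPy sketch of one such iteration, reconstructed only from the description above: sample $k$ Gaussian probes around the current iterate, difference their residuals against the baseline, and recombine the probes through a small least-squares solve. The function name `rses_step`, the probe scale `sigma`, the default values, and the absence of step-size control or regularization are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def rses_step(residual, x, sigma=0.05, k=4, rng=None):
    """One RSES-style iteration (sketch): probe, difference, recombine.

    residual : callable mapping an n-vector to an m-vector of residuals
    x        : current iterate (n-vector)
    sigma    : Gaussian probe scale (hypothetical default)
    k        : number of probes; the paper assumes k << n
    Costs k + 1 residual evaluations; the solve involves only k columns.
    """
    rng = np.random.default_rng() if rng is None else rng
    r0 = residual(x)                                # baseline residual (1 eval)

    D = sigma * rng.standard_normal((k, x.size))    # Gaussian probe directions
    R = np.stack([residual(x + d) for d in D])      # probed residuals (k evals)

    # Residual-only surrogate: differences of residuals along each probe.
    dR = (R - r0).T                                 # shape (m, k)

    # Least-squares recombination: coefficients c minimizing ||r0 + dR c||_2,
    # i.e. the probe combination that best cancels the current residual.
    c, *_ = np.linalg.lstsq(dR, -r0, rcond=None)

    return x + D.T @ c                              # update = recombined probes

# Toy usage (illustrative, not from the paper): calibrate a 3-parameter
# exponential model to noisy samples using k = 2 probes per iteration.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
y = 2.0 * np.exp(-3.0 * t) + 0.5 + 0.01 * rng.standard_normal(t.size)
res = lambda p: p[0] * np.exp(-p[1] * t) + p[2] - y

x = np.array([1.0, 1.0, 0.0])
for _ in range(100):
    x = rses_step(res, x, sigma=0.05, k=2, rng=rng)
print("estimate:", x, "misfit:", np.linalg.norm(res(x)))
```

Each call uses $k+1$ residual evaluations, matching the stated per-iteration cost; the actual RSES solver presumably adds safeguards such as probe-scale adaptation or damping that the abstract does not spell out.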
Similar Papers
A general framework for Krylov ODE residuals with applications to randomized Krylov methods
Numerical Analysis
Speeds up math for science and engineering.
A Superlinearly Convergent Evolution Strategy
Optimization and Control
Makes computer programs solve problems faster.
Accuracy Boost in Ensemble Kalman Inversion via Ensemble Control Strategies
Numerical Analysis
Improves computer guesses for hard problems.