Towards Sharp Minimax Risk Bounds for Operator Learning
By: Ben Adcock, Gregor Maier, Rahul Parhi
We develop a minimax theory for operator learning, where the goal is to estimate an unknown operator between separable Hilbert spaces from finitely many noisy input-output samples. For uniformly bounded Lipschitz operators, we prove information-theoretic lower bounds together with matching or near-matching upper bounds, covering both fixed and random designs under Hilbert-valued Gaussian noise and Gaussian white noise error models. The rates are governed by the spectrum of the covariance operator of the measure that defines the error metric. Our setup is general and, in particular, allows for measures with unbounded support. A key implication is a curse of sample complexity: the minimax risk for generic Lipschitz operators cannot decay at any algebraic rate in the sample size. We obtain essentially sharp characterizations when the covariance spectrum decays exponentially, and we provide general upper and lower bounds in slower-decay regimes.
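To fix ideas, here is a schematic LaTeX rendering of the quantity such a theory bounds. The notation (R_n, \mathcal{G}, \widehat{G}_n, \mu) is our own illustrative choice and may differ from the paper's:

% Minimax risk over a class \mathcal{G} of uniformly bounded Lipschitz
% operators, with error measured in L^2(\mu), where \mu is the measure
% defining the error metric; the infimum runs over all estimators
% \widehat{G}_n built from n noisy input-output samples.
\[
  R_n(\mathcal{G}) \;=\; \inf_{\widehat{G}_n}\, \sup_{G \in \mathcal{G}}\,
  \mathbb{E}\, \bigl\| \widehat{G}_n - G \bigr\|_{L^2(\mu)}^2 .
\]
% In this notation, the curse of sample complexity stated in the
% abstract reads: for every \alpha > 0,
\[
  \limsup_{n \to \infty}\, n^{\alpha}\, R_n(\mathcal{G}) \;=\; \infty ,
\]
% i.e., no algebraic decay rate in the sample size n is attainable
% uniformly over generic Lipschitz classes \mathcal{G}.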