Marginal minimization and sup-norm expansions in perturbed optimization
By: Vladimir Spokoiny
Potential Business Impact:
Finds the best answer by ignoring unimportant details.
Let the objective function \( f \) depend on the target variable \( x \) along with a nuisance variable \( s \): \( f(v) = f(x,s) \). The goal is to identify the marginal solution \( x^{*} = \arg\min_{x} \min_{s} f(x,s) \). This paper discusses three related problems. The plug-in approach, widely used e.g. in inverse problems, suggests using a preliminary guess (pilot) \( \hat{s} \) and taking the solution of the partial optimization \( \hat{x} = \arg\min_{x} f(x,\hat{s}) \). The main question within this approach is the quality of the pilot required to ensure a prescribed accuracy of \( \hat{x} \). The popular \emph{alternating optimization} approach suggests the following procedure: given a starting guess \( x_{0} \), for \( t \geq 1 \), define \( s_{t} = \arg\min_{s} f(x_{t-1},s) \) and then \( x_{t} = \arg\min_{x} f(x,s_{t}) \). The main question here is the set of conditions ensuring convergence of \( x_{t} \) to \( x^{*} \). Finally, the paper discusses an interesting connection between marginal optimization and sup-norm estimation. The basic idea is to consider one component of the variable \( v \) as the target and the rest as nuisance. In all cases, we provide accurate closed-form results under realistic assumptions. The results are illustrated by a numerical example for the BTL model.
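As an illustration of the two schemes described in the abstract, here is a minimal sketch in Python for a toy coupled quadratic objective. The objective f, the pilot value s_pilot, and all names and parameters below are illustrative assumptions, not taken from the paper.

import numpy as np
from scipy.optimize import minimize

# Toy coupled quadratic objective f(x, s); purely illustrative.
def f(x, s):
    return (x - 1.0) ** 2 + (s - 2.0) ** 2 + 0.5 * x * s

# Plug-in approach: partial optimization over x with a pilot guess \hat{s}.
s_pilot = 1.5  # hypothetical pilot, standing in for a preliminary estimate
x_plugin = minimize(lambda x_: f(x_[0], s_pilot), np.array([0.0])).x[0]

# Alternating optimization:
#   s_t = argmin_s f(x_{t-1}, s),  then  x_t = argmin_x f(x, s_t).
def alternating_minimization(x0, n_iter=100, tol=1e-10):
    x = x0
    for _ in range(n_iter):
        s = minimize(lambda s_: f(x, s_[0]), np.array([0.0])).x[0]
        x_new = minimize(lambda x_: f(x_[0], s), np.array([x])).x[0]
        if abs(x_new - x) < tol:
            break
        x = x_new
    return x, s

x_hat, s_hat = alternating_minimization(x0=0.0)
print(x_plugin, x_hat, s_hat)

For this strongly convex toy example the alternating iterates contract toward the marginal minimizer \( x^{*} \); in general, such convergence requires conditions on the coupling between \( x \) and \( s \), which is exactly the question the paper addresses.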
Similar Papers
Semiparametric plug-in estimation, sup-norm risk bounds, marginal optimization, and inference in BTL model
Statistics Theory
Helps compare many things with less data.
New Results on a General Class of Minimum Norm Optimization Problems
Data Structures and Algorithms
Finds the best solutions for complex problems.
Minimax asymptotics
Statistics Theory
Helps find best guesses from many guesses.