Minimax learning rates for estimating binary classifiers under margin conditions
By: Jonathan García, Philipp Petersen
Potential Business Impact:
Helps computers learn faster from data.
We study classification problems with binary estimators in which the decision boundary is described by horizon functions and the data distribution satisfies a geometric margin condition. We establish upper and lower bounds on the minimax learning rate over broad function classes with bounded Kolmogorov entropy in Lebesgue norms. A key novelty of our work is the derivation of lower bounds on the worst-case learning rates under a geometric margin condition -- a setting that is almost universally satisfied in practice but remains theoretically challenging. Moreover, our results cover the noiseless setting, where lower bounds are particularly hard to establish. We apply our general results to classification problems whose decision boundaries belong to several function classes: for Barron-regular functions, and for Hölder-continuous functions under strong margin conditions, we identify optimal rates close to the fast learning rate of $\mathcal{O}(n^{-1})$ for $n \in \mathbb{N}$ samples. For merely convex decision boundaries, optimal rates near $\mathcal{O}(n^{-1/2})$ can still be achieved under a strong margin condition.
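As a rough illustration of the setting (the symbols $\alpha$, $\gamma$, $c$ and the exact form of the margin condition below are illustrative assumptions, not the paper's definitions), a minimax learning rate and a geometric margin condition are typically formalized along these lines:
% Sketch only: hypothetical notation, not taken from the paper.
% Minimax excess risk of classifiers fitted from n i.i.d. samples:
\[
  \inf_{\hat f_n}\; \sup_{P \in \mathcal{P}}\;
  \mathbb{E}_{(X_i,Y_i)_{i=1}^n \sim P^n}
  \Big[ R_P(\hat f_n) - \inf_{f} R_P(f) \Big] \;\asymp\; n^{-\alpha},
  \qquad R_P(f) := P\big(f(X) \neq Y\big).
\]
% A geometric margin condition with exponent \gamma bounds the mass that the
% marginal P_X places near the Bayes decision boundary \partial\{f^* = 1\}:
\[
  P_X\Big( \big\{ x : \operatorname{dist}\big(x,\, \partial\{f^* = 1\}\big) \le t \big\} \Big)
  \;\le\; c\, t^{\gamma}, \qquad t > 0.
\]
Under such a condition, larger margin exponents $\gamma$ correspond to less data near the boundary and hence to faster attainable rates, which is the regime where the paper's rates close to $\mathcal{O}(n^{-1})$ arise.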
Similar Papers
Adversarial learning for nonparametric regression: Minimax rate and adaptive estimation
Machine Learning (Stat)
Protects computers from tricky, fake data.
Minimax asymptotics
Statistics Theory
Helps find best guesses from many guesses.
Super-fast rates of convergence for Neural Networks Classifiers under the Hard Margin Condition
Machine Learning (CS)
Makes smart computers learn better from less data.