A Fully Polynomial-Time Algorithm for Robustly Learning Halfspaces over the Hypercube

Published: November 10, 2025 | arXiv ID: 2511.07244v1

By: Gautam Chandrasekaran, Adam R. Klivans, Konstantinos Stavropoulos, and others

Potential Business Impact:

Enables systems to learn accurate classifiers even when a fraction of the training data has been arbitrarily corrupted.

Business Areas:
A/B Testing, Data and Analytics

We give the first fully polynomial-time algorithm for learning halfspaces with respect to the uniform distribution on the hypercube in the presence of contamination, where an adversary may corrupt an $\eta$ fraction of examples and labels arbitrarily. We achieve an error guarantee of $\eta^{O(1)}+\epsilon$, where $\eta$ is the noise rate. Such a result was not known even in the agnostic setting, where only labels can be adversarially corrupted. All prior work over the last two decades has a superpolynomial dependence on $1/\epsilon$ or succeeds only with respect to continuous marginals (such as log-concave densities). Previous analyses rely heavily on various structural properties of continuous distributions, such as anti-concentration. Our approach avoids these requirements and makes use of a new algorithm for learning Generalized Linear Models (GLMs) with only a polylogarithmic dependence on the activation function's Lipschitz constant. More generally, our framework shows that supervised learning with respect to discrete distributions is not as difficult as previously thought.
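To make the setting concrete, the sketch below simulates the problem the abstract describes: labeled examples drawn uniformly from the hypercube $\{-1,1\}^n$, labeled by an unknown halfspace, with an $\eta$ fraction of labels corrupted. The learner shown is NOT the paper's algorithm; it is a classical Chow-parameter baseline (estimate $\mathbb{E}[y\,x_i]$ and use it as a weight vector), included only to illustrate the contamination model and the evaluation. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 15        # dimension of the hypercube {-1, 1}^n
m = 20_000    # number of training examples
eta = 0.10    # fraction of labels flipped by the "adversary" (random flips here)

# Unknown target halfspace: the majority function, sign(sum_i x_i).
w_true = np.ones(n)

def sample(num):
    """Draw num examples uniformly from {-1, 1}^n with clean halfspace labels."""
    X = rng.integers(0, 2, size=(num, n)) * 2 - 1
    y = np.sign(X @ w_true)  # n is odd, so the sum is never zero
    return X, y

# Training data with an eta fraction of labels corrupted.
X_train, y_train = sample(m)
flip = rng.random(m) < eta
y_train[flip] *= -1

# Chow-parameter baseline: estimate E[y * x_i] per coordinate and
# use the estimates directly as a weight vector for a halfspace.
w_hat = (X_train * y_train[:, None]).mean(axis=0)

# Evaluate on fresh, uncorrupted examples.
X_test, y_test = sample(10_000)
err = np.mean(np.sign(X_test @ w_hat) != y_test)
print(f"test error vs. clean labels: {err:.3f}")
```

Random label flips only shrink the Chow parameters by a factor of $(1-2\eta)$ without changing their direction, so this baseline recovers the majority halfspace; handling a fully adversarial corruption of both examples and labels, with error $\eta^{O(1)}+\epsilon$ in fully polynomial time, is exactly what the paper's algorithm achieves and this sketch does not.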

Country of Origin
🇺🇸 United States

Page Count
52 pages

Category
Computer Science:
Data Structures and Algorithms