Borsuk-Ulam and Replicable Learning of Large-Margin Halfspaces
By: Ari Blondal, Hamed Hatami, Pooya Hatami, and more
Potential Business Impact:
Shows when computers can learn in a way that reliably gives the same answer each time.
Recent remarkable advances in learning theory have established that, for total concept classes, list replicability, global stability, differentially private (DP) learnability, and shared-randomness replicability all coincide with the finiteness of Littlestone dimension. Does this equivalence extend to partial concept classes?

We answer this question by proving that the list replicability number of $d$-dimensional $\gamma$-margin half-spaces satisfies \[ \frac{d}{2}+1 \le \mathrm{LR}(H^d_\gamma) \le d, \] which grows with dimension. Consequently, for partial classes, list replicability and global stability do not necessarily follow from bounded Littlestone dimension, pure DP-learnability, or shared-randomness replicability.

Applying our main theorem, we resolve several open problems:

$\bullet$ Every disambiguation of infinite-dimensional large-margin half-spaces to a total concept class has unbounded Littlestone dimension, answering an open question of Alon et al. (FOCS '21).

$\bullet$ The maximum list-replicability number of any finite set of points and homogeneous half-spaces in $d$-dimensional Euclidean space is $d$, resolving a problem of Chase et al. (FOCS '23).

$\bullet$ Every disambiguation of the Gap Hamming Distance problem in the large gap regime has unbounded public-coin randomized communication complexity. This answers an open question of Fang et al. (STOC '25).

$\bullet$ There exists a partial concept class with Littlestone dimension $1$ such that all its disambiguations have infinite Littlestone dimension. This answers a question of Cheung et al. (ICALP '23).

Our lower bound follows from a topological argument based on the local Borsuk-Ulam theorem of Chase, Chornomaz, Moran, and Yehudayoff (STOC '24). For the upper bound, we construct a list-replicable learning rule using the generalization properties of SVMs.
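The upper bound is obtained via a list-replicable learning rule built on SVM generalization. The Python sketch below is a hedged, self-contained illustration of the general idea only, not the paper's construction: it fits a linear SVM on $\gamma$-margin data and snaps the learned direction onto a coarse grid, so the rule can only ever output finitely many hypotheses. All names here (sample_margin_data, round_to_grid, eps, w_star) are illustrative assumptions, and the resulting list size depends on the grid resolution rather than matching the paper's bound of $d$.

```python
# Toy sketch (not the paper's rule): "round the SVM" gives a learning rule
# for gamma-margin halfspaces whose set of possible outputs is finite,
# illustrating the notion of list replicability.  The grid used here is
# exponential in d; the paper's construction achieves a list of size at most d.

import numpy as np
from sklearn.svm import LinearSVC

def sample_margin_data(w_star, n, gamma, rng):
    """Draw n unit-sphere points labeled by sign(<w_star, x>), keeping only
    points with margin at least gamma (the partial class is undefined
    inside the margin band)."""
    d = len(w_star)
    pts = []
    while len(pts) < n:
        x = rng.normal(size=d)
        x /= np.linalg.norm(x)
        if abs(w_star @ x) >= gamma:
            pts.append(x)
    X = np.array(pts)
    y = np.sign(X @ w_star)
    return X, y

def round_to_grid(w, eps):
    """Snap a unit vector to a finite grid of resolution eps, so the learner
    has only finitely many possible outputs (a 'list')."""
    w_rounded = np.round(w / eps) * eps
    norm = np.linalg.norm(w_rounded)
    return w_rounded / norm if norm > 0 else w_rounded

def list_replicable_svm(X, y, eps=0.25):
    """Fit a (nearly hard-margin) linear SVM and round its direction."""
    clf = LinearSVC(C=1e6, fit_intercept=False, max_iter=100_000).fit(X, y)
    w = clf.coef_.ravel()
    return round_to_grid(w / np.linalg.norm(w), eps)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, gamma = 5, 0.3
    w_star = np.ones(d) / np.sqrt(d)
    outputs = set()
    for _ in range(20):  # 20 independent samples from the same distribution
        X, y = sample_margin_data(w_star, n=200, gamma=gamma, rng=rng)
        outputs.add(tuple(list_replicable_svm(X, y)))
    print(f"distinct hypotheses over 20 runs: {len(outputs)}")
```

Because the SVM direction concentrates on margin distributions, repeated runs land on only a few grid points; the list size of this toy rule is governed by the grid resolution eps, whereas the paper's analysis pins the optimal list size between $d/2 + 1$ and $d$.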
Similar Papers
Approximate Replicability in Learning
Machine Learning (CS)
Makes computer learning work even with messy data.
Private Learning of Littlestone Classes, Revisited
Machine Learning (Stat)
Makes learning private and faster for computers.
Learning Intersections of Two Margin Halfspaces under Factorizable Distributions
Machine Learning (CS)
Teaches computers to learn patterns faster.