Samplability makes learning easier
By: Guy Blanc, Caleb Koch, Jane Lange, and more
Potential Business Impact:
Lets computers learn from much less data when their examples come from realistic, efficiently generated sources.
The standard definition of PAC learning (Valiant 1984) requires learners to succeed under all distributions, even ones that are intractable to sample from. This stands in contrast to samplable PAC learning (Blum, Furst, Kearns, and Lipton 1993), where learners only have to succeed under samplable distributions. We study this distinction and show that samplable PAC learning substantially expands the power of efficient learners. We first construct a concept class that requires exponential sample complexity in standard PAC but is learnable with polynomial sample complexity in samplable PAC. We then lift this statistical separation to the computational setting and obtain a separation relative to a random oracle. Our proofs center on a new complexity primitive that we introduce and study: explicit evasive sets, for which membership is easy to decide but which are extremely hard to sample from. Our results extend to the online setting, where we similarly show how the landscape changes when the adversary is assumed to be efficient rather than computationally unbounded.
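To make the central primitive concrete, here is a rough LaTeX formalization. It is only a sketch based on the one-sentence description in the abstract; the exact quantifiers and parameters (polynomial decision time, negligible sampling probability) are illustrative assumptions, not definitions taken from the paper.

% Hedged sketch of the "explicit evasive set" primitive. The abstract
% says only that membership is easy to determine while sampling is hard;
% the precise bounds below are assumed for illustration.
\begin{definition}[Explicit evasive set, informal sketch]
A family $S = \{S_n\}_{n \in \mathbb{N}}$ with $S_n \subseteq \{0,1\}^n$ is
\emph{explicit} and \emph{evasive} if:
\begin{enumerate}
  \item (Explicitness) there is a deterministic $\mathrm{poly}(n)$-time
        algorithm that, given $x \in \{0,1\}^n$, decides whether $x \in S_n$;
  \item (Evasiveness) for every probabilistic $\mathrm{poly}(n)$-time
        sampler $A$, $\Pr[A(1^n) \in S_n] \le \mathrm{negl}(n)$.
\end{enumerate}
\end{definition}

Under this reading, a concept class whose hard instances are concentrated on such a set can defeat a worst-case PAC learner while every samplable distribution places negligible mass on those instances; this is one plausible way such a primitive could drive the separation described above.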
Similar Papers
Computational-Statistical Tradeoffs from NP-hardness
Computational Complexity
Makes computers learn harder things faster.
From learnable objects to learnable random objects
Logic in Computer Science
Teaches computers to learn from fewer examples.
Recursively Enumerably Representable Classes and Computable Versions of the Fundamental Theorem of Statistical Learning
Machine Learning (CS)
Teaches computers to learn from data better.