Recursively Enumerably Representable Classes and Computable Versions of the Fundamental Theorem of Statistical Learning
By: David Kattermann, Lothar Sebastian Krapp
Potential Business Impact:
Shows when machine learning algorithms can actually be carried out by computers.
We study computable probably approximately correct (CPAC) learning, where learners are required to be computable functions. It had previously been observed that the Fundamental Theorem of Statistical Learning, which characterizes PAC learnability by finiteness of the Vapnik-Chervonenkis (VC-)dimension, no longer holds in this framework. Recent works recovered analogs of the Fundamental Theorem in the computable setting, for instance by introducing an effective VC-dimension. Guided by this, we investigate the connection between CPAC learning and recursively enumerably representable (RER) classes, whose members can be algorithmically listed. Our results show that the effective VC-dimension can take arbitrary values above the traditional one, even for RER classes, which creates a whole family of (non-)examples for various notions of CPAC learning. Yet the two dimensions coincide for classes satisfying sufficiently strong notions of CPAC learning. We then observe that CPAC learnability can also be characterized via containment of RER classes that realize the same samples. Furthermore, we show that CPAC learnable classes satisfying a unique identification property are necessarily RER. Finally, we establish that agnostic learnability can be guaranteed for RER classes by considering the relaxed notion of nonuniform CPAC learning.
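To make the central notion concrete: the VC-dimension of a hypothesis class is the size of the largest set of points on which the class realizes every possible binary labeling ("shatters" the set). The sketch below, a brute-force illustration over a small finite domain (not from the paper, which concerns infinite classes and computability of the learners), checks shattering directly. The class of threshold classifiers used as the example is a standard textbook case with VC-dimension 1.

```python
from itertools import combinations

def shatters(hypotheses, points):
    # A class shatters a point set if its restrictions to the points
    # realize all 2^|points| possible binary labelings.
    labelings = {tuple(h(x) for x in points) for h in hypotheses}
    return len(labelings) == 2 ** len(points)

def vc_dimension(hypotheses, domain):
    # Largest k such that some k-subset of the domain is shattered
    # (brute force; only feasible for small finite domains).
    dim = 0
    for k in range(1, len(domain) + 1):
        if any(shatters(hypotheses, subset)
               for subset in combinations(domain, k)):
            dim = k
        else:
            break
    return dim

# Example: threshold classifiers h_t(x) = 1 iff x >= t.
# They shatter any single point but no pair (the labeling
# "left point 1, right point 0" is never realized), so VC-dim = 1.
domain = [0, 1, 2, 3]
thresholds = [lambda x, t=t: int(x >= t) for t in range(5)]
print(vc_dimension(thresholds, domain))  # -> 1
```

Note that such an enumeration of the class by an algorithm is exactly what the RER condition in the abstract asks for; the paper's point is that in the computable setting, finiteness of this dimension alone no longer suffices for learnability.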
Similar Papers
Samplability makes learning easier
Computational Complexity
Makes computers learn more with less data.
From learnable objects to learnable random objects
Logic in Computer Science
Teaches computers to learn from fewer examples.
Computational-Statistical Tradeoffs from NP-hardness
Computational Complexity
Shows why some learning problems stay hard for computers even with enough data.