From Entropy to Epiplexity: Rethinking Information for Computationally Bounded Intelligence
By: Marc Finzi, Shikai Qiu, Yiding Jiang, and more
Potential Business Impact:
Quantifies how much learnable structure a dataset contains, guiding how to select, generate, or transform data so learning systems extract more value from it.
Can we learn more from data than existed in the generating process itself? Can new and useful information be constructed merely by applying deterministic transformations to existing data? Can the learnable content in data be evaluated without considering a downstream task? On these questions, Shannon information and Kolmogorov complexity come up nearly empty-handed, in part because they assume observers with unlimited computational capacity and fail to target the useful information content. In this work, we identify and exemplify three apparent paradoxes in information theory: (1) information cannot be increased by deterministic transformations; (2) information is independent of the order of data; (3) likelihood modeling is merely distribution matching. To shed light on the tension between these results and modern practice, and to quantify the value of data, we introduce epiplexity, a formalization of information capturing what computationally bounded observers can learn from data. Epiplexity captures the structural content in data while excluding time-bounded entropy, the random, unpredictable content exemplified by pseudorandom number generators and chaotic dynamical systems. With these concepts, we demonstrate how information can be created with computation, how it depends on the ordering of the data, and how likelihood modeling can produce more complex programs than are present in the data-generating process itself. We also present practical procedures to estimate epiplexity, which we show capture differences across data sources, track with downstream performance, and highlight dataset interventions that improve out-of-distribution generalization. In contrast to principles of model selection, epiplexity provides a theoretical foundation for data selection, guiding how to select, generate, or transform data for learning systems.
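To make the bounded-observer idea concrete, the minimal Python sketch below (illustrative only, not the paper's epiplexity estimator) uses zlib compression as a crude proxy for a computationally bounded observer. A pseudorandom stream expanded deterministically from a short seed has tiny Kolmogorov complexity, yet the bounded compressor finds it as incompressible as true OS randomness, while a simply patterned stream exposes structure the observer can exploit. All names and the choice of compressor are assumptions made for illustration.

```python
# Illustrative sketch (not the paper's estimator): zlib stands in for a
# computationally bounded observer. The PRNG stream is fully determined by a
# short seed, so its Kolmogorov complexity is tiny, but the bounded compressor
# cannot exploit that and sees it as "random" (time-bounded entropy). The
# patterned stream, by contrast, exposes structure the observer can use.
import os
import random
import zlib

N = 1 << 16  # 64 KiB per stream

true_random = os.urandom(N)                           # fresh OS randomness
rng = random.Random(1234)                             # deterministic expansion of a short seed
prng_stream = bytes(rng.getrandbits(8) for _ in range(N))
patterned = bytes((i * 37) % 256 for i in range(N))   # simple learnable structure

for name, data in [("true random", true_random),
                   ("PRNG from short seed", prng_stream),
                   ("patterned", patterned)]:
    ratio = len(zlib.compress(data, 9)) / len(data)
    print(f"{name:22s} compressed/original = {ratio:.3f}")
```

On a typical run the first two ratios sit near 1.0 while the patterned stream compresses to a small fraction of its original size: to this bounded observer, the seed-generated stream carries as much apparent entropy as genuine randomness, even though an unbounded observer would assign it almost no information.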
Similar Papers
Information Physics of Intelligence: Unifying Logical Depth and Entropy under Thermodynamic Constraints
Information Theory
Makes AI smarter while using less energy.