Conserved active information
By: Yanchen Chen, Daniel Andrés Díaz-Pachón
We introduce conserved active information $I^\oplus$, a symmetric extension of active information that quantifies net information gain or loss across the entire search space, respecting No-Free-Lunch conservation. Through Bernoulli and uniform-baseline examples, we show that $I^\oplus$ reveals regimes hidden from KL divergence, such as when strong knowledge reduces global disorder. These regimes are proven formally under a uniform baseline, distinguishing disorder-increasing mild knowledge from order-imposing strong knowledge. We further illustrate them with examples from Markov chains and cosmological fine-tuning. This resolves a longstanding critique of active information while enabling applications in search, optimization, and beyond.
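To make the contrast with KL divergence concrete, the sketch below compares KL divergence against a signed, space-wide aggregate of per-outcome active information on a Bernoulli example with uniform baseline. The aggregate used here (an unweighted sum of per-outcome log-ratios) is only an illustrative assumption, not the paper's definition of $I^\oplus$; it merely shows how a symmetric quantity can go negative (net order) on the whole space while KL divergence stays nonnegative.

```python
import numpy as np

# Hypothetical sketch, NOT the paper's definition of I^\oplus:
# a signed aggregate of per-outcome log-ratios, contrasted with
# the always-nonnegative KL divergence.

def active_info(q, p):
    """Per-outcome active information, log2 q(x)/p(x), in bits."""
    return np.log2(q / p)

def net_active_info(q, p):
    """Illustrative signed aggregate over the entire search space.
    Can be negative, unlike KL divergence (assumed form)."""
    return active_info(q, p).sum()

def kl_divergence(q, p):
    """KL divergence D(q || p) in bits; always >= 0."""
    return np.sum(q * np.log2(q / p))

# Bernoulli example with uniform baseline p = (1/2, 1/2).
p = np.array([0.5, 0.5])
for s in (0.5, 0.7, 0.95):  # increasingly "strong" knowledge about one outcome
    q = np.array([s, 1 - s])
    print(f"q = {s:.2f}: KL = {kl_divergence(q, p):+.3f} bits, "
          f"net = {net_active_info(q, p):+.3f} bits")
```

Running this, KL divergence grows from 0 as the distribution concentrates, while the signed aggregate moves increasingly negative, the kind of regime (concentration read as net global information loss, i.e., imposed order) that a nonnegative divergence alone cannot distinguish.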