Information Physics of Intelligence: Unifying Logical Depth and Entropy under Thermodynamic Constraints
By: Jianfeng Xu, Zeyan Li
Potential Business Impact:
Makes AI smarter and more energy-efficient.
The rapid scaling of artificial intelligence models has revealed a fundamental tension between model capacity (storage) and inference efficiency (computation). While classical information theory focuses on transmission and storage limits, it lacks a unified physical framework to quantify the thermodynamic costs of generating information from compressed laws versus retrieving it from memory. In this paper, we propose a theoretical framework that treats information processing as an enabling mapping from ontological states to carrier states. We introduce a novel metric, Derivation Entropy, which quantifies the effective work required to compute a target state from a given logical depth. By analyzing the interplay between Shannon entropy (storage) and computational complexity (time/energy), we demonstrate the existence of a critical phase transition point. Below this threshold, memory retrieval is thermodynamically favorable; above it, generative computation becomes the optimal strategy. This "Energy-Time-Space" conservation law provides a physical explanation for the efficiency of generative models and offers a rigorous mathematical bound for designing next-generation, energy-efficient AI architectures. Our findings suggest that the minimization of Derivation Entropy is a governing principle for the evolution of both biological and artificial intelligence.
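To make the claimed phase transition concrete, the following is a minimal numerical sketch of the retrieval-versus-generation crossover. The abstract does not give the formal definition of Derivation Entropy, so this toy model assumes Landauer-style energy costs and a sublinear scaling of logical depth with state size; the parameters overhead_bits, depth_exponent, and bits_per_step are hypothetical illustrations, not values from the paper.

```python
# Toy model of the memory-retrieval vs. generative-computation crossover.
# Assumptions (not from the paper): Landauer-bound energy per bit, retrieval
# cost proportional to the state's Shannon entropy, generation cost equal to a
# fixed overhead for loading the compressed law plus per-step dissipation over
# a logical depth that grows sublinearly with state size.

import math

K_B = 1.380649e-23                 # Boltzmann constant, J/K
T = 300.0                          # assumed operating temperature, K
LANDAUER = K_B * T * math.log(2)   # minimum energy to erase one bit, J


def retrieval_cost(shannon_bits: float) -> float:
    """Toy cost of reading a state stored explicitly: scales with its Shannon entropy."""
    return LANDAUER * shannon_bits


def generation_cost(shannon_bits: float,
                    overhead_bits: float = 1e5,
                    depth_exponent: float = 0.5,
                    bits_per_step: float = 8.0) -> float:
    """Toy cost of regenerating the state from a compressed law: a fixed overhead
    plus per-step dissipation over an assumed sublinear logical depth."""
    logical_depth = shannon_bits ** depth_exponent
    return LANDAUER * (overhead_bits + bits_per_step * logical_depth)


if __name__ == "__main__":
    # Sweep state sizes and report which strategy is thermodynamically favorable.
    for exp in range(3, 10):
        h = 10.0 ** exp
        r, g = retrieval_cost(h), generation_cost(h)
        strategy = "retrieve" if r <= g else "generate"
        print(f"H = 1e{exp} bits   retrieve = {r:.3e} J   "
              f"generate = {g:.3e} J   -> {strategy}")
```

Under these assumed cost curves, small states are cheaper to retrieve from memory while large, compressible states are cheaper to regenerate from their generating law, mirroring the threshold behavior described in the abstract.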
Similar Papers
A thermoinformational formulation for the description of neuropsychological systems
Neurons and Cognition
Measures how systems change and learn.
Information Efficiency of Scientific Automation
Information Theory
Makes scientific tools learn faster with less energy.