Information Physics of Intelligence: Unifying Logical Depth and Entropy under Thermodynamic Constraints

Published: November 24, 2025 | arXiv ID: 2511.19156v2

By: Jianfeng Xu, Zeyan Li

Potential Business Impact:

Makes AI smarter and more energy-efficient.

Business Areas:
Intelligent Systems, Artificial Intelligence, Data and Analytics, Science and Engineering

The rapid scaling of artificial intelligence models has revealed a fundamental tension between model capacity (storage) and inference efficiency (computation). While classical information theory focuses on transmission and storage limits, it lacks a unified physical framework to quantify the thermodynamic costs of generating information from compressed laws versus retrieving it from memory. In this paper, we propose a theoretical framework that treats information processing as an enabling mapping from ontological states to carrier states. We introduce a novel metric, Derivation Entropy, which quantifies the effective work required to compute a target state from a given logical depth. By analyzing the interplay between Shannon entropy (storage) and computational complexity (time/energy), we demonstrate the existence of a critical phase transition point. Below this threshold, memory retrieval is thermodynamically favorable; above it, generative computation becomes the optimal strategy. This "Energy-Time-Space" conservation law provides a physical explanation for the efficiency of generative models and offers a rigorous mathematical bound for designing next-generation, energy-efficient AI architectures. Our findings suggest that the minimization of Derivation Entropy is a governing principle for the evolution of both biological and artificial intelligence.
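To make the retrieval-versus-generation trade-off described in the abstract concrete, the toy sketch below compares an assumed per-bit storage cost (Landauer's bound) against an assumed per-step computation cost, and picks whichever strategy is cheaper. The cost models, function names, and parameters are illustrative assumptions, not the paper's actual Derivation Entropy formulation.

```python
# Illustrative sketch only: a toy comparison of memory retrieval vs. generative
# computation, loosely following the trade-off in the abstract. The cost models
# (k_B*T*ln2 per stored bit, a fixed energy per derivation step) are assumptions
# for illustration, not the authors' definitions.
import math

K_B = 1.380649e-23                     # Boltzmann constant, J/K
T = 300.0                              # operating temperature, K
LANDAUER_BIT = K_B * T * math.log(2)   # minimum energy to (re)write one bit, J


def retrieval_cost(shannon_bits: float) -> float:
    """Assumed cost of storing/retrieving a state with the given Shannon entropy (J)."""
    return shannon_bits * LANDAUER_BIT


def derivation_cost(logical_depth_steps: float,
                    joules_per_step: float = LANDAUER_BIT) -> float:
    """Assumed cost of recomputing the state from a compressed law (J)."""
    return logical_depth_steps * joules_per_step


def preferred_strategy(shannon_bits: float, logical_depth_steps: float) -> str:
    """Pick the thermodynamically cheaper strategy under this toy model."""
    if retrieval_cost(shannon_bits) <= derivation_cost(logical_depth_steps):
        return "retrieve"
    return "generate"


if __name__ == "__main__":
    # High-entropy state that is cheap to recompute: generation wins.
    print(preferred_strategy(shannon_bits=1e6, logical_depth_steps=1e4))   # -> "generate"
    # Low-entropy state with very deep derivation: retrieval wins.
    print(preferred_strategy(shannon_bits=100, logical_depth_steps=1e7))   # -> "retrieve"
```

Under this toy model the crossover where the two costs are equal plays the role of the critical threshold the abstract describes; the paper's "Energy-Time-Space" conservation law presumably makes this boundary precise.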

Country of Origin
🇨🇳 China

Page Count
17 pages

Category
Computer Science:
Information Theory