Redefining Information Theory: From Quantization and Rate--Distortion to a Foundational Mathematical Framework
By: Bruno Macchiavello
Potential Business Impact:
Makes all math a simple code of 0s and 1s.
This paper redefines information theory as a foundational mathematical discipline, extending beyond its traditional role in engineering applications. Building on Shannon's entropy, rate--distortion theory, and Wyner--Ziv coding, we show that all optimization methods can be interpreted as projections of continuous information onto discrete binary spaces. Numbers are not intrinsic carriers of meaning but codes of information, with binary digits (0 and 1) serving as universal symbols sufficient for all mathematical structures. Rate--distortion optimization via Lagrangian multipliers connects quantization error directly to fundamental limits of representation, while Wyner--Ziv coding admits a path integral interpretation over probability manifolds, unifying quantization, inference, geometry, and error. We further extend this framework into category theory, topological data analysis, and universal coding, situating computation and game theory as complementary perspectives. The result is a set of postulates that elevate information theory to the status of a universal mathematical language.
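The Lagrangian rate--distortion optimization the abstract refers to can be sketched numerically: a quantizer step size is chosen to minimize the combined cost J = D + λR, where D is the quantization (distortion) error and R the rate needed to represent the quantized values. The Gaussian source, the step-size grid, the multiplier value, and the empirical-entropy rate estimate below are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)   # continuous-valued source samples (assumed Gaussian)
lam = 0.05                     # Lagrangian multiplier lambda (assumed trade-off weight)

def cost(step):
    # Uniform scalar quantization: project the continuous samples onto
    # a discrete grid of spacing `step`.
    q = np.round(x / step) * step
    # Distortion D: mean squared quantization error.
    d = np.mean((x - q) ** 2)
    # Rate R: empirical entropy of the quantized symbols, in bits/sample.
    _, counts = np.unique(q, return_counts=True)
    p = counts / counts.sum()
    r = -np.sum(p * np.log2(p))
    # Lagrangian cost J = D + lambda * R.
    return d + lam * r

steps = np.linspace(0.05, 2.0, 40)
best = min(steps, key=cost)    # step size minimizing the Lagrangian cost
```

Sweeping λ traces out the operational rate--distortion curve: small λ favors fine quantization (low distortion, high rate), large λ favors coarse quantization, which is the sense in which quantization error is tied to the fundamental limits of representation.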
Similar Papers
Information-Theoretic Equivalences Across Rate-Distortion, Quantization, and Decoding
Information Theory
Makes data compression and error correction work together.
Foundations of information theory for coding theory
Information Theory
Makes messages travel safely through noisy signals.
Computability of the Optimizer for Rate Distortion Functions
Information Theory
Makes data smaller, but finding the best way is hard.