Information-Theoretic Equivalences Across Rate-Distortion, Quantization, and Decoding
By: Bruno Macchiavello
We propose a unified mathematical framework for rate-distortion theory, lattice quantization, and modern error-correcting codes by emphasizing their shared variational and convex-analytic structure. First, we establish a Gibbs-type variational formulation of the rate-distortion function and show that optimal test channels form an exponential family, with the Kullback-Leibler divergence acting as a Bregman divergence. This yields a generalized Pythagorean theorem for information projections and a Legendre duality that couples distortion constraints to inverse-temperature parameters. Second, we extend the reverse water-filling principle to distributed lattice quantization, deriving distortion-allocation bounds across the eigenmodes of conditional covariance matrices. Third, we formalize decoding as inference, showing that belief propagation in LDPC ensembles and the polarization recursion in polar codes can both be read as recursive variational inference procedures. Together, these results unify compression, quantization, and decoding as convex projections of continuous information onto discrete manifolds. Extensions to neural compression and quantum information are sketched as corollaries, illustrating the universality of the framework, and illustrative connections to other scientific fields are also presented. Complementary numerical examples and scripts are provided in the appendix.
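For orientation on the first result: the classical Gibbs-type (Lagrangian-dual) representation of the rate-distortion function, on which the exponential-family and Bregman refinements described above would build, is standard for a discrete memoryless source with distribution p(x) and distortion measure d. It is stated here as a reference point, not reproduced from the paper:

```latex
R(D) \;=\; \max_{\beta \ge 0}\; \min_{q(\hat x)}
\left[\, -\beta D \;-\; \sum_{x} p(x)\,
\log \sum_{\hat x} q(\hat x)\, e^{-\beta\, d(x,\hat x)} \right],
```

with the optimizing test channel forming the exponential family

```latex
q^{*}(\hat x \mid x) \;=\;
\frac{q(\hat x)\, e^{-\beta\, d(x,\hat x)}}
     {\sum_{\hat x'} q(\hat x')\, e^{-\beta\, d(x,\hat x')}},
```

where the Lagrange multiplier \(\beta\) plays the role of an inverse temperature coupled to the distortion constraint by Legendre duality.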
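The exponential-family fixed point above is exactly what the classical Blahut-Arimoto algorithm iterates. The following minimal sketch (the function name `blahut_arimoto` and the binary-Hamming example are illustrative, not taken from the paper's appendix) traces one point on the R(D) curve at a chosen inverse temperature:

```python
import numpy as np

def blahut_arimoto(p_x, d, beta, n_iter=500):
    """Blahut-Arimoto iteration at inverse temperature beta.

    Alternates between the exponential-family test channel
    q(xhat|x) proportional to q(xhat) * exp(-beta * d(x, xhat))
    and its output marginal; converges to a point on the R(D) curve.
    Returns (rate in nats, expected distortion).
    """
    n, m = d.shape
    q_y = np.full(m, 1.0 / m)          # output marginal q(xhat)
    expd = np.exp(-beta * d)           # e^{-beta d(x, xhat)}
    for _ in range(n_iter):
        q_xy = q_y * expd              # unnormalized test channel q(xhat|x)
        q_xy /= q_xy.sum(axis=1, keepdims=True)
        q_y = p_x @ q_xy               # re-estimate the output marginal
    rate = np.sum(p_x[:, None] * q_xy * np.log(q_xy / q_y))
    dist = np.sum(p_x[:, None] * q_xy * d)
    return rate, dist

# Uniform binary source with Hamming distortion, where
# R(D) = H(1/2) - H(D) in closed form serves as a sanity check.
p_x = np.array([0.5, 0.5])
d = 1.0 - np.eye(2)
print(blahut_arimoto(p_x, d, beta=2.0))
```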
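For the second result, the baseline the abstract generalizes is classical reverse water-filling for a single Gaussian vector source under squared-error distortion: each eigenmode of the covariance receives distortion min(theta, lambda_i), with the water level theta set by the total budget. The sketch below implements only this baseline (the function name `reverse_waterfill` and the example eigenvalues are illustrative; the paper's distributed, conditional-covariance extension is not reproduced):

```python
import numpy as np

def reverse_waterfill(eigvals, D, tol=1e-12):
    """Reverse water-filling for a Gaussian vector source,
    squared-error distortion.

    Mode i gets distortion D_i = min(theta, lambda_i); theta is chosen
    by bisection so that sum(D_i) = D. The rate in nats is
    R = sum_i 0.5 * log(lambda_i / D_i); modes with lambda_i <= theta
    are discarded (D_i = lambda_i) and cost no rate.
    """
    eigvals = np.asarray(eigvals, dtype=float)
    assert 0 < D <= eigvals.sum(), "D must lie in (0, sum of eigenvalues]"

    lo, hi = 0.0, eigvals.max()        # bisection on the water level
    while hi - lo > tol:
        theta = 0.5 * (lo + hi)
        if np.minimum(theta, eigvals).sum() < D:
            lo = theta
        else:
            hi = theta
    theta = 0.5 * (lo + hi)

    D_i = np.minimum(theta, eigvals)
    rate = 0.5 * np.sum(np.log(eigvals / D_i))
    return D_i, rate

# Example: three eigenmodes, total distortion budget D = 1.0.
D_i, R = reverse_waterfill([4.0, 1.0, 0.25], D=1.0)
print(D_i, R)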
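For the third result, the recursion that the abstract casts as variational inference is easiest to see on the binary erasure channel, the one case where polarization is exactly computable: one step maps an erasure probability z to the degraded/improved pair (2z - z^2, z^2). A minimal sketch (the name `polarize_bec` and the parameters are illustrative):

```python
import numpy as np

def polarize_bec(eps, n_levels):
    """Recursive channel polarization on the BEC(eps).

    One polarization step maps erasure probability z to the pair
    (2*z - z**2, z**2): the 'minus' channel degrades and the 'plus'
    channel improves. After n_levels steps there are 2**n_levels
    synthetic channels, almost all of which become nearly noiseless
    or nearly useless.
    """
    z = np.array([eps])
    for _ in range(n_levels):
        z = np.concatenate([2 * z - z**2, z**2])
    return z

# After 10 levels (1024 synthetic channels) of BEC(0.5), the fractions
# of near-perfect and near-useless channels each approach the capacity
# 0.5; the undecided remainder shrinks as the recursion deepens.
z = polarize_bec(0.5, 10)
print("good:", np.mean(z < 1e-3), "bad:", np.mean(z > 1 - 1e-3))
```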