Mean dimension and rate-distortion function revisited
By: Rui Yang
Potential Business Impact:
Explains how to measure information in complex systems.
Using tools from local entropy theory, this paper establishes the following main results on mean dimensions and rate-distortion functions: $(1)$ We prove that for non-ergodic measures associated with almost sure processes, the mean R\'enyi information dimension coincides with the information dimension rate. This answers a question posed by Gutman and \'Spiewak (Around the variational principle for metric mean dimension, \emph{Studia Math.} \textbf{261} (2021), 345--360). $(2)$ We introduce four types of rate-distortion entropies and establish their relation to the Kolmogorov-Sinai entropy. $(3)$ We show that for systems with the marker property and finite mean dimension, the supremum in Lindenstrauss-Tsukamoto's double variational principle can be taken over the set of ergodic measures. Additionally, the double variational principle holds for various other measure-theoretic $\epsilon$-entropies.
Similar Papers
Rate distortion dimension and ergodic decomposition for $\mathbb{R}^d$-actions
Dynamical Systems
Compresses data while preserving its essential information.
Optimal compressed sensing for mixing stochastic processes
Information Theory
Recovers signals accurately from fewer measurements.
On the Rate-Distortion-Perception Function for Gaussian Processes
Information Theory
Balances compression quality against perceptual realism.