Similarity-Sensitive Entropy: Induced Kernels and Data-Processing Inequalities
By: Joseph Samuel Miller
Potential Business Impact:
Measures how much information is preserved when a system is simplified.
We study an entropy functional $H_K$ that is sensitive to a prescribed similarity structure on a state space. For finite spaces, $H_K$ coincides with the order-1 similarity-sensitive entropy of Leinster and Cobbold. We work in the general measure-theoretic setting of kernelled probability spaces $(\Omega,\mu,K)$ introduced by Leinster and Roff, and develop basic structural properties of $H_K$. Our main results concern the behavior of $H_K$ under coarse-graining. For a measurable map $f:\Omega\to Y$ and input law $\mu$, we define a law-induced kernel on $Y$ whose pullback minimally dominates $K$, and show that it yields a coarse-graining inequality and a data-processing inequality for $H_K$, for both deterministic maps and general Markov kernels. We also introduce conditional similarity-sensitive entropy and an associated mutual information, and compare their behavior to the classical Shannon case.
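On a finite space the abstract's objects can be made concrete. A minimal numerical sketch follows, using the order-1 Leinster-Cobbold formula $H_K(p) = -\sum_i p_i \log\,(Kp)_i$ that the abstract says $H_K$ specializes to; the function names (`H_K`) and the sup-over-fibers kernel used as a stand-in for the paper's law-induced kernel are illustrative assumptions, not the paper's definitions (the paper's construction is measure-theoretic and may use essential suprema with respect to $\mu$).

```python
import numpy as np

def H_K(p, K):
    """Order-1 similarity-sensitive entropy on a finite space:
    H_K(p) = -sum_i p_i * log((K p)_i), where (K p)_i = sum_j K[i,j] p[j]
    is the expected similarity of point i to a mu-random point."""
    p = np.asarray(p, dtype=float)
    Kp = K @ p
    mask = p > 0                    # convention: 0 * log(...) = 0
    return -np.sum(p[mask] * np.log(Kp[mask]))

# Sanity check: with K = identity (no two distinct points are similar),
# H_K reduces to ordinary Shannon entropy.
p = np.array([0.5, 0.25, 0.125, 0.125])
print(H_K(p, np.eye(4)))                 # ~1.2130
print(-np.sum(p * np.log(p)))            # same value

# Coarse-graining by a deterministic map f: {0,1,2,3} -> {0,1}.
# As a stand-in for the paper's law-induced kernel, take the pointwise-
# minimal kernel on Y whose pullback dominates K: the sup of K over fibers.
f = np.array([0, 0, 1, 1])               # f merges {0,1} and {2,3}
K = np.array([[1.0, 0.8, 0.2, 0.1],
              [0.8, 1.0, 0.1, 0.2],
              [0.2, 0.1, 1.0, 0.7],
              [0.1, 0.2, 0.7, 1.0]])

ny = f.max() + 1
q = np.array([p[f == y].sum() for y in range(ny)])    # pushforward law f_* mu
K_Y = np.array([[K[np.ix_(f == y, f == yp)].max()     # sup of K over the fibers
                 for yp in range(ny)] for y in range(ny)])

# Coarse-graining inequality: entropy can only drop under coarse-graining.
print(H_K(q, K_Y), "<=", H_K(p, K))      # ~0.3964 <= ~0.5271
```

The inequality in this toy example is forced by domination alone: since the pullback of $K_Y$ dominates $K$, we get $(K_Y q)_{f(x)} \ge (Kp)_x$ pointwise, and $-\log$ is decreasing, so $H_{K_Y}(f_*\mu) \le H_K(\mu)$; the minimal dominating kernel gives the tightest such bound, which is presumably why the paper singles it out.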
Similar Papers
An Information-Theoretic Route to Isoperimetric Inequalities via Heat Flow and Entropy Dissipation
Differential Geometry
Measures how fast shapes shrink using information.
Structural Properties of Entropic Vectors and Stability of the Ingleton Inequality
Information Theory
Makes information sharing more secure and reliable.
Inequalities Revisited
Information Theory
Finds new math rules by looking at old ones.