Refinements and Generalizations of the Shannon Lower Bound via Extensions of the Kraft Inequality
By: Neri Merhav
We derive several extended versions of the Kraft inequality for lossy compression, which pave the way to refinements and generalizations of the well-known Shannon lower bound in a variety of instances of rate-distortion coding. These include sharper bounds for one-to-one codes and for $D$-semifaithful codes, a Shannon lower bound for distortion measures based on sliding-window functions, and an individual-sequence counterpart of the Shannon lower bound.
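For context, a minimal sketch of the two classical statements the paper builds on; the notation here ($\ell$ for codeword lengths, $h$ for differential entropy, $\rho$ for a difference distortion function) is standard but assumed, not quoted from the paper. The classical Kraft inequality for a uniquely decodable binary code with length function $\ell(\cdot)$ on a finite alphabet $\mathcal{X}$ reads

$$\sum_{x\in\mathcal{X}} 2^{-\ell(x)} \le 1,$$

and the classical Shannon lower bound for a difference distortion measure $d(x,\hat{x})=\rho(x-\hat{x})$ states

$$R(D) \;\ge\; h(X) - \max_{Z:\;\mathbb{E}[\rho(Z)]\le D} h(Z),$$

where $h(\cdot)$ denotes differential entropy. The paper's extensions refine statements of this type in the settings listed above.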