Soft Guessing Under Logarithmic Loss Allowing Errors and Variable-Length Source Coding
By: Shota Saito, Hamdi Joudeh
Potential Business Impact:
Helps guess secrets faster with fewer mistakes.
This paper considers the problem of soft guessing under a logarithmic loss distortion measure while allowing errors. We find an optimal guessing strategy, and derive single-shot upper and lower bounds for the minimal guessing moments as well as an asymptotic expansion for i.i.d. sources. These results are extended to the case where side information is available to the guesser. Furthermore, a connection between soft guessing allowing errors and variable-length lossy source coding under logarithmic loss is demonstrated. The Rényi entropy, the smooth Rényi entropy, and their conditional versions play an important role.
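The link between guessing moments and Rényi entropy that the abstract alludes to can be illustrated with Arikan's classical (hard-guessing, error-free) bound, which this paper generalizes. The sketch below is a standard textbook computation, not this paper's soft-guessing variant: the optimal guesser queries symbols in decreasing probability order, and the ρ-th guessing moment is sandwiched by exp(ρ·H_α(P)) with α = 1/(1+ρ), up to a (1 + ln M)^ρ factor.

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(P) = log(sum_x p(x)^alpha) / (1 - alpha), in nats."""
    assert alpha > 0 and alpha != 1
    return math.log(sum(px ** alpha for px in p)) / (1 - alpha)

def optimal_guessing_moment(p, rho):
    """E[G(X)^rho] for the optimal guesser, which asks symbols in
    decreasing probability order, so G(x) is the rank of x in that order."""
    ranked = sorted(p, reverse=True)
    return sum(px * (i + 1) ** rho for i, px in enumerate(ranked))

# Toy distribution (illustrative choice, not from the paper).
p = [0.5, 0.25, 0.125, 0.125]
rho = 1.0

# Arikan-style relation: (1/rho) * log E[G^rho] <= H_{1/(1+rho)}(P),
# with the gap at most log(1 + log M) for an alphabet of size M.
lhs = math.log(optimal_guessing_moment(p, rho)) / rho
rhs = renyi_entropy(p, 1 / (1 + rho))
print(f"(1/rho) log E[G^rho] = {lhs:.4f}, H_(1/(1+rho)) = {rhs:.4f}")
```

For this distribution the optimal first-moment is E[G] = 0.5·1 + 0.25·2 + 0.125·3 + 0.125·4 = 1.875, and its log sits below the order-1/2 Rényi entropy within the guaranteed log(1 + ln 4) slack.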
Similar Papers
Exponential Error Bounds for Information Bottleneck Source Coding Problems
Information Theory
Helps send messages with fewer errors.
Rate-Distortion Limits for Multimodal Retrieval: Theory, Optimal Codes, and Finite-Sample Guarantees
Information Theory
Finds best matching info from different sources.
Discrete Layered Entropy, Conditional Compression and a Tighter Strong Functional Representation Lemma
Information Theory
Makes math problems about information easier.