Error Exponents for Randomised List Decoding
By: Henrique K. Miyamoto, Sheng Yang
This paper studies random-coding error exponents of randomised list decoding, in which the decoder randomly selects $L$ messages with probabilities proportional to the decoding metrics of the codewords. The exponents (or bounds) are given for mismatched decoding, and then particularised to matched and universal decoding metrics. Two regimes are studied: for fixed list size, we derive an ensemble-tight random-coding error exponent and show that, for the matched metric, it does not improve on the error exponent of ordinary decoding. For list sizes growing exponentially with the block length, we provide a non-trivial lower bound on the error exponent that is tight at high rates under the matched metric.
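The decoding rule described above can be illustrated with a minimal sketch. The code below assumes the decoding metrics $q(y \mid x_m)$ have already been computed for each candidate message $m$; it then draws $L$ distinct message indices with selection probabilities proportional to those metrics (here via successive sampling without replacement, one plausible reading of the randomised rule). The function name and interface are illustrative, not from the paper.

```python
import numpy as np

def randomised_list_decode(metrics, L, rng=None):
    """Sample a list of L distinct message indices, with each draw
    made with probability proportional to the decoding metric of the
    corresponding codeword (successive sampling without replacement).

    metrics : sequence of non-negative decoding metrics q(y|x_m),
              one per candidate message m (hypothetical interface).
    L       : list size.
    """
    rng = np.random.default_rng() if rng is None else rng
    p = np.asarray(metrics, dtype=float)
    p = p / p.sum()  # normalise metrics into a probability vector
    # rng.choice with replace=False yields L distinct indices
    return rng.choice(len(p), size=L, replace=False, p=p)
```

For example, with metrics `[0.5, 0.2, 0.2, 0.1]` and `L = 2`, the decoder outputs a random two-element list in which the first message is the most likely to appear; a list decoding error occurs when the transmitted message is not among the $L$ selected indices.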
Similar Papers
Achievable Rates and Error Exponents for a Class of Mismatched Compound Channels
Information Theory
Establishes achievable rates and error exponents for a class of compound channels under mismatched decoding.
Exponential Error Bounds for Information Bottleneck Source Coding Problems
Information Theory
Derives exponential error bounds for information bottleneck source coding problems.
On the List-Decodability of Random (Linear) Sum-Rank Metric Codes
Information Theory
Studies the list-decodability of random (linear) codes in the sum-rank metric.