Achievable Rates and Error Exponents for a Class of Mismatched Compound Channels
By: Priyanka Patel, Francesc Molina, Albert Guillén i Fàbregas
Potential Business Impact:
Helps receivers decode messages reliably even when their knowledge of the channel is imperfect.
This paper investigates achievable information rates and error exponents of mismatched decoding when the true channel belongs to the class of channels that are close to the decoding metric in terms of relative entropy. For both discrete- and continuous-alphabet channels, we derive approximations of the worst-case achievable information rates and error exponents as a function of the radius of a small relative entropy ball centered at the decoding metric, allowing us to characterize the loss incurred due to imperfect channel estimation. We provide several examples, including symmetric metrics and modulo-additive noise metrics for discrete systems, as well as nearest-neighbor decoding for continuous-alphabet channels, where we derive the approximation both when the channel admits arbitrary statistics and when the noise is assumed additive with unknown finite second-order moment.
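The setup can be illustrated numerically. Below is a minimal sketch, not taken from the paper: it assumes a toy binary symmetric channel (BSC), uses the generalized mutual information (GMI) as the achievable-rate proxy, and restricts the relative-entropy ball search to the one-parameter BSC family; the function names, parameter values, and grid searches are illustrative choices, not the paper's construction.

```python
# Illustrative sketch (not the paper's method): worst-case GMI over a small
# relative-entropy ball centered at the decoding metric, for a toy BSC.
# All names and parameter values here are hypothetical.
import numpy as np

def bsc(d):
    """Conditional pmf W(y|x) of a BSC with crossover probability d."""
    return np.array([[1 - d, d], [d, 1 - d]])

def gmi(W, q, P, s_grid=np.linspace(0.01, 5.0, 500)):
    """GMI (nats) of true channel W under decoding metric q and input
    pmf P, maximized over the parameter s by a simple grid search."""
    best = -np.inf
    for s in s_grid:
        qs = q ** s
        denom = P @ qs  # sum_x' P(x') q(y|x')^s, one value per output y
        val = np.sum(P[:, None] * W * (np.log(qs) - np.log(denom)[None, :]))
        best = max(best, val)
    return best

def kl_cond(W, q, P):
    """Conditional relative entropy D(W || q | P) in nats."""
    return np.sum(P[:, None] * W * np.log(W / q))

P = np.array([0.5, 0.5])   # uniform binary input
q = bsc(0.11)              # decoding metric: the *estimated* channel
r = 1e-3                   # radius of the relative-entropy ball (nats)

# Scan candidate true channels (BSCs only, a deliberate simplification)
# and keep the worst GMI among those inside the ball around the metric.
worst = min(gmi(bsc(d), q, P)
            for d in np.linspace(0.05, 0.20, 301)
            if kl_cond(bsc(d), q, P) <= r)

print(f"GMI when channel = metric : {gmi(q, q, P):.4f} nats")
print(f"worst-case GMI, r = {r}   : {worst:.4f} nats")
```

As r shrinks to zero the worst-case GMI returns to the matched rate, which mirrors the qualitative loss characterization described in the abstract.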
Similar Papers
Achievable Rates and Error Probability Bounds of Frequency-based Channels of Unlimited Input Resolution
Information Theory
Stores more information in tiny DNA strands.
Optimal and Suboptimal Decoders under Finite-Alphabet Interference: A Mismatched Decoding Perspective
Information Theory
Improves wireless signals by better handling interference.
On the Error Exponent Distribution of Code Ensembles over Classical-Quantum Channels
Information Theory
Shows how reliably codes perform over quantum communication channels.