Achievable Rates and Error Exponents for a Class of Mismatched Compound Channels

Published: May 26, 2025 | arXiv ID: 2505.20523v1

By: Priyanka Patel, Francesc Molina, Albert Guillén i Fàbregas

Potential Business Impact:

Improves how receivers decode messages when their model of the communication channel is imperfect, quantifying the rate and reliability lost to channel estimation errors.

Business Areas:
Telecommunications Hardware

This paper investigates achievable information rates and error exponents of mismatched decoding when the channel belongs to the class of channels that are close to the decoding metric in terms of relative entropy. For both discrete- and continuous-alphabet channels, we derive approximations of the worst-case achievable information rates and error exponents as a function of the radius of a small relative entropy ball centered at the decoding metric, allowing the characterization of the loss incurred due to imperfect channel estimation. We provide a number of examples including symmetric metrics and modulo-additive noise metrics for discrete systems, and nearest neighbor decoding for continuous-alphabet channels, where we derive the approximation when the channel admits arbitrary statistics and when it is assumed to have additive noise with unknown finite second-order moment.
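The sketch below is not from the paper; it is a minimal numerical illustration of the objects the abstract mentions, under assumed toy settings (a binary symmetric metric channel, uniform inputs, a maximum-likelihood decoding metric matched to that channel). It computes the generalized mutual information (GMI), a standard achievable rate for mismatched decoding, and then does a crude Monte Carlo search for the worst-case GMI over channels within a small conditional relative entropy ball around the metric. The paper derives analytical approximations of such worst-case quantities; this code only illustrates the setup numerically, and the function names and sampling scheme are hypothetical choices.

```python
import numpy as np

def gmi(Px, W, q, s_grid=np.linspace(0.01, 5.0, 200)):
    """Generalized mutual information (nats) of a decoder using metric q(x, y)
    on a DMC W(y|x) with input distribution Px; maximized over s > 0 by grid search."""
    best = 0.0
    for s in s_grid:
        num = q ** s                              # q(x, y)^s
        den = (Px[:, None] * num).sum(axis=0)     # sum_x' Px(x') q(x', y)^s
        with np.errstate(divide="ignore", invalid="ignore"):
            log_ratio = np.log(num / den[None, :])
        best = max(best, float(np.nansum(Px[:, None] * W * log_ratio)))
    return best

def kl_cond(W, Q, Px):
    """Conditional relative entropy D(W || Q | Px) in nats."""
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = W * np.log(W / Q)
    return float(np.nansum(Px[:, None] * terms))

def worst_case_gmi(Px, Q, q, radius, trials=2000, seed=0):
    """Crude Monte Carlo estimate of the worst-case GMI over channels W
    with D(W || Q | Px) <= radius (random Dirichlet perturbations of Q)."""
    rng = np.random.default_rng(seed)
    worst = gmi(Px, Q, q)          # the metric channel itself lies in the ball
    for _ in range(trials):
        W = np.array([rng.dirichlet(50.0 * row + 1e-3) for row in Q])
        if kl_cond(W, Q, Px) <= radius:
            worst = min(worst, gmi(Px, W, q))
    return worst

if __name__ == "__main__":
    # Toy example: binary symmetric metric channel, crossover 0.1,
    # decoder matched to it, uniform inputs.
    Q = np.array([[0.9, 0.1],
                  [0.1, 0.9]])
    q = Q.copy()
    Px = np.array([0.5, 0.5])
    print("Rate against the metric channel:", gmi(Px, Q, q))
    for r in (0.001, 0.01, 0.05):
        print(f"Worst-case rate estimate, KL radius {r}:",
              worst_case_gmi(Px, Q, q, r))
```

As the relative entropy radius grows, the estimated worst-case rate decreases from the matched value, which is the kind of loss the paper characterizes analytically for small radii.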

Country of Origin
🇬🇧 United Kingdom

Page Count
17 pages

Category
Computer Science:
Information Theory