Generalization Bounds for Transformer Channel Decoders

Published: January 11, 2026 | arXiv ID: 2601.06969v1

By: Qinshan Zhang, Bin Chen, Yong Jiang, and others

Potential Business Impact:

Could make wireless links more reliable by reducing decoding errors.

Business Areas:
Telecommunications Hardware

Transformer channel decoders, such as the Error Correction Code Transformer (ECCT), have shown strong empirical performance in channel decoding, yet their generalization behavior remains theoretically unclear. This paper studies the generalization performance of ECCT from a learning-theoretic perspective. By establishing a connection between multiplicative noise estimation errors and the bit error rate (BER), we derive an upper bound on the generalization gap via bit-wise Rademacher complexity. The resulting bound characterizes the dependence on code length, model parameters, and training set size, and applies to both single-layer and multi-layer ECCTs. We further show that parity-check-based masked attention induces sparsity that reduces the covering number, leading to a tighter generalization bound. To the best of our knowledge, this work provides the first theoretical generalization guarantees for this class of decoders.
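Two claims in the abstract can be made more concrete. First, bounds obtained via Rademacher complexity usually follow the standard template: with probability at least 1 - delta, the generalization gap is at most 2*R_m(F) + sqrt(log(1/delta) / (2m)) for a [0, 1]-bounded loss and m training samples; the work described here would then bound the bit-wise complexity term R_m(F) in terms of code length and model parameters (this is the generic template, not the paper's exact statement). Second, the parity-check-based masked attention can be sketched in code: the mask is derived from the parity-check matrix H so that attention is permitted only along Tanner-graph connections, which is the sparsity that shrinks the covering number. The sketch below is a hypothetical, simplified construction for illustration, assuming an ECCT-style sequence of n bit positions followed by n - k syndrome positions; it is not the authors' implementation.

```python
import numpy as np

def build_pc_attention_mask(H: np.ndarray) -> np.ndarray:
    """Build a boolean self-attention mask from a binary parity-check
    matrix H of shape (n - k, n). Simplified, ECCT-style sketch
    (hypothetical helper, not the paper's code).

    The sequence holds n bit positions followed by n - k syndrome
    (check) positions, so its length is 2n - k. Attention is allowed
    only where the Tanner graph of H connects the corresponding nodes.
    """
    m, n = H.shape                   # m = n - k parity checks, n code bits
    size = n + m
    mask = np.eye(size, dtype=bool)  # every position attends to itself

    # bit i <-> check j are connected iff H[j, i] == 1
    bit_check = H.T.astype(bool)     # shape (n, m)
    mask[:n, n:] |= bit_check
    mask[n:, :n] |= bit_check.T

    # bit i <-> bit i' are connected iff they share at least one check
    bit_bit = (H.T @ H) > 0          # shape (n, n)
    mask[:n, :n] |= bit_bit

    return mask  # True = attention allowed, False = masked out

# Example: parity-check matrix of the Hamming(7, 4) code
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
mask = build_pc_attention_mask(H)
print(mask.shape, mask.sum(), "of", mask.size, "entries unmasked")
```

Even in this tiny example a fraction of the 10 x 10 attention entries is masked out, and for longer, sparser codes the effect grows; that structural sparsity is what the covering-number argument in the paper exploits.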

Country of Origin
🇨🇳 China

Page Count
18 pages

Category
Computer Science: Information Theory