Score: 2

TransCoder: A Neural-Enhancement Framework for Channel Codes

Published: November 27, 2025 | arXiv ID: 2511.22539v1

By: Anastasiia Kurmukova, Selim F. Yilmaz, Emre Ozfatura and more

Potential Business Impact:

Makes wireless messages clearer, even with bad signals.

Business Areas:
Telecommunications Hardware

Reliable communication over noisy channels requires the design of specialized error-correcting codes (ECCs) tailored to specific system requirements. Recently, neural network-based decoders have emerged as promising tools for enhancing ECC reliability, yet their high computational complexity hinders their practical deployment. In this paper, we take a different approach and design a neural transmission scheme that employs the transformer architecture to improve the reliability of existing ECCs. We call this approach TransCoder, alluding to both its function and its architecture. TransCoder operates as a code-adaptive neural enhancement module that can be deployed flexibly at the transmitter, the receiver, or both. The framework employs an iterative decoding procedure in which a neural decoder block processes both the noisy channel observations and the updates from the conventional ECC decoder, using a block attention mechanism for efficiency. Through extensive simulations with various conventional codes (LDPC, BCH, Polar, and Turbo) and across a wide range of channel conditions, we demonstrate that TransCoder significantly improves block error rate (BLER) performance while maintaining computational complexity comparable to traditional decoders. Notably, our approach is particularly effective for longer codes (block length >64) and at lower code rates, scenarios in which existing neural decoders often struggle (despite their formidable computational complexity). The results establish TransCoder as a promising practical solution for reliable communication among resource-constrained wireless devices.
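The abstract describes an iterative loop in which a neural block with block-restricted attention refines the soft outputs of a conventional ECC decoder, using both the raw channel observations and the decoder's updates. Below is a minimal PyTorch sketch of that kind of loop; the class and function names, the LLR initialization, the residual-update form, and the placeholder standing in for a real BP/BCJR decoder are all illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a TransCoder-style iterative refinement loop.
# All names and hyperparameters below are illustrative assumptions.

import torch
import torch.nn as nn


class NeuralDecoderBlock(nn.Module):
    """One refinement step: attends within code blocks and mixes in the
    conventional decoder's current soft estimate (e.g., LLRs)."""

    def __init__(self, block_len: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        # Each codeword position is embedded from (channel observation, decoder update).
        self.embed = nn.Linear(2, d_model)
        # "Block attention": attention restricted to positions within a block,
        # realized here by reshaping the codeword into blocks of length block_len.
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)
        self.out = nn.Linear(d_model, 1)  # refined soft value per position
        self.block_len = block_len

    def forward(self, y: torch.Tensor, llr: torch.Tensor) -> torch.Tensor:
        # y, llr: (batch, n) channel observations and decoder soft outputs.
        b, n = y.shape
        x = self.embed(torch.stack([y, llr], dim=-1))               # (b, n, d)
        x = x.view(b * (n // self.block_len), self.block_len, -1)   # blockwise
        a, _ = self.attn(x, x, x)
        x = self.norm(x + a).view(b, n, -1)
        return llr + self.out(x).squeeze(-1)                        # residual refinement


def transcoder_decode(y, conventional_decoder, blocks, num_iterations=3):
    """Alternate between the conventional ECC decoder and neural refinement."""
    llr = 2.0 * y  # initial LLRs, assuming BPSK over a unit-variance AWGN channel
    for block in blocks[:num_iterations]:
        llr = conventional_decoder(llr)   # e.g., a few BP iterations (stand-in)
        llr = block(y, llr)               # neural correction of the soft output
    return (llr < 0).long()               # hard decisions


if __name__ == "__main__":
    n, batch = 128, 8
    y = torch.randn(batch, n)                        # dummy channel output
    identity_decoder = lambda llr: llr               # placeholder for BP/BCJR
    blocks = nn.ModuleList([NeuralDecoderBlock(block_len=16) for _ in range(3)])
    bits = transcoder_decode(y, identity_decoder, blocks)
    print(bits.shape)  # torch.Size([8, 128])
```

The residual update (returning `llr` plus a learned correction) mirrors the enhancement framing in the abstract: the neural module adjusts, rather than replaces, the conventional decoder's estimates.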

Country of Origin
🇬🇧 United Kingdom, 🇹🇷 Turkey

Repos / Data Links

Page Count
13 pages

Category
Computer Science:
Information Theory