Neural Compress-and-Forward for the Primitive Diamond Relay Channel

Published: December 8, 2025 | arXiv ID: 2512.07662v1

By: Ozan Aygün, Ezgi Ozyilkan, Elza Erkip

Potential Business Impact:

Helps radios send messages farther with less power.

Business Areas:
Wireless Hardware, Mobile

The diamond relay channel, where a source communicates with a destination via two parallel relays, is one of the canonical models for cooperative communications. We focus on the primitive variant, where each relay observes a noisy version of the source signal and forwards a compressed description over an orthogonal, noiseless, finite-rate link to the destination. Compress-and-forward (CF) is particularly effective in this setting, especially under oblivious relaying where relays lack access to the source codebook. While neural CF methods have been studied in single-relay channels, extending them to the two-relay case is non-trivial, as it requires fully distributed compression without any inter-relay coordination. We demonstrate that learning-based quantizers at the relays can harness input correlations by operating remotely, yet collaboratively, enabling effective distributed compression in line with Berger-Tung-style coding. Each relay separately compresses its observation using a one-shot learned quantizer, and the destination jointly decodes the source message. Simulation results show that the proposed scheme, trained end-to-end with finite-order modulation, operates close to the known theoretical bounds. These results demonstrate that neural CF can scale to multi-relay systems while maintaining both performance and interpretability.

Country of Origin
🇺🇸 United States

Page Count
5 pages

Category
Computer Science:
Information Theory