Forward-Forward Autoencoder Architectures for Energy-Efficient Wireless Communications

Published: October 13, 2025 | arXiv ID: 2510.11418v1

By: Daniel Seifert, Onur Günlü, Rafael F. Schaefer

Potential Business Impact:

Makes neural network training faster and less power-hungry.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

The application of deep learning to communications systems has been a growing field of interest in recent years. Forward-forward (FF) learning is an efficient alternative to backpropagation (BP), the standard training procedure for neural networks. Among its several advantages, FF learning does not require the communication channel to be differentiable and does not rely on the global availability of partial derivatives, allowing for an energy-efficient implementation. In this work, we design end-to-end learned autoencoders using the FF algorithm and numerically evaluate their performance for the additive white Gaussian noise and Rayleigh block fading channels. We demonstrate their competitiveness with BP-trained systems in the case of joint coding and modulation, and in a scenario where a fixed, non-differentiable modulation stage is applied. Moreover, we provide further insights into the design principles of the FF network, its training convergence behavior, and significant memory and processing time savings compared to BP-based approaches.
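To make the core idea concrete, here is a minimal sketch of a single forward-forward trained layer, not the paper's architecture. It assumes Hinton-style FF training: each layer has a local "goodness" (sum of squared ReLU activations) pushed above a threshold for positive data and below it for negative data, with no gradients propagated across layers or through a channel. All names (`FFLayer`, `theta`, the toy data) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class FFLayer:
    """One layer trained with a purely local forward-forward objective."""

    def __init__(self, n_in, n_out, theta=2.0, lr=0.03):
        self.W = rng.normal(0.0, 0.1, (n_out, n_in))
        self.b = np.zeros(n_out)
        self.theta = theta  # goodness threshold
        self.lr = lr

    def forward(self, x):
        z = self.W @ x + self.b
        return np.maximum(z, 0.0), z  # ReLU activations, pre-activations

    def goodness(self, x):
        h, _ = self.forward(x)
        return float(np.sum(h ** 2))

    def train_step(self, x, positive):
        # Local loss: softplus(theta - g) for positive data (raise goodness),
        # softplus(g - theta) for negative data (lower it). The gradient is
        # computed by hand within the layer -- no backprop across layers.
        h, z = self.forward(x)
        g = np.sum(h ** 2)
        sign = -1.0 if positive else 1.0
        dg = sign / (1.0 + np.exp(-sign * (g - self.theta)))  # dL/dg
        dz = dg * 2.0 * h * (z > 0)  # chain rule through goodness and ReLU
        self.W -= self.lr * np.outer(dz, x)
        self.b -= self.lr * dz

layer = FFLayer(n_in=8, n_out=16)
pos = rng.normal(1.0, 0.2, (200, 8))   # toy "positive" samples
neg = rng.normal(-1.0, 0.2, (200, 8))  # toy "negative" samples
for xp, xn in zip(pos, neg):
    layer.train_step(xp, positive=True)
    layer.train_step(xn, positive=False)

g_pos = np.mean([layer.goodness(x) for x in pos])
g_neg = np.mean([layer.goodness(x) for x in neg])
print(g_pos, g_neg)
```

Because each layer optimizes only its own goodness, nothing downstream of a layer needs to be differentiable, which is why this style of training can accommodate a fixed, non-differentiable modulation stage or channel.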

Country of Origin
🇩🇪 Germany

Page Count
6 pages

Category
Computer Science:
Information Theory