Recurrent Deep Differentiable Logic Gate Networks

Published: August 8, 2025 | arXiv ID: 2508.06097v1

By: Simon Bührer, Andreas Plesner, Till Aczel, and more

Potential Business Impact:

Lets neural networks learn using simple on/off logic switches, which could eventually run fast and cheaply on hardware such as FPGAs.

While differentiable logic gates have shown promise in feedforward networks, their application to sequential modeling remains unexplored. This paper presents the first implementation of Recurrent Deep Differentiable Logic Gate Networks (RDDLGN), combining Boolean operations with recurrent architectures for sequence-to-sequence learning. Evaluated on WMT'14 English-German translation, RDDLGN achieves 5.00 BLEU and 30.9% accuracy during training, approaching GRU performance (5.41 BLEU), with graceful degradation (4.39 BLEU) during inference. This work establishes recurrent logic-based neural computation as viable, opening research directions for FPGA acceleration in sequential modeling and other recursive network architectures.
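The core building block behind such networks is the differentiable logic gate: each of the 16 two-input Boolean functions is relaxed to a real-valued operation, and a softmax over learnable logits mixes them so gradients can flow. The sketch below is an illustrative reconstruction of that general idea, not the authors' code; the function names and the example logits are assumptions for demonstration.

```python
import numpy as np

def soft_gates(a, b):
    """Real-valued relaxations of all 16 two-input Boolean functions.

    Inputs a, b are arrays of values in [0, 1]; each relaxation agrees
    with the corresponding Boolean function on {0, 1} inputs.
    """
    return np.array([
        np.zeros_like(a),         # FALSE
        a * b,                    # AND
        a - a * b,                # a AND NOT b
        a,                        # a
        b - a * b,                # NOT a AND b
        b,                        # b
        a + b - 2 * a * b,        # XOR
        a + b - a * b,            # OR
        1 - (a + b - a * b),      # NOR
        1 - (a + b - 2 * a * b),  # XNOR
        1 - b,                    # NOT b
        1 - b + a * b,            # a OR NOT b
        1 - a,                    # NOT a
        1 - a + a * b,            # NOT a OR b
        1 - a * b,                # NAND
        np.ones_like(a),          # TRUE
    ])

def diff_logic_gate(a, b, logits):
    """Softmax-weighted mixture of the 16 relaxed gates.

    Training adjusts `logits`; at inference, the argmax gate can be
    used directly as a hard Boolean circuit (e.g. on an FPGA).
    """
    w = np.exp(logits - logits.max())
    w /= w.sum()
    return np.tensordot(w, soft_gates(a, b), axes=1)

# Example: logits strongly biased toward index 1 (AND) make the
# mixture behave almost like a hard AND gate.
logits = np.full(16, -5.0)
logits[1] = 5.0
out = diff_logic_gate(np.array([1.0]), np.array([1.0]), logits)
```

In the recurrent setting described in the paper, layers of such gates replace the dense matrix multiplications inside the recurrent cell, while the softmax relaxation keeps the whole network trainable by backpropagation.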

Page Count
12 pages

Category
Computer Science:
Machine Learning (CS)