Recurrent Deep Differentiable Logic Gate Networks
By: Simon Bührer, Andreas Plesner, Till Aczel, and more
Potential Business Impact:
Makes computers learn sequences using simple on/off logic switches.
While differentiable logic gates have shown promise in feedforward networks, their application to sequential modeling remains unexplored. This paper presents the first implementation of Recurrent Deep Differentiable Logic Gate Networks (RDDLGN), combining Boolean operations with recurrent architectures for sequence-to-sequence learning. Evaluated on WMT'14 English-German translation, RDDLGN achieves 5.00 BLEU and 30.9% accuracy during training, approaching GRU performance (5.41 BLEU), and degrades gracefully to 4.39 BLEU during inference. This work establishes recurrent logic-based neural computation as viable, opening research directions for FPGA acceleration in sequential modeling and other recursive network architectures.
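To make the core idea concrete, below is a minimal PyTorch sketch of how differentiable logic gates can be wrapped in a recurrent cell. It assumes the standard probabilistic relaxation from the differentiable logic gate network literature: real values in [0, 1] are treated as probabilities of a Boolean 1, and each gate learns a softmax mixture over the 16 two-input Boolean functions. The `LogicGateLayer` and `LogicRNNCell` names, the random input wiring, and the layer sizes are illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def boolean_ops(a, b):
    """All 16 two-input Boolean functions under the product relaxation."""
    return torch.stack([
        torch.zeros_like(a),      # FALSE
        a * b,                    # AND
        a - a * b,                # a AND NOT b
        a,                        # a
        b - a * b,                # NOT a AND b
        b,                        # b
        a + b - 2 * a * b,        # XOR
        a + b - a * b,            # OR
        1 - (a + b - a * b),      # NOR
        1 - (a + b - 2 * a * b),  # XNOR
        1 - b,                    # NOT b
        1 - b + a * b,            # a OR NOT b
        1 - a,                    # NOT a
        1 - a + a * b,            # NOT a OR b
        1 - a * b,                # NAND
        torch.ones_like(a),       # TRUE
    ], dim=-1)


class LogicGateLayer(nn.Module):
    """One layer of gates; wiring is fixed, the gate choice is learned."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        # Each output gate reads two randomly chosen input wires.
        self.register_buffer("ia", torch.randint(in_dim, (out_dim,)))
        self.register_buffer("ib", torch.randint(in_dim, (out_dim,)))
        # Learnable logits over the 16 Boolean functions per gate.
        self.logits = nn.Parameter(torch.randn(out_dim, 16))

    def forward(self, x):
        a, b = x[..., self.ia], x[..., self.ib]
        ops = boolean_ops(a, b)             # (..., out_dim, 16)
        w = F.softmax(self.logits, dim=-1)  # (out_dim, 16)
        return (ops * w).sum(-1)            # (..., out_dim)


class LogicRNNCell(nn.Module):
    """Hypothetical recurrent step: logic gates over [input, hidden]."""
    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.gates = LogicGateLayer(in_dim + hidden_dim, hidden_dim)

    def forward(self, x, h):
        return self.gates(torch.cat([x, h], dim=-1))


# Usage: unroll the cell over a short sequence of relaxed Boolean inputs.
cell = LogicRNNCell(in_dim=8, hidden_dim=16)
h = torch.zeros(4, 16)                # batch of 4
for x in torch.rand(5, 4, 8):         # 5 time steps
    h = cell(x, h)
print(h.shape)                        # torch.Size([4, 16])
```

Because every operation above is differentiable, the gate-choice logits train with ordinary backpropagation; after training, each softmax can be hardened to a single discrete gate, which is what makes FPGA deployment attractive.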
Similar Papers
Light Differentiable Logic Gate Networks
Machine Learning (CS)
Makes AI learn faster and use less memory.
DynamicRTL: RTL Representation Learning for Dynamic Circuit Behavior
Machine Learning (CS)
Helps computers understand how circuits work over time.
Differentiable Logic Cellular Automata: From Game of Life to Pattern Generation
Artificial Intelligence
Teaches computers to make complex patterns grow.