A Method for Optimizing Connections in Differentiable Logic Gate Networks

Published: July 8, 2025 | arXiv ID: 2507.06173v1

By: Wout Mommen, Lars Keuninckx, Matthias Hartmann and more

Potential Business Impact:

Makes computers learn with fewer logic parts.

Business Areas:
A/B Testing Data and Analytics

We introduce a novel method for partial optimization of the connections in Deep Differentiable Logic Gate Networks (LGNs). Our training method maintains a probability distribution over a subset of candidate connections per gate input and selects the connection with the highest merit, after which the gate types are selected. We show that connection-optimized LGNs outperform standard fixed-connection LGNs on the Yin-Yang, MNIST and Fashion-MNIST benchmarks, while requiring only a fraction of the number of logic gates. When training all connections, we demonstrate that 8000 simple logic gates are sufficient to achieve over 98% accuracy on the MNIST data set. Additionally, we show that our network has 24 times fewer gates than standard fully connected LGNs, while performing better on the MNIST data set. As such, our work shows a pathway towards fully trainable Boolean logic.
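The core idea of a differentiable LGN is that each neuron is a two-input logic gate whose type (and, here, whose input connections) is relaxed into a probability distribution during training, then hardened by picking the highest-probability choice at inference. The sketch below illustrates that relaxation in plain NumPy; the exact parameterization, the merit criterion, and the connection-subset scheme from the paper are not reproduced here, so all names and logit shapes are illustrative assumptions, following the standard real-valued relaxations of the 16 two-input Boolean functions.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D logit vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

# The 16 two-input Boolean functions, relaxed to real-valued a, b in [0, 1]
# (e.g. AND -> a*b, OR -> a+b-ab), as commonly used for differentiable LGNs.
GATE_FNS = [
    lambda a, b: 0.0 * a,            # 0:  FALSE
    lambda a, b: a * b,              # 1:  a AND b
    lambda a, b: a - a * b,          # 2:  a AND NOT b
    lambda a, b: a,                  # 3:  a
    lambda a, b: b - a * b,          # 4:  NOT a AND b
    lambda a, b: b,                  # 5:  b
    lambda a, b: a + b - 2 * a * b,  # 6:  a XOR b
    lambda a, b: a + b - a * b,      # 7:  a OR b
    lambda a, b: 1 - (a + b - a * b),      # 8:  a NOR b
    lambda a, b: 1 - (a + b - 2 * a * b),  # 9:  a XNOR b
    lambda a, b: 1 - b,                    # 10: NOT b
    lambda a, b: 1 - b + a * b,            # 11: a OR NOT b
    lambda a, b: 1 - a,                    # 12: NOT a
    lambda a, b: 1 - a + a * b,            # 13: NOT a OR b
    lambda a, b: 1 - a * b,                # 14: a NAND b
    lambda a, b: 1.0 + 0.0 * a,            # 15: TRUE
]

def soft_gate(x, conn_logits_a, conn_logits_b, gate_logits):
    """One relaxed gate: soft connections over candidate inputs x,
    then a soft mixture over the 16 gate types. During training the
    logits are learned; at inference each distribution is hardened
    with argmax, leaving a single wired Boolean gate."""
    a = softmax(conn_logits_a) @ x   # soft selection of input a
    b = softmax(conn_logits_b) @ x   # soft selection of input b
    p = softmax(gate_logits)         # soft selection of gate type
    return sum(p[i] * f(a, b) for i, f in enumerate(GATE_FNS))
```

For example, with two candidate inputs `x = [0.9, 0.1]`, connection logits strongly favoring input 0 for `a` and input 1 for `b`, and gate logits favoring AND (index 1), the soft output is close to `0.9 * 0.1 = 0.09`; hardening the three distributions with argmax recovers the discrete gate `AND(x[0], x[1])`.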

Country of Origin
πŸ‡§πŸ‡ͺ Belgium

Page Count
4 pages

Category
Computer Science:
Machine Learning (CS)