A Method for Optimizing Connections in Differentiable Logic Gate Networks
By: Wout Mommen, Lars Keuninckx, Matthias Hartmann, and others
Potential Business Impact:
Lets computers learn using far fewer logic gates, lowering hardware cost and power use.
We introduce a novel method for partially optimizing the connections in Deep Differentiable Logic Gate Networks (LGNs). Our training method places a probability distribution over a subset of candidate connections per gate input and selects the highest-merit connection, after which the gate types are selected. We show that connection-optimized LGNs outperform standard fixed-connection LGNs on the Yin-Yang, MNIST, and Fashion-MNIST benchmarks while requiring only a fraction of the logic gates. When all connections are trained, we demonstrate that 8,000 simple logic gates are sufficient to achieve over 98% accuracy on MNIST. Additionally, our network uses 24 times fewer gates than a standard fully connected LGN while performing better on MNIST. As such, our work shows a pathway towards fully trainable Boolean logic.
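To give a feel for the kind of connection optimization described above, here is a minimal PyTorch-style sketch in which each gate input carries a softmax distribution over a small candidate subset of source neurons, and each gate carries a distribution over the 16 two-input Boolean functions. The subset size, the random candidate sampling, and the fully soft selection used here are illustrative assumptions for this sketch, not the authors' exact formulation (which selects the highest-merit connection before the gate types are chosen).

```python
# Illustrative sketch of a logic-gate layer with trainable connections.
# Each gate input holds trainable logits over a small candidate subset of
# source neurons; each gate holds trainable logits over the 16 two-input
# Boolean functions. All choices here (subset size, random candidates,
# soft mixing) are assumptions made for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


def all_two_input_gates(a, b):
    """Real-valued relaxations of the 16 two-input Boolean functions."""
    return torch.stack([
        torch.zeros_like(a),        # FALSE
        a * b,                      # AND
        a * (1 - b),                # A AND NOT B
        a,                          # A
        (1 - a) * b,                # NOT A AND B
        b,                          # B
        a + b - 2 * a * b,          # XOR
        a + b - a * b,              # OR
        1 - (a + b - a * b),        # NOR
        1 - (a + b - 2 * a * b),    # XNOR
        1 - b,                      # NOT B
        1 - b + a * b,              # A OR NOT B
        1 - a,                      # NOT A
        1 - a + a * b,              # NOT A OR B
        1 - a * b,                  # NAND
        torch.ones_like(a),         # TRUE
    ], dim=-1)


class ConnectionOptimizedGateLayer(nn.Module):
    def __init__(self, in_features, num_gates, candidates_per_input=8):
        super().__init__()
        # Fixed random candidate subset per gate input (an assumption here).
        idx = torch.randint(in_features, (2, num_gates, candidates_per_input))
        self.register_buffer("candidate_idx", idx)
        # Trainable "merit" logits over the candidate connections.
        self.conn_logits = nn.Parameter(torch.zeros(2, num_gates, candidates_per_input))
        # Trainable logits over the 16 gate types.
        self.gate_logits = nn.Parameter(torch.zeros(num_gates, 16))

    def forward(self, x):
        # x: (batch, in_features), values in [0, 1]
        inputs = []
        for i in range(2):
            cand = x[:, self.candidate_idx[i]]            # (batch, gates, cands)
            w = F.softmax(self.conn_logits[i], dim=-1)    # (gates, cands)
            inputs.append((cand * w).sum(dim=-1))         # soft-selected input
        a, b = inputs
        gate_outs = all_two_input_gates(a, b)             # (batch, gates, 16)
        g = F.softmax(self.gate_logits, dim=-1)           # (gates, 16)
        return (gate_outs * g).sum(dim=-1)                # (batch, gates)


if __name__ == "__main__":
    layer = ConnectionOptimizedGateLayer(in_features=64, num_gates=128)
    y = layer(torch.rand(4, 64))
    print(y.shape)  # torch.Size([4, 128])
```

After training, each connection and gate type can be hardened by taking the argmax of its logits, yielding a purely Boolean network; how merit is scored and when hardening happens are where the paper's actual method would differ from this sketch.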
Similar Papers
Light Differentiable Logic Gate Networks
Machine Learning (CS)
Makes AI learn faster and use less memory.
Logic Gate Neural Networks are Good for Verification
Machine Learning (CS)
Makes AI easier to check for mistakes.
Mind the Gap: Removing the Discretization Gap in Differentiable Logic Gate Networks
Machine Learning (CS)
Makes image-recognizing logic gate networks learn much faster.