Attn-JGNN: Attention Enhanced Join-Graph Neural Networks
By: Jixin Zhang, Yong Lai
Potential Business Impact:
Solves hard math puzzles faster with smart computer learning.
We propose an Attention Enhanced Join-Graph Neural Network (Attn-JGNN) model for solving #SAT problems that significantly improves solving accuracy. Inspired by the Iterative Join-Graph Propagation (IJGP) algorithm, Attn-JGNN uses tree decomposition to encode a CNF formula into a join-graph, performs iterative message passing on that join-graph, and finally approximates the model count by learning partition functions. To further improve accuracy, we apply an attention mechanism within and between clusters of the join-graph, which lets Attn-JGNN focus on the key variables and clusters during probabilistic inference and reduces redundant computation. Our experiments show that Attn-JGNN achieves better results than other neural network methods.
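The core loop the abstract describes — clusters of a join-graph exchanging messages, with attention weighting which neighboring clusters matter — can be illustrated with a minimal sketch. This is not the paper's architecture: the cluster features, the projection matrices `W_q`/`W_k`, the mixing coefficient, and the update rule are all hypothetical stand-ins chosen only to show the shape of attention-weighted message passing between clusters.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_message_passing(cluster_states, edges, W_q, W_k, n_iters=3):
    """Illustrative attention-weighted message passing on a join-graph.

    cluster_states: dict mapping cluster id -> feature vector
    edges: list of directed (sender, receiver) cluster pairs
    W_q, W_k: hypothetical query/key projections (not from the paper)
    """
    states = dict(cluster_states)
    for _ in range(n_iters):
        new_states = {}
        for c in states:
            senders = [u for (u, v) in edges if v == c]
            if not senders:
                new_states[c] = states[c]
                continue
            q = W_q @ states[c]                                  # receiver query
            keys = np.stack([W_k @ states[u] for u in senders])  # sender keys
            # attention over incoming messages (scaled dot product)
            scores = softmax(keys @ q / np.sqrt(len(q)))
            msg = scores @ np.stack([states[u] for u in senders])
            # blend old state with the attention-weighted aggregate
            new_states[c] = 0.5 * states[c] + 0.5 * msg
        states = new_states
    return states
```

For example, on a three-cluster chain A–B–C, cluster B attends over messages from both A and C, while A and C each attend only to B; iterating this propagates information across the whole join-graph, analogous to the repeated sweeps of IJGP.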
Similar Papers
Topologic Attention Networks: Attending to Direct and Indirect Neighbors through Gaussian Belief Propagation
Machine Learning (CS)
Lets computers understand complex connections faster.
Neural Approaches to SAT Solving: Design Choices and Interpretability
Machine Learning (CS)
Helps computers solve hard puzzles faster.
Attention Beyond Neighborhoods: Reviving Transformer for Graph Clustering
Machine Learning (CS)
Helps computers group similar things by looking at connections.