Symmetry-preserving neural networks in lattice field theories
By: Matteo Favoni
Potential Business Impact:
Teaches computers to understand physics rules better.
This thesis deals with neural networks that respect symmetries and presents the advantages of applying them to lattice field theory problems. The concept of equivariance is explained, along with the reason why this property is crucial for a network to preserve the desired symmetry. The benefits of equivariant networks are first illustrated for translational symmetry on a complex scalar field toy model. The discussion is then extended to gauge theories, for which Lattice Gauge Equivariant Convolutional Neural Networks (L-CNNs) are specifically designed. L-CNNs successfully solve regression tasks for physical observables such as Wilson loops, whereas traditional architectures that are not gauge equivariant perform significantly worse. Finally, we introduce the technique of neural gradient flow, in which an ordinary differential equation is solved by neural networks, and propose it as a method for generating lattice gauge configurations.
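The equivariance property at the heart of the abstract can be made concrete with a minimal sketch: for translational symmetry on a periodic lattice, a convolution with shared weights commutes with lattice shifts. The code below is an illustrative toy (the field, kernel, and lattice size are invented for the example, not taken from the thesis) that numerically checks this commutation for a 1D periodic "scalar field".

```python
import numpy as np

def conv1d_periodic(x, w):
    """Circular 1D convolution on a periodic lattice (shared weights)."""
    n, k = len(x), len(w)
    return np.array([sum(w[j] * x[(i + j) % n] for j in range(k))
                     for i in range(n)])

def translate(x, s):
    """Shift a periodic field by s lattice sites."""
    return np.roll(x, s)

rng = np.random.default_rng(0)
phi = rng.normal(size=8)   # toy field on an 8-site periodic lattice (assumed)
w = rng.normal(size=3)     # convolution kernel (assumed)

# Equivariance: convolving the shifted field equals shifting the convolved field.
lhs = conv1d_periodic(translate(phi, 3), w)
rhs = translate(conv1d_periodic(phi, 3 * 0 + w) if False else conv1d_periodic(phi, w), 3)
assert np.allclose(lhs, rhs)
```

Gauge equivariance, as realized in L-CNNs, is the analogous statement for local gauge transformations rather than global lattice shifts, and requires specially constructed layers rather than plain convolutions.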
Similar Papers
A Tale of Two Symmetries: Exploring the Loss Landscape of Equivariant Models
Machine Learning (CS)
Makes smart computers learn better by fixing their rules.
Permutation Equivariant Neural Networks for Symmetric Tensors
Machine Learning (CS)
Teaches computers to understand patterns in nature.
Lie Group Symmetry Discovery and Enforcement Using Vector Fields
Machine Learning (Stat)
Teaches computers to find hidden patterns faster.