Permutation Equivariant Neural Networks for Symmetric Tensors
By: Edward Pearce-Crump
Potential Business Impact:
Teaches computers to learn symmetric patterns in data using far fewer examples.
Incorporating permutation equivariance into neural networks has proven useful for ensuring that models respect the symmetries present in data. Symmetric tensors, which naturally appear in statistics, machine learning, and graph theory, are essential for many applications in physics, chemistry, and materials science, amongst others. However, existing research on permutation equivariant models has not explored symmetric tensors as inputs, and most prior work on learning from these tensors has focused on equivariance to Euclidean groups. In this paper, we present two different characterisations of all linear permutation equivariant functions between symmetric power spaces of $\mathbb{R}^n$. On two tasks, we show that these functions are highly data efficient compared to standard MLPs and have the potential to generalise well to symmetric tensors of different sizes.
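To make the notion of permutation equivariance concrete, here is a minimal sketch for the simplest case: a symmetric tensor of order 2 (a symmetric matrix). It builds a linear map from four operations that each commute with simultaneous row-and-column permutation, then numerically checks that $f(PXP^\top) = P f(X) P^\top$. The map, the coefficients $a$ through $d$, and the function name are illustrative assumptions for this example only, not the paper's full characterisation of equivariant maps between symmetric power spaces.

```python
import numpy as np

def equivariant_map(X, a=1.0, b=0.5, c=0.1, d=0.2):
    """Illustrative linear permutation-equivariant map on symmetric
    matrices (order-2 symmetric tensors). Each term commutes with
    X -> P X P^T: the identity, keeping only the diagonal, spreading
    the total sum over the all-ones matrix, and placing the trace on
    the diagonal. Coefficients a-d stand in for learnable weights."""
    n = X.shape[0]
    J = np.ones((n, n))   # all-ones matrix: P J P^T = J
    I = np.eye(n)         # identity matrix: P I P^T = I
    return (a * X
            + b * np.diag(np.diag(X))
            + c * X.sum() * J
            + d * np.trace(X) * I)

# Numerical equivariance check: f(P X P^T) should equal P f(X) P^T.
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
X = A + A.T                          # a random symmetric tensor of order 2
P = np.eye(n)[rng.permutation(n)]    # a random permutation matrix

assert np.allclose(equivariant_map(P @ X @ P.T),
                   P @ equivariant_map(X) @ P.T)
```

Because such maps respect the symmetry by construction, they do not need to see every permuted copy of an input during training, which is one intuition for the data efficiency reported above.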
Similar Papers
Constructing Invariant and Equivariant Operations by Symmetric Tensor Network
Machine Learning (CS)
Makes AI understand shapes and patterns better.
A Tale of Two Symmetries: Exploring the Loss Landscape of Equivariant Models
Machine Learning (CS)
Makes smart computers learn better by fixing their rules.
Symmetry-preserving neural networks in lattice field theories
High Energy Physics - Lattice
Teaches computers to understand physics rules better.