Permutation-Invariant Spectral Learning via Dyson Diffusion
By: Tassilo Schwarz, Cai Dieball, Constantin Kogler and more
Potential Business Impact:
**Builds better computer models of connections.**
Diffusion models are central to generative modeling and have been adapted to graphs by diffusing adjacency matrix representations. The challenge of having up to $n!$ such representations for graphs with $n$ nodes is only partially mitigated by using permutation-equivariant learning architectures. Despite their computational efficiency, existing graph diffusion models struggle to distinguish certain graph families, unless graph data are augmented with ad hoc features. This shortcoming stems from enforcing the inductive bias within the learning architecture. In this work, we leverage random matrix theory to analytically extract the spectral properties of the diffusion process, allowing us to push the inductive bias from the architecture into the dynamics. Building on this, we introduce the Dyson Diffusion Model, which employs Dyson's Brownian Motion to capture the spectral dynamics of an Ornstein-Uhlenbeck process on the adjacency matrix while retaining all non-spectral information. We demonstrate that the Dyson Diffusion Model learns graph spectra accurately and outperforms existing graph diffusion models.
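The abstract's core idea — that the eigenvalues of a matrix-valued Ornstein-Uhlenbeck process evolve as an interacting particle system (Dyson's Brownian Motion) with mutual repulsion — can be illustrated numerically. The sketch below is not the paper's implementation; it is a minimal, assumed setup that simulates an OU process on a random symmetric matrix via Euler-Maruyama and tracks the sorted spectrum over time, where the eigenvalue paths exhibit the non-crossing behavior characteristic of Dyson dynamics.

```python
import numpy as np

def symmetric_ou_step(A, theta, sigma, dt, rng):
    # One Euler-Maruyama step of an Ornstein-Uhlenbeck process on a
    # symmetric matrix: dA = -theta * A dt + sigma * dW, where dW is
    # a symmetrized Gaussian increment (GOE-type noise).
    n = A.shape[0]
    G = rng.normal(size=(n, n))
    dW = (G + G.T) / np.sqrt(2) * np.sqrt(dt)
    return A - theta * A * dt + sigma * dW

rng = np.random.default_rng(0)
n, steps, dt = 6, 500, 1e-3

# Random symmetric initial matrix (stand-in for a relaxed adjacency matrix).
A = rng.normal(size=(n, n))
A = (A + A.T) / np.sqrt(2)

spectra = []
for _ in range(steps):
    A = symmetric_ou_step(A, theta=1.0, sigma=1.0, dt=dt, rng=rng)
    spectra.append(np.linalg.eigvalsh(A))  # eigenvalues, ascending
spectra = np.array(spectra)

# Eigenvalue repulsion: the sorted eigenvalue paths stay strictly ordered,
# i.e. they never cross, as predicted by Dyson's Brownian Motion.
print(spectra.shape)
print(bool(np.all(np.diff(spectra, axis=1) > 0)))
```

In the full Dyson Diffusion Model setting, these spectral dynamics are handled analytically via random matrix theory rather than simulated, which is what lets the inductive bias move from the architecture into the dynamics.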
Similar Papers
Graph Representation Learning with Diffusion Generative Models
Machine Learning (CS)
Helps computers understand and learn from connected data.
The Information Dynamics of Generative Diffusion
Machine Learning (Stat)
Makes AI create new things by breaking symmetries.