Complex-Weighted Convolutional Networks: Provable Expressiveness via Complex Diffusion
By: Cristina López Amado, Tassilo Schwarz, Yu Tian, and more
Potential Business Impact:
Helps computers learn better from messy, mismatched network data.
Graph Neural Networks (GNNs) have achieved remarkable success across diverse applications, yet they remain limited by oversmoothing and poor performance on heterophilic graphs. To address these challenges, we introduce a novel framework that equips graphs with a complex-weighted structure, assigning each edge a complex number to drive a diffusion process that extends random walks into the complex domain. We prove that this diffusion is highly expressive: with appropriately chosen complex weights, any node-classification task can be solved in the steady state of a complex random walk. Building on this insight, we propose the Complex-Weighted Convolutional Network (CWCN), which learns suitable complex-weighted structures directly from data while enriching diffusion with learnable matrices and nonlinear activations. CWCN is simple to implement, requires no additional hyperparameters beyond those of standard GNNs, and achieves competitive performance on benchmark datasets. Our results demonstrate that complex-weighted diffusion provides a principled and general mechanism for enhancing GNN expressiveness, opening new avenues for models that are both theoretically grounded and practically effective.
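The core idea — assigning each edge a complex number and running a diffusion (complex random walk) over the graph — can be illustrated with a minimal sketch. This is not the paper's CWCN implementation; the small graph, its complex weights, and the normalization scheme below are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch of complex-weighted diffusion, assuming a toy 4-node graph.
# Each edge carries a complex weight; repeatedly applying the normalized
# complex adjacency matrix plays the role of a complex random walk.
W = np.array([
    [0,    1j,   0,    0.5],
    [-1j,  0,    1,    0  ],
    [0,    1,    0,    1j ],
    [0.5,  0,   -1j,   0  ],
], dtype=complex)

# Row-normalize by total edge magnitude so the iteration stays bounded.
deg = np.abs(W).sum(axis=1, keepdims=True)
P = W / deg

# One-hot signal on node 0; diffuse for many steps toward a steady behavior.
x = np.zeros(4, dtype=complex)
x[0] = 1.0
for _ in range(50):
    x = P @ x

# The complex phases rotate as well as mix the signal, which is the intuition
# behind the paper's claim that suitable weights can separate node classes.
print(np.round(x, 4))
```

In the full model, these fixed weights would be replaced by structures learned from data, combined with learnable matrices and nonlinear activations as the abstract describes.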
Similar Papers
On the Computational Capability of Graph Neural Networks: A Circuit Complexity Bound Perspective
Machine Learning (CS)
Computers struggle with some graph problems.
Hypergraph Diffusion for High-Order Recommender Systems
Information Retrieval
Finds better movies and songs you'll like.
mHC-GNN: Manifold-Constrained Hyper-Connections for Graph Neural Networks
Machine Learning (CS)
Makes computer learning models work better for longer.