The Geometry of ReLU Networks through the ReLU Transition Graph

Published: May 16, 2025 | arXiv ID: 2505.11692v2

By: Sahil Rajesh Dhayalkar

Potential Business Impact:

Provides a principled basis for compressing, regularizing, and controlling the complexity of ReLU networks, which could reduce the cost and improve the reliability of deployed deep learning models.


We develop a novel theoretical framework for analyzing ReLU neural networks through the lens of a combinatorial object we term the ReLU Transition Graph (RTG). In this graph, each node corresponds to a linear region induced by the network's activation patterns, and edges connect regions that differ by a single neuron flip. Building on this structure, we derive a suite of new theoretical results connecting RTG geometry to expressivity, generalization, and robustness. Our contributions include tight combinatorial bounds on RTG size and diameter, a proof of RTG connectivity, and graph-theoretic interpretations of VC-dimension. We also relate entropy and average degree of the RTG to generalization error. Each theoretical result is rigorously validated via carefully controlled experiments across varied network depths, widths, and data regimes. This work provides the first unified treatment of ReLU network structure via graph theory and opens new avenues for compression, regularization, and complexity control rooted in RTG analysis.
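To make the RTG construction concrete, below is a minimal empirical sketch (not the authors' implementation): it pushes sampled inputs through a small random one-hidden-layer ReLU network, records each input's binary activation pattern as an RTG node (one node per observed linear region), connects patterns that differ in exactly one neuron (a single neuron flip), and then computes two of the statistics the abstract relates to generalization, average degree and an entropy over regions. The network shape, sample counts, and the choice of entropy over the empirical input mass per region are illustrative assumptions.

```python
"""Minimal empirical sketch of the ReLU Transition Graph (RTG).

Assumptions (not from the paper): a single hidden layer, random
weights, uniform input sampling, and entropy taken over the empirical
distribution of input mass across linear regions.
"""
import itertools
import numpy as np

rng = np.random.default_rng(0)

# A small random ReLU layer: R^2 -> R^6 (6 hidden neurons).
W1 = rng.normal(size=(6, 2))
b1 = rng.normal(size=6)

def activation_pattern(x):
    """Binary pattern of which hidden ReLUs are active at input x.

    Each distinct pattern indexes one linear region, i.e. one RTG node.
    """
    pre = W1 @ x + b1
    return tuple((pre > 0).astype(int))

# Sample the input space densely; distinct observed patterns = RTG nodes.
xs = rng.uniform(-3, 3, size=(20000, 2))
nodes = {activation_pattern(x) for x in xs}

# RTG edges: observed patterns at Hamming distance exactly 1,
# i.e. adjacent regions separated by a single neuron's hyperplane.
edges = {
    (p, q)
    for p, q in itertools.combinations(nodes, 2)
    if sum(a != b for a, b in zip(p, q)) == 1
}

# Average degree of the RTG.
degree = {p: 0 for p in nodes}
for p, q in edges:
    degree[p] += 1
    degree[q] += 1
avg_degree = sum(degree.values()) / len(nodes)

# Entropy of the empirical input mass over regions (one plausible
# reading of the "entropy of the RTG" the abstract mentions).
counts = {}
for x in xs:
    p = activation_pattern(x)
    counts[p] = counts.get(p, 0) + 1
probs = np.array(list(counts.values()), dtype=float)
probs /= probs.sum()
entropy = -(probs * np.log(probs)).sum()

print(f"linear regions (RTG nodes): {len(nodes)}")
print(f"RTG edges: {len(edges)}")
print(f"average degree: {avg_degree:.2f}")
print(f"region-occupancy entropy: {entropy:.2f} nats")
```

For deeper networks, the activation pattern would concatenate the binary states of all layers, and input sampling only lower-bounds the true region count, so this sketch approximates rather than enumerates the full RTG.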

Page Count
13 pages

Category
Computer Science:
Machine Learning (CS)