Constraining the outputs of ReLU neural networks

Published: August 5, 2025 | arXiv ID: 2508.03867v1

By: Yulia Alexandr, Guido Montúfar

Potential Business Impact:

Characterizes exactly which functions ReLU neural networks can represent, which can inform network design, verification, and analysis.

We introduce a class of algebraic varieties naturally associated with ReLU neural networks, arising from the piecewise linear structure of their outputs across activation regions in input space, and the piecewise multilinear structure in parameter space. By analyzing the rank constraints on the network outputs within each activation region, we derive polynomial equations that characterize the functions representable by the network. We further investigate conditions under which these varieties attain their expected dimension, providing insight into the expressive and structural properties of ReLU networks.
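The piecewise linear structure mentioned in the abstract can be illustrated directly: once an input's activation pattern (the set of hidden units with positive pre-activation) is fixed, the network reduces to a single affine map on that region. The following is a minimal sketch with hypothetical random weights, not the paper's construction; the names `network`, `activation_pattern`, `W_eff`, and `b_eff` are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny ReLU network R^2 -> R^3 -> R with hypothetical random weights.
W1, b1 = rng.standard_normal((3, 2)), rng.standard_normal(3)
W2, b2 = rng.standard_normal((1, 3)), rng.standard_normal(1)

def network(x):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

def activation_pattern(x):
    # Which hidden units are active (pre-activation > 0) at x.
    return W1 @ x + b1 > 0.0

x0 = np.array([0.3, -0.7])
D = np.diag(activation_pattern(x0).astype(float))  # zeroes inactive units

# On x0's activation region the network equals the affine map
# x -> (W2 D W1) x + (W2 D b1 + b2).
W_eff = W2 @ D @ W1
b_eff = W2 @ D @ b1 + b2

assert np.allclose(network(x0), W_eff @ x0 + b_eff)
```

The paper's varieties arise from polynomial (rank) constraints that the collection of such affine pieces `W_eff`, `b_eff` must satisfy jointly, since they all factor through the same weight matrices.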

Page Count
32 pages

Category
Mathematics:
Algebraic Geometry