Score: 1

Drawback of Enforcing Equivariance and its Compensation via the Lens of Expressive Power

Published: December 10, 2025 | arXiv ID: 2512.09673v1

By: Yuzhu Chen, Tian Qin, Xinmei Tian, and more

Potential Business Impact:

Helps AI models learn more accurately from less data by building known symmetries into their design.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Equivariant neural networks encode symmetry as an inductive bias and have achieved strong empirical performance across a wide range of domains. However, their expressive power remains poorly understood. Focusing on 2-layer ReLU networks, this paper investigates the impact of equivariance constraints on the expressivity of equivariant and layer-wise equivariant networks. By examining the boundary hyperplanes and the channel vectors of ReLU networks, the authors construct an example showing that equivariance constraints can strictly limit expressive power. However, they demonstrate that this drawback can be compensated for by enlarging the model size. Furthermore, they show that despite the larger model size, the resulting architecture can still correspond to a hypothesis space with lower complexity, implying superior generalizability for equivariant networks.
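To make the layer-wise equivariance constraint concrete, here is a minimal NumPy sketch (not from the paper; the group, architecture, and all names are illustrative assumptions) of a 2-layer ReLU network that is equivariant to cyclic coordinate shifts. For that group, the constrained linear layers are exactly the circulant matrices, and the pointwise ReLU preserves equivariance between them.

```python
import numpy as np

def circulant(w):
    """Circulant matrix whose rows are cyclic shifts of w.

    Circulant matrices are precisely the linear maps that commute with
    cyclic coordinate shifts, so stacking them gives a layer-wise
    shift-equivariant network (an assumed toy instance, not the paper's
    general construction).
    """
    return np.stack([np.roll(w, i) for i in range(len(w))])

def relu(z):
    return np.maximum(z, 0.0)  # pointwise, hence shift-equivariant

rng = np.random.default_rng(0)
n = 8                                 # signal length; group is C_8
W1 = circulant(rng.normal(size=n))    # first constrained layer
W2 = circulant(rng.normal(size=n))    # second constrained layer

def f(x):
    """2-layer ReLU network with both layers constrained to be equivariant."""
    return W2 @ relu(W1 @ x)

x = rng.normal(size=n)
shift = lambda v: np.roll(v, 1)       # group action: cyclic shift by one

# Equivariance check: shifting the input and then applying f
# equals applying f and then shifting the output.
assert np.allclose(f(shift(x)), shift(f(x)))
print("layer-wise equivariance holds")
```

The assertion passes for every input, illustrating the constraint the paper studies: each circulant layer has only n free parameters instead of n², which is the kind of restriction that can limit expressivity and, per the abstract, may be compensated for by widening the model.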

Country of Origin
🇸🇬 🇨🇳 🇬🇧 Singapore, China, United Kingdom

Page Count
18 pages

Category
Computer Science:
Machine Learning (CS)