On Uncertainty Calibration for Equivariant Functions

Published: October 24, 2025 | arXiv ID: 2510.21691v2

By: Edward Berman, Jacob Ginesin, Marco Pacini, and more

Potential Business Impact:

Helps AI models learn reliably from less data by keeping their confidence estimates well calibrated.

Business Areas:
Quantum Computing Science and Engineering

Data-sparse settings such as robotic manipulation, molecular physics, and galaxy morphology classification are among the hardest domains for deep learning. In these problems, equivariant networks can improve modeling across undersampled parts of the input space, and uncertainty estimation can guard against overconfidence. However, the relationship between equivariance and model confidence, and more generally between equivariance and model calibration, has yet to be studied. Since traditional classification and regression error terms appear in the definitions of calibration error, it is natural to suspect that prior work can help explain the relationship between equivariance and calibration error. In this work, we present a theory relating equivariance to uncertainty estimation. By proving lower and upper bounds on uncertainty calibration errors (ECE and ENCE) under various equivariance conditions, we elucidate the generalization limits of equivariant models and illustrate how symmetry mismatch can result in miscalibration in both classification and regression. We complement our theoretical framework with numerical experiments that clarify the relationship between equivariance and uncertainty across a variety of real and simulated datasets, and we comment on trends with symmetry mismatch, group size, and aleatoric and epistemic uncertainty.
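The expected calibration error (ECE) mentioned in the abstract is commonly computed by binning predictions by confidence and averaging the gap between per-bin accuracy and per-bin confidence. Below is a minimal, hedged sketch of that standard binned estimator; the function name, binning scheme, and bin count are illustrative assumptions, not the paper's specific implementation.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Binned ECE sketch: weighted mean |accuracy - confidence| over bins.

    confidences: predicted confidence in [0, 1] for each sample.
    correct: 1 if the prediction was right, else 0.
    Note: this is a generic illustration, not the paper's exact estimator.
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    n = len(confidences)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        # Assign each sample to one half-open confidence bin (lo, hi].
        mask = (confidences > lo) & (confidences <= hi)
        if not mask.any():
            continue
        acc = correct[mask].mean()    # empirical accuracy in this bin
        conf = confidences[mask].mean()  # mean stated confidence in this bin
        ece += (mask.sum() / n) * abs(acc - conf)
    return ece

# A perfectly calibrated model (80% confidence, 80% accuracy) yields ECE = 0;
# an overconfident one (90% confidence, 50% accuracy) yields ECE = 0.4.
print(expected_calibration_error([0.8] * 10, [1] * 8 + [0] * 2))
print(expected_calibration_error([0.9] * 10, [1] * 5 + [0] * 5))
```

The paper's ENCE analogue for regression follows the same binning idea, but bins by predicted standard deviation and compares root-mean-variance against empirical RMSE per bin.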

Country of Origin
🇺🇸 United States

Page Count
45 pages

Category
Computer Science:
Machine Learning (CS)