Equivariant Representation Learning for Symmetry-Aware Inference with Guarantees
By: Daniel Ordoñez-Apraez, Vladimir Kostić, Alek Fröhlich, and more
Potential Business Impact:
Teaches computers to learn from fewer examples.
In many real-world applications of regression, conditional probability estimation, and uncertainty quantification, exploiting symmetries rooted in physics or geometry can dramatically improve generalization and sample efficiency. While geometric deep learning has made significant empirical advances by incorporating group-theoretic structure, less attention has been given to statistical learning guarantees. In this paper, we introduce an equivariant representation learning framework that simultaneously addresses regression, conditional probability estimation, and uncertainty quantification while providing first-of-its-kind non-asymptotic statistical learning guarantees. Grounded in operator and group representation theory, our framework approximates the spectral decomposition of the conditional expectation operator, building representations that are both equivariant and disentangled along independent symmetry subgroups. Empirical evaluations on synthetic datasets and real-world robotics applications confirm the potential of our approach, matching or outperforming existing equivariant baselines in regression while additionally providing well-calibrated parametric uncertainty estimates.
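The equivariance property at the heart of the framework can be stated concretely: a representation f is equivariant when transforming the input by a group element g and then applying f gives the same result as applying f first and then transforming the output, i.e. f(g·x) = ρ(g)·f(x). The following is a minimal toy sketch of that check for the planar rotation group SO(2), not the authors' method; the map A = aI + bJ (with J a 90° rotation) is one hypothetical choice that commutes with all rotations.

```python
import math

def rot(theta):
    """2x2 rotation matrix for angle theta (an element of SO(2))."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def matvec(M, v):
    """Multiply a 2x2 matrix by a 2-vector."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

# A toy SO(2)-equivariant linear map: any A = a*I + b*J, where J is a
# 90-degree rotation, commutes with every planar rotation. The values
# of a and b are arbitrary illustrative choices.
a, b = 0.7, -1.3
A = [[a, -b],
     [b, a]]

def f(x):
    return matvec(A, x)

# Equivariance check: f(g . x) should equal g . f(x) for any rotation g.
x = [1.0, 2.0]
theta = 0.9
lhs = f(matvec(rot(theta), x))      # f(g . x)
rhs = matvec(rot(theta), f(x))      # g . f(x)
assert all(abs(l - r) < 1e-12 for l, r in zip(lhs, rhs))
```

A non-equivariant map (e.g. one that scales the two coordinates differently) would fail this check, which is what an equivariant architecture rules out by construction.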
Similar Papers
A Regularization-Guided Equivariant Approach for Image Restoration
CV and Pattern Recognition
Fixes blurry pictures by understanding shapes.
Approximate equivariance via projection-based regularisation
Machine Learning (CS)
Makes AI learn faster and more accurately.
Data Augmentation and Regularization for Learning Group Equivariance
Machine Learning (Stat)
Teaches computers to learn from slightly changed pictures.