Score: 1

Equivariant Representation Learning for Symmetry-Aware Inference with Guarantees

Published: May 26, 2025 | arXiv ID: 2505.19809v2

By: Daniel Ordoñez-Apraez, Vladimir Kostić, Alek Fröhlich, and more

Potential Business Impact:

Lets models exploit known physical or geometric symmetries so they generalize from fewer training examples and return calibrated uncertainty estimates.

Business Areas:
A/B Testing, Data and Analytics

In many real-world applications of regression, conditional probability estimation, and uncertainty quantification, exploiting symmetries rooted in physics or geometry can dramatically improve generalization and sample efficiency. While geometric deep learning has made significant empirical advances by incorporating group-theoretic structure, less attention has been given to statistical learning guarantees. In this paper, we introduce an equivariant representation learning framework that simultaneously addresses regression, conditional probability estimation, and uncertainty quantification while providing first-of-its-kind non-asymptotic statistical learning guarantees. Grounded in operator and group representation theory, our framework approximates the spectral decomposition of the conditional expectation operator, building representations that are both equivariant and disentangled along independent symmetry subgroups. Empirical evaluations on synthetic datasets and real-world robotics applications confirm the potential of our approach, matching or outperforming existing equivariant baselines in regression while additionally providing well-calibrated parametric uncertainty estimates.
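To make the spectral idea concrete, here is a minimal sketch of the general recipe the abstract describes, not the authors' implementation: it assumes a toy C2 sign-flip symmetry, hand-picked odd polynomial features standing in for a learned equivariant representation, and a truncated SVD of the whitened cross-covariance as a rank-r approximation of the conditional expectation operator. All variable names and the synthetic data are illustrative assumptions.

```python
# Sketch only: symmetry-aware regression via a spectral (reduced-rank) approximation
# of the conditional expectation operator. Assumes a toy C2 symmetry (x -> -x, y -> -y),
# odd polynomial features as the "equivariant representation", and least squares on top.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic equivariant data: y = A x + B x^3 + noise, so y(-x) = -y(x).
d_x, d_y, n = 5, 3, 2000
A = rng.normal(size=(d_y, d_x))
B = 0.5 * rng.normal(size=(d_y, d_x))
X = rng.normal(size=(n, d_x))
Y = X @ A.T + (X ** 3) @ B.T + 0.1 * rng.normal(size=(n, d_y))

# Equivariant feature map: odd monomials transform like x under the C2 action.
def phi(x):
    return np.concatenate([x, x ** 3], axis=-1)

# Group-average (symmetrize) the sample so the symmetry holds exactly in the estimate.
X_sym = np.concatenate([X, -X])      # orbit of the inputs under C2
Y_sym = np.concatenate([Y, -Y])      # matching transformation of the outputs
F = phi(X_sym)                       # equivariant features, shape (2n, 2*d_x)

# Spectral approximation: with linear features, E[y | x] ~ C_yx C_xx^{-1} phi(x);
# a truncated SVD of the whitened cross-covariance gives its rank-r approximation.
reg = 1e-3
C_xx = F.T @ F / len(F) + reg * np.eye(F.shape[1])
C_xy = F.T @ Y_sym / len(F)
W = np.linalg.cholesky(np.linalg.inv(C_xx))   # whitening: W.T @ C_xx @ W = I
U, S, Vt = np.linalg.svd(W.T @ C_xy, full_matrices=False)

r = 3                                 # rank of the learned representation
W_repr = W @ U[:, :r]                 # features -> r spectral components
W_out = np.diag(S[:r]) @ Vt[:r]       # spectral components -> outputs

def predict(x):
    return phi(x) @ W_repr @ W_out

# Check: predictions are equivariant up to floating point, and the fit is accurate.
X_test = rng.normal(size=(200, d_x))
Y_true = X_test @ A.T + (X_test ** 3) @ B.T
mse = np.mean((predict(X_test) - Y_true) ** 2)
equiv_gap = np.max(np.abs(predict(-X_test) + predict(X_test)))
print(f"test MSE: {mse:.4f}, equivariance gap: {equiv_gap:.2e}")
```

In this linear-feature toy, the truncated SVD of the whitened cross-covariance plays the role the abstract assigns to the spectral decomposition of the conditional expectation operator; the paper's framework learns the representation itself and additionally disentangles it along independent symmetry subgroups, which this sketch does not attempt.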

Country of Origin
🇷🇸 🇫🇷 Serbia, France

Page Count
55 pages

Category
Computer Science:
Machine Learning (CS)