KerJEPA: Kernel Discrepancies for Euclidean Self-Supervised Learning

Published: December 22, 2025 | arXiv ID: 2512.19605v1

By: Eric Zimmermann, Harley Wiltzer, Justin Szeto and more

Recent breakthroughs in self-supervised Joint-Embedding Predictive Architectures (JEPAs) have established that regularizing Euclidean representations toward isotropic Gaussian priors yields provable gains in training stability and downstream generalization. We introduce a new, flexible family of KerJEPAs: self-supervised learning algorithms with kernel-based regularizers. One instance of this family corresponds to the recently introduced LeJEPA Epps-Pulley regularizer, which approximates a sliced maximum mean discrepancy (MMD) with a Gaussian prior and Gaussian kernel. By expanding the class of viable kernels and priors and computing the closed-form high-dimensional limit of sliced MMDs, we develop alternative KerJEPAs with a number of favorable properties, including improved training stability and design flexibility.
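To make the sliced-MMD idea concrete, here is a minimal NumPy sketch of a Monte-Carlo sliced MMD between a batch of embeddings and an isotropic Gaussian prior, using a Gaussian (RBF) kernel on 1D projections. This is an illustrative assumption-based reconstruction of the general technique the abstract references, not the paper's actual implementation; the function names, bandwidth choice, and slice count are all hypothetical.

```python
import numpy as np

def gaussian_kernel_1d(x, y, bandwidth=1.0):
    # RBF kernel on 1D projections: k(a, b) = exp(-(a - b)^2 / (2 h^2))
    d = x[:, None] - y[None, :]
    return np.exp(-d**2 / (2 * bandwidth**2))

def sliced_mmd(emb, num_slices=128, bandwidth=1.0, seed=None):
    """Monte-Carlo estimate of the sliced squared MMD between the rows of
    `emb` and a standard Gaussian prior, averaged over random 1D slices."""
    rng = np.random.default_rng(seed)
    n, dim = emb.shape
    total = 0.0
    for _ in range(num_slices):
        u = rng.standard_normal(dim)
        u /= np.linalg.norm(u)            # random unit direction (a "slice")
        px = emb @ u                      # 1D projection of the embeddings
        pz = rng.standard_normal(n)       # 1D samples from the N(0, 1) prior
        kxx = gaussian_kernel_1d(px, px, bandwidth)
        kzz = gaussian_kernel_1d(pz, pz, bandwidth)
        kxz = gaussian_kernel_1d(px, pz, bandwidth)
        # Biased (V-statistic) MMD^2 estimate for this slice; always >= 0.
        total += kxx.mean() + kzz.mean() - 2.0 * kxz.mean()
    return total / num_slices

# Embeddings already matching the prior give a near-zero regularizer value,
# while mismatched (here: over-dispersed) embeddings are penalized more.
emb = np.random.default_rng(0).standard_normal((256, 64))
loss_matched = sliced_mmd(emb, seed=1)
loss_mismatched = sliced_mmd(5.0 * emb, seed=1)
```

In a JEPA-style training loop, a term like `sliced_mmd(embeddings)` would be added to the predictive loss to pull the representation distribution toward the Gaussian prior; the kernel and prior samples are the two ingredients the paper generalizes.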

Category
Computer Science:
Machine Learning (CS)