Laplace Learning in Wasserstein Space

Published: November 17, 2025 | arXiv ID: 2511.13229v1

By: Mary Chriselda Antony Oliver, Michael Roberts, Carola-Bibiane Schönlieb, and more

Potential Business Impact:

Enables computers to classify high-dimensional data reliably when only a few labeled examples are available.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

The manifold hypothesis posits that high-dimensional data typically resides on low-dimensional subspaces. In this paper, we adopt the manifold hypothesis to investigate graph-based semi-supervised learning methods. In particular, we examine Laplace Learning in the Wasserstein space, extending the classical notion of graph-based semi-supervised learning algorithms from finite-dimensional Euclidean spaces to an infinite-dimensional setting. To achieve this, we prove variational convergence of a discrete graph p-Dirichlet energy to its continuum counterpart. In addition, we characterize the Laplace-Beltrami operator on a submanifold of the Wasserstein space. Finally, we validate the proposed theoretical framework through numerical experiments on benchmark datasets, demonstrating the consistency of our classification performance in high-dimensional settings.
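For readers unfamiliar with Laplace Learning, the following is a minimal sketch of the classical finite-dimensional version that the paper generalizes: labels are propagated over a similarity graph by solving the graph Laplace equation (harmonic extension) on the unlabeled nodes. The function name and the toy graph are illustrative, not from the paper.

```python
import numpy as np

def laplace_learning(W, labeled_idx, labels, n_classes):
    """Classical graph Laplace learning via harmonic extension.

    W           : (n, n) symmetric nonnegative weight (similarity) matrix.
    labeled_idx : indices of labeled nodes.
    labels      : class index for each labeled node.
    Returns the predicted class for every node.
    """
    n = W.shape[0]
    D = np.diag(W.sum(axis=1))
    L = D - W  # unnormalized graph Laplacian
    unlabeled = np.setdiff1d(np.arange(n), labeled_idx)

    # One-hot encode the known labels.
    F = np.zeros((n, n_classes))
    F[labeled_idx, labels] = 1.0

    # Harmonic extension: solve L_uu u = -L_ul f_l on the unlabeled block.
    L_uu = L[np.ix_(unlabeled, unlabeled)]
    L_ul = L[np.ix_(unlabeled, labeled_idx)]
    F[unlabeled] = np.linalg.solve(L_uu, -L_ul @ F[labeled_idx])

    return F.argmax(axis=1)
```

On a toy graph with two tightly connected clusters and one labeled node per cluster, the solver assigns each unlabeled node the class of its cluster. The paper's contribution is to make sense of the analogous p-Dirichlet energy and its continuum limit when the data points are themselves probability distributions in Wasserstein space.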

Page Count
46 pages

Category
Computer Science:
Machine Learning (CS)