Deep Gaussian Processes for Functional Maps

Published: October 24, 2025 | arXiv ID: 2510.22068v1

By: Matthew Lowery, Zhitong Xu, Da Long, and more

Potential Business Impact:

Accurately predicts complex patterns, with calibrated uncertainty, from noisy, sparse, and irregularly sampled data.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Learning mappings between functional spaces, also known as function-on-function regression, plays a crucial role in functional data analysis and has broad applications, e.g., spatiotemporal forecasting, curve prediction, and climate modeling. Existing approaches, such as functional linear models and neural operators, either fall short of capturing complex nonlinearities or lack reliable uncertainty quantification under noisy, sparse, and irregularly sampled data. To address these issues, we propose Deep Gaussian Processes for Functional Maps (DGPFM). Our method designs a sequence of GP-based linear and nonlinear transformations, leveraging integral transforms of kernels, GP interpolation, and nonlinear activations sampled from GPs. A key insight simplifies implementation: under fixed locations, discrete approximations of kernel integral transforms collapse into direct functional integral transforms, enabling flexible incorporation of various integral transform designs. To achieve scalable probabilistic inference, we use inducing points and whitening transformations to develop a variational learning algorithm. Empirical results on real-world and PDE benchmark datasets demonstrate the advantage of DGPFM in both predictive performance and uncertainty calibration.
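
To make the "kernel integral transform at fixed locations" idea concrete, below is a minimal sketch of one linear building block: a discretized integral transform v(y) = ∫ K(y, x) u(x) dx evaluated with quadrature weights on a fixed, possibly irregular input grid. This is an illustrative assumption, not the authors' implementation; the kernel choice (`rbf_kernel`), the trapezoid-style weights, and all function names are hypothetical, and the full DGPFM additionally stacks such transforms with GP-sampled nonlinearities, inducing points, and variational inference, which this sketch omits.

```python
import numpy as np

def rbf_kernel(x, y, lengthscale=0.2):
    """Squared-exponential kernel between 1-D location arrays x and y."""
    d = x[:, None] - y[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def kernel_integral_transform(u, x_in, x_out, lengthscale=0.2):
    """Discretized integral transform v(y) ~= sum_i w_i K(y, x_i) u(x_i),
    using crude quadrature weights at the fixed input locations x_in."""
    w = np.gradient(x_in)                      # spacing-based quadrature weights
    K = rbf_kernel(x_out, x_in, lengthscale)   # (n_out, n_in) kernel matrix
    return K @ (w * u)

# Toy usage: map a noisy function sampled on an irregular grid
# to a smoothed output function on a regular grid.
rng = np.random.default_rng(0)
x_in = np.sort(rng.uniform(0.0, 1.0, size=40))   # irregular sample locations
u = np.sin(2 * np.pi * x_in) + 0.1 * rng.standard_normal(40)
x_out = np.linspace(0.0, 1.0, 100)
v = kernel_integral_transform(u, x_in, x_out)
print(v.shape)  # (100,)
```

Because the input locations are fixed, the quadrature weights and kernel matrix can be precomputed once, which is the practical payoff of the paper's observation that the discrete approximation reduces to a direct functional integral transform.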

Country of Origin
🇺🇸 United States

Page Count
23 pages

Category
Computer Science:
Machine Learning (CS)