Deep Gaussian Processes for Functional Maps
By: Matthew Lowery, Zhitong Xu, Da Long, and more
Potential Business Impact:
Predicts complex patterns in messy data accurately.
Learning mappings between functional spaces, also known as function-on-function regression, plays a crucial role in functional data analysis and has broad applications, e.g., spatiotemporal forecasting, curve prediction, and climate modeling. Existing approaches, such as functional linear models and neural operators, either fall short of capturing complex nonlinearities or lack reliable uncertainty quantification under noisy, sparse, and irregularly sampled data. To address these issues, we propose Deep Gaussian Processes for Functional Maps (DGPFM). Our method designs a sequence of GP-based linear and nonlinear transformations, leveraging integral transforms of kernels, GP interpolation, and nonlinear activations sampled from GPs. A key insight simplifies implementation: under fixed locations, discrete approximations of kernel integral transforms collapse into direct functional integral transforms, enabling flexible incorporation of various integral transform designs. To achieve scalable probabilistic inference, we use inducing points and whitening transformations to develop a variational learning algorithm. Empirical results on real-world and PDE benchmark datasets demonstrate the advantage of DGPFM in both predictive performance and uncertainty calibration.
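The core building block the abstract describes, a kernel integral transform over fixed sample locations followed by a nonlinear activation, can be sketched in a few lines. This is an illustrative NumPy sketch, not the authors' implementation: the RBF kernel, the quadrature weights, and the `tanh` standing in for a GP-sampled activation are all assumptions made for the example.

```python
import numpy as np

def rbf_kernel(x, y, lengthscale=0.2):
    # Squared-exponential kernel between location grids x (n,) and y (m,).
    return np.exp(-0.5 * (x[:, None] - y[None, :]) ** 2 / lengthscale ** 2)

def integral_transform_layer(f_vals, locs, lengthscale=0.2):
    """One discretized kernel integral transform on a fixed grid:
    (Tf)(x_i) ≈ sum_j w_j k(x_i, y_j) f(y_j), with grid-spacing
    quadrature weights w_j, followed by a pointwise nonlinearity
    (tanh here is a stand-in for an activation sampled from a GP)."""
    w = np.gradient(locs)                  # per-point quadrature weights
    K = rbf_kernel(locs, locs, lengthscale)
    g = K @ (w * f_vals)                   # linear integral transform
    return np.tanh(g)                      # nonlinear activation

# Stacking two such layers maps an input function, sampled at fixed
# locations, to an output function on the same grid.
locs = np.linspace(0.0, 1.0, 64)
f = np.sin(2 * np.pi * locs)
h = integral_transform_layer(f, locs)
out = integral_transform_layer(h, locs)
print(out.shape)  # (64,)
```

The "fixed locations" insight from the abstract shows up here: because `locs` is shared across layers, the kernel matrix and weights can be precomputed once, so each layer reduces to a plain matrix-vector product.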
Similar Papers
Deep Jump Gaussian Processes for Surrogate Modeling of High-Dimensional Piecewise Continuous Functions
Machine Learning (CS)
Helps computers learn complex patterns faster.
Functional Mean Flow in Hilbert Space
Machine Learning (CS)
Creates new data like pictures or sounds quickly.
Robust, Online, and Adaptive Decentralized Gaussian Processes
Machine Learning (Stat)
Makes computer models work better with messy data.