Distance-informed Neural Processes
By: Aishwarya Venkataramanan, Joachim Denzler
Potential Business Impact:
Helps computers know when they are unsure.
We propose the Distance-informed Neural Process (DNP), a novel variant of Neural Processes that improves uncertainty estimation by combining global and distance-aware local latent structures. Standard Neural Processes (NPs) often rely on a single global latent variable and struggle with uncertainty calibration and capturing local data dependencies. DNP addresses these limitations by pairing a global latent variable, which models task-level variations, with a local latent variable that captures input similarity within a distance-preserving latent space. This is achieved through bi-Lipschitz regularization, which bounds distortions in input relationships and encourages the preservation of relative distances in the latent space. This modeling approach allows DNP to produce better-calibrated uncertainty estimates and more effectively distinguish in- from out-of-distribution data. Empirical results demonstrate that DNP achieves strong predictive performance and improved uncertainty calibration across regression and classification tasks.
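The distance-preservation idea is straightforward to sketch in code. Below is a minimal, hypothetical PyTorch illustration (not the authors' implementation) of a soft bi-Lipschitz penalty: it compares pairwise distances in input space and latent space, and penalizes latent distances that fall outside assumed bounds of the form [L_LOWER * d_x, L_UPPER * d_x]. The encoder architecture, the bound values, and all names here are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of a soft bi-Lipschitz penalty for a latent encoder.
# Assumptions (not from the paper): the encoder is a toy MLP, the bounds
# L_LOWER/L_UPPER are arbitrary, and the penalty is a simple hinge loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

L_LOWER, L_UPPER = 0.5, 2.0  # assumed bi-Lipschitz bounds for illustration

class Encoder(nn.Module):
    """Toy MLP encoder mapping inputs to a latent space."""
    def __init__(self, in_dim=2, latent_dim=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, latent_dim)
        )

    def forward(self, x):
        return self.net(x)

def bilipschitz_penalty(x, z, lower=L_LOWER, upper=L_UPPER):
    """Penalize pairwise latent distances outside [lower*d_x, upper*d_x],
    encouraging the latent space to preserve relative input distances."""
    d_x = torch.cdist(x, x)  # pairwise distances in input space
    d_z = torch.cdist(z, z)  # pairwise distances in latent space
    too_small = F.relu(lower * d_x - d_z)  # contraction below lower bound
    too_large = F.relu(d_z - upper * d_x)  # expansion above upper bound
    return (too_small + too_large).mean()

# Usage: compute the penalty on a batch and add it to the task loss.
enc = Encoder()
x = torch.randn(16, 2)
z = enc(x)
reg = bilipschitz_penalty(x, z)
print(f"bi-Lipschitz penalty: {reg.item():.4f}")
```

In practice such a penalty would be weighted and added to the NP training objective, so that points close (or far) in input space stay correspondingly close (or far) in the local latent space, which is what makes distance-based uncertainty and out-of-distribution separation possible.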
Similar Papers
Neural Bridge Processes
Machine Learning (CS)
Helps computers learn complex patterns from messy data.
Uncertainty-aware Accurate Elevation Modeling for Off-road Navigation via Neural Processes
Robotics
Helps robots drive safely over bumpy ground.