Score: 1

Distance-informed Neural Processes

Published: August 26, 2025 | arXiv ID: 2508.18903v1

By: Aishwarya Venkataramanan, Joachim Denzler

Potential Business Impact:

Helps AI systems reliably signal when they are unsure of a prediction, so low-confidence outputs can be flagged for review.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

We propose the Distance-informed Neural Process (DNP), a novel variant of Neural Processes that improves uncertainty estimation by combining global and distance-aware local latent structures. Standard Neural Processes (NPs) often rely on a global latent variable and struggle with uncertainty calibration and capturing local data dependencies. DNP addresses these limitations by introducing a global latent variable to model task-level variations and a local latent variable to capture input similarity within a distance-preserving latent space. This is achieved through bi-Lipschitz regularization, which bounds distortions in input relationships and encourages the preservation of relative distances in the latent space. This modeling approach allows DNP to produce better-calibrated uncertainty estimates and more effectively distinguish in- from out-of-distribution data. Empirical results demonstrate that DNP achieves strong predictive performance and improved uncertainty calibration across regression and classification tasks.
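The bi-Lipschitz idea can be made concrete with a small sketch. The snippet below is a hypothetical PyTorch illustration, not the authors' released code: it softly penalizes pairwise latent distances that expand beyond a bound L or contract below 1/L relative to the corresponding input distances, which is one common way to encourage distance preservation. The names bi_lipschitz_penalty, lip_bound, and lambda_reg are assumptions made for this example.

import torch

def bi_lipschitz_penalty(x, z, lip_bound=2.0, eps=1e-8):
    # x: (batch, d_in) inputs; z: (batch, d_latent) latent encodings of x.
    dx = torch.cdist(x, x)   # pairwise distances in input space
    dz = torch.cdist(z, z)   # pairwise distances in latent space
    # Skip the diagonal, where both distances are zero.
    mask = ~torch.eye(x.size(0), dtype=torch.bool, device=x.device)
    ratio = dz[mask] / (dx[mask] + eps)
    # Penalize expansion beyond lip_bound and contraction below 1/lip_bound.
    over = torch.clamp(ratio - lip_bound, min=0.0)
    under = torch.clamp(1.0 / lip_bound - ratio, min=0.0)
    return (over ** 2 + under ** 2).mean()

# Example: add the penalty to a Neural Process training objective.
x = torch.randn(32, 4)             # toy context inputs
encoder = torch.nn.Linear(4, 8)    # stand-in for the local latent encoder
z = encoder(x)
reg = bi_lipschitz_penalty(x, z)
# total_loss = np_loss + lambda_reg * reg   (lambda_reg is a tuning weight)

A soft penalty like this encourages, rather than guarantees, bi-Lipschitz behavior; the paper's actual regularization scheme may differ (spectral normalization of layers, for instance, is another common way to enforce Lipschitz bounds).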

Country of Origin
🇩🇪 Germany

Repos / Data Links
None listed

Page Count
22 pages

Category
Computer Science:
Machine Learning (CS)