Neural Variational Dropout Processes

Published: October 22, 2025 | arXiv ID: 2510.19425v1

By: Insu Jeon, Youngjin Park, Gunhee Kim

Potential Business Impact:

Enables models to adapt to new tasks from only a few examples, cutting data and retraining costs.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Learning to infer the conditional posterior model is a key step toward robust meta-learning. This paper presents a new Bayesian meta-learning approach called Neural Variational Dropout Processes (NVDPs). NVDPs model the conditional posterior distribution via task-specific dropout; a low-rank product-of-Bernoulli-experts meta-model provides a memory-efficient mapping from a few observed context points to dropout rates. This allows quick reconfiguration of a globally learned, shared neural network for new tasks in multi-task few-shot learning. In addition, NVDPs utilize a novel prior conditioned on the whole task data to optimize the conditional dropout posterior in amortized variational inference. Surprisingly, this enables robust approximation of task-specific dropout rates that can handle a wide range of functional ambiguities and uncertainties. We compared the proposed method with other meta-learning approaches on few-shot learning tasks such as 1D stochastic regression, image inpainting, and classification. The results show that NVDPs achieve excellent performance.
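
The abstract's "prior conditioned on the whole task data" points to a neural-process-style amortized objective. As a worked sketch (not necessarily the paper's exact objective), with context set $C$, full task data $C \cup T$, and task-specific dropout variables $z$, the conditional dropout posterior $q_\phi(z \mid C)$ would be fit against a data-conditioned prior:

$$\mathcal{L} = \mathbb{E}_{q_\phi(z \mid C)}\!\left[\log p_\theta(y_T \mid x_T, z)\right] \;-\; \mathrm{KL}\!\left(q_\phi(z \mid C)\,\|\,p_\phi(z \mid C \cup T)\right)$$

The low-rank mapping from contexts to dropout rates can likewise be sketched in code. This is a minimal, hypothetical PyTorch illustration, not the authors' implementation: the DeepSets-style encoder, the rank, the factorization of the logits, and the relaxed-Bernoulli sampling are all illustrative choices.

```python
import torch
import torch.nn as nn

class LowRankDropoutRatePredictor(nn.Module):
    """Sketch: map a few (x, y) context pairs to per-unit keep
    probabilities through low-rank factors, one hypothetical reading
    of the 'low-rank product of Bernoulli experts' meta-model."""

    def __init__(self, x_dim, y_dim, hidden, n_units, rank=4):
        super().__init__()
        # Permutation-invariant context encoder (DeepSets-style mean pooling).
        self.encoder = nn.Sequential(
            nn.Linear(x_dim + y_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden),
        )
        # Task embedding -> rank-r factor; a shared (rank x n_units)
        # factor replaces a full task-to-units map for memory efficiency.
        self.to_u = nn.Linear(hidden, rank)
        self.v = nn.Parameter(0.01 * torch.randn(rank, n_units))
        self.bias = nn.Parameter(torch.zeros(n_units))

    def forward(self, ctx_x, ctx_y):
        # ctx_x: (n_ctx, x_dim), ctx_y: (n_ctx, y_dim)
        h = self.encoder(torch.cat([ctx_x, ctx_y], dim=-1)).mean(dim=0)
        logits = self.to_u(h) @ self.v + self.bias   # (n_units,)
        return torch.sigmoid(logits)                 # per-unit keep prob.

def apply_task_dropout(activations, keep_prob, temp=0.1):
    """Gate activations with a relaxed (Concrete) Bernoulli sample so
    gradients flow to the predicted rates; illustrative, not the
    paper's estimator."""
    p = keep_prob.clamp(1e-6, 1 - 1e-6)
    u = torch.rand_like(activations).clamp(1e-6, 1 - 1e-6)
    gate = torch.sigmoid(
        (p.log() - (1 - p).log() + u.log() - (1 - u).log()) / temp
    )
    return activations * gate
```

At test time, a new task's few context points yield dropout rates in a single forward pass, reconfiguring the shared network without gradient-based adaptation; this is the "quick reconfiguration" the abstract describes.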

Country of Origin
🇰🇷 Korea, Republic of

Page Count
21 pages

Category
Computer Science:
Machine Learning (CS)