Neural Variational Dropout Processes
By: Insu Jeon, Youngjin Park, Gunhee Kim
Potential Business Impact:
Teaches computers to learn new tasks faster.
Learning to infer the conditional posterior model is a key step toward robust meta-learning. This paper presents a new Bayesian meta-learning approach called Neural Variational Dropout Processes (NVDPs). NVDPs model the conditional posterior distribution via task-specific dropout; a low-rank product-of-Bernoulli-experts meta-model provides a memory-efficient mapping from a few observed contexts to dropout rates. This allows quick reconfiguration of a globally learned, shared neural network for new tasks in multi-task few-shot learning. In addition, NVDPs employ a novel prior conditioned on the whole task data to optimize the conditional dropout posterior under amortized variational inference. Surprisingly, this enables robust approximation of task-specific dropout rates that can handle a wide range of functional ambiguities and uncertainties. We compared the proposed method with other meta-learning approaches on few-shot learning tasks such as 1D stochastic regression, image inpainting, and classification. The results show the excellent performance of NVDPs.
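The core mechanism described in the abstract, mapping a small context set to per-unit dropout rates that gate a globally shared network, can be sketched in a few lines of PyTorch. The sketch below is a minimal illustration, not the authors' exact architecture: the module names, dimensions, mean aggregation over the context set, relaxed-Bernoulli gating, and temperature value are all illustrative assumptions.

```python
# Minimal sketch of conditional task-specific dropout (assumptions throughout:
# names, dimensions, and gating details are illustrative, not NVDPs' exact design).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConditionalDropout(nn.Module):
    """Maps a few (x, y) context pairs to per-unit dropout keep-rates.

    A low-rank factorization keeps the mapping memory-efficient: instead of
    predicting one rate per parameter, the context embedding is mapped to
    row and column factors whose product parameterizes a Bernoulli rate
    for each hidden unit.
    """

    def __init__(self, x_dim, y_dim, hidden, rank=4):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(x_dim + y_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden),
        )
        self.row = nn.Linear(hidden, hidden * rank)  # low-rank row factors
        self.col = nn.Linear(hidden, rank)           # low-rank column factors
        self.rank = rank
        self.hidden = hidden

    def forward(self, ctx_x, ctx_y):
        # Permutation-invariant aggregation over the context set (mean).
        e = self.encoder(torch.cat([ctx_x, ctx_y], dim=-1)).mean(dim=0)
        rows = self.row(e).view(self.hidden, self.rank)
        cols = self.col(e)                  # shape: (rank,)
        logits = rows @ cols                # low-rank product -> (hidden,)
        return torch.sigmoid(logits)        # keep-probabilities in (0, 1)


class SharedRegressor(nn.Module):
    """Globally shared network whose hidden units are gated per task."""

    def __init__(self, x_dim=1, y_dim=1, hidden=64):
        super().__init__()
        self.fc1 = nn.Linear(x_dim, hidden)
        self.fc2 = nn.Linear(hidden, y_dim)
        self.cond_drop = ConditionalDropout(x_dim, y_dim, hidden)

    def forward(self, x, ctx_x, ctx_y):
        keep = self.cond_drop(ctx_x, ctx_y)  # task-specific keep rates
        h = F.relu(self.fc1(x))
        # A relaxed (Concrete) Bernoulli keeps the gates differentiable for
        # amortized variational training; the 0.1 temperature is an assumption.
        gate = torch.distributions.RelaxedBernoulli(
            temperature=torch.tensor(0.1), probs=keep).rsample(h.shape[:-1])
        return self.fc2(h * gate)


# Usage on a toy 1D regression task: 5 context points adapt the shared net.
model = SharedRegressor()
ctx_x = torch.randn(5, 1)
ctx_y = torch.sin(ctx_x)
query = torch.linspace(-2, 2, 20).unsqueeze(-1)
pred = model(query, ctx_x, ctx_y)
print(pred.shape)  # torch.Size([20, 1])
```

In a full treatment, the keep-rate posterior would be trained with an evidence lower bound against the task-conditioned prior the paper describes; the sketch only shows the forward pass.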
Similar Papers
Neural Bridge Processes
Machine Learning (CS)
Helps computers learn complex patterns from messy data.
Federated Learning via Meta-Variational Dropout
Machine Learning (CS)
Helps AI learn from private data better.
Variational Adaptive Noise and Dropout towards Stable Recurrent Neural Networks
Machine Learning (CS)
Teaches robots to learn and copy actions.