Score: 3

VIKING: Deep variational inference with stochastic projections

Published: October 27, 2025 | arXiv ID: 2510.23684v1

By: Samuel G. Fadel, Hrittik Roy, Nicholas Krämer, and more

Potential Business Impact:

Makes AI models' predictions more accurate and their confidence estimates more trustworthy.

Business Areas:
A/B Testing, Data and Analytics

Variational mean field approximations tend to struggle with contemporary overparametrized deep neural networks. Where a Bayesian treatment is usually associated with high-quality predictions and uncertainties, the practical reality has been the opposite, with unstable training, poor predictive power, and subpar calibration. Building upon recent work on reparametrizations of neural networks, we propose a simple variational family that considers two independent linear subspaces of the parameter space. These represent functional changes inside and outside the support of the training data. This allows us to build a fully correlated approximate posterior that reflects the overparametrization and is controlled by easy-to-interpret hyperparameters. We develop scalable numerical routines that maximize the associated evidence lower bound (ELBO) and sample from the approximate posterior. Empirically, we observe state-of-the-art performance across tasks, models, and datasets compared to a wide array of baseline methods. Our results show that approximate Bayesian inference applied to deep neural networks is far from a lost cause when the inference mechanism reflects the geometry of reparametrizations.
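
To make the general recipe concrete, here is a minimal sketch of the idea as described in the abstract: a Gaussian variational posterior over coefficients in two fixed, independent linear subspaces of the parameter space, trained by maximizing a reparameterized ELBO on a toy linear model. The subspace construction, variable names, and hyperparameters below are illustrative assumptions; the paper's actual stochastic projections and scalable numerical routines are not reproduced here.

```python
# Hypothetical sketch (not the authors' code): Gaussian variational inference
# over two independent linear subspaces of the parameter space, optimized by
# maximizing a reparameterized ELBO. All names and choices are illustrative.
import torch

torch.manual_seed(0)

# Toy regression data and a stand-in MAP / pretrained parameter vector.
X = torch.randn(128, 5)
true_w = torch.randn(5, 1)
y = X @ true_w + 0.1 * torch.randn(128, 1)
d = 5
theta_map = torch.zeros(d)

# Two fixed orthonormal bases for "in-support" and "out-of-support" directions.
# Here they come from a random split; the paper derives them from projections.
Q, _ = torch.linalg.qr(torch.randn(d, d))
U_in, U_out = Q[:, :3], Q[:, 3:]

# Independent diagonal-Gaussian variational factors over the two coefficient vectors.
mu_in = torch.zeros(3, requires_grad=True)
mu_out = torch.zeros(2, requires_grad=True)
log_s_in = torch.zeros(3, requires_grad=True)
log_s_out = torch.zeros(2, requires_grad=True)
opt = torch.optim.Adam([mu_in, mu_out, log_s_in, log_s_out], lr=1e-2)

def sample_theta():
    """Reparameterized sample: theta = theta_map + U_in z_in + U_out z_out."""
    z_in = mu_in + torch.exp(log_s_in) * torch.randn(3)
    z_out = mu_out + torch.exp(log_s_out) * torch.randn(2)
    return theta_map + U_in @ z_in + U_out @ z_out

def kl(mu, log_s):
    # KL( N(mu, s^2) || N(0, 1) ) for a diagonal Gaussian factor.
    return 0.5 * (mu ** 2 + torch.exp(2 * log_s) - 2 * log_s - 1).sum()

def elbo():
    theta = sample_theta()
    pred = X @ theta.unsqueeze(1)
    log_lik = -0.5 * ((y - pred) ** 2 / 0.01).sum()  # Gaussian likelihood, sigma=0.1
    return log_lik - kl(mu_in, log_s_in) - kl(mu_out, log_s_out)

for step in range(500):
    opt.zero_grad()
    loss = -elbo()          # maximize ELBO = minimize its negative
    loss.backward()
    opt.step()

print("posterior scales (in / out):",
      torch.exp(log_s_in).detach(), torch.exp(log_s_out).detach())
```

Predictive samples then come from drawing `sample_theta()` repeatedly and pushing each draw through the model; in this sketch the two subspaces carry separate scale hyperparameters, mirroring the abstract's distinction between functional changes inside and outside the support of the training data.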

Country of Origin
🇩🇰 🇬🇧 Denmark, United Kingdom

Repos / Data Links

Page Count
26 pages

Category
Statistics: Machine Learning (stat.ML)