Tubular Riemannian Laplace Approximations for Bayesian Neural Networks
By: Rodrigo Pereira David
Laplace approximations are among the simplest and most practical methods for approximate Bayesian inference in neural networks, yet their Euclidean formulation struggles with the highly anisotropic, curved loss surfaces and large symmetry groups that characterize modern deep models. Recent work has proposed Riemannian and geometric Gaussian approximations that adapt to this structure. Building on these ideas, we introduce the Tubular Riemannian Laplace (TRL) approximation. TRL explicitly models the posterior as a probabilistic tube that follows a low-loss valley induced by functional symmetries, using a Fisher/Gauss-Newton metric to separate prior-dominated tangential uncertainty from data-dominated transverse uncertainty. We interpret TRL as a scalable reparametrized Gaussian approximation that relies on implicit curvature estimates to remain tractable in high-dimensional parameter spaces. Our empirical evaluation on ResNet-18 (CIFAR-10 and CIFAR-100) shows that TRL matches or exceeds the calibration of Deep Ensembles, as measured by expected calibration error (ECE), while requiring only a fifth of the training cost. TRL thus bridges the gap between single-model efficiency and ensemble-grade reliability.
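As a rough illustration of the tangential/transverse split described in the abstract, the minimal numpy sketch below (not the authors' implementation; the toy Jacobian J, the isotropic prior precision, and the eigenvalue threshold are illustrative assumptions) builds a Gauss-Newton-style curvature G = J^T J, treats its near-zero eigendirections as the prior-dominated tangential (valley) subspace, and assigns data-dominated variances 1/(eigenvalue + prior precision) to the transverse directions.

import numpy as np

# Minimal sketch (toy example, not the paper's implementation): a "tube-like"
# Laplace approximation on a quadratic loss. The Gauss-Newton/Fisher curvature
# G is eigendecomposed; near-zero eigendirections (the valley / symmetry
# directions) receive prior-dominated variance, while high-curvature
# transverse directions receive data-dominated variance 1/(eigval + prior).

rng = np.random.default_rng(0)

d = 5                                   # toy parameter dimension
theta_map = rng.normal(size=d)          # stand-in for the MAP estimate
J = rng.normal(size=(3, d))             # toy rank-3 Jacobian -> 2 flat directions
G = J.T @ J                             # Gauss-Newton / Fisher-style curvature
prior_prec = 1.0                        # isotropic Gaussian prior precision

eigvals, eigvecs = np.linalg.eigh(G)
flat = eigvals < 1e-8                   # tangential (valley) directions

# Per-direction posterior variance in the eigenbasis:
#   tangential  -> 1 / prior_prec              (prior-dominated)
#   transverse  -> 1 / (eigval + prior_prec)   (data-dominated)
post_var = np.where(flat, 1.0 / prior_prec, 1.0 / (eigvals + prior_prec))

def sample_posterior(n):
    """Draw parameter samples from the factorized Gaussian in the eigenbasis."""
    z = rng.normal(size=(n, d)) * np.sqrt(post_var)
    return theta_map + z @ eigvecs.T

samples = sample_posterior(1000)
print("empirical variances in the eigenbasis:",
      np.var((samples - theta_map) @ eigvecs, axis=0).round(3))

At ResNet scale, the explicit eigendecomposition used above would of course be infeasible; as the abstract notes, the method instead relies on implicit curvature estimates to operate in high-dimensional parameter spaces.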