Branching Stein Variational Gradient Descent for sampling multimodal distributions
By: Isaías Bañales, Arturo Jaramillo, Joshué Helí Ricalde-Guerrero
Potential Business Impact:
Helps computers find hidden patterns in complex data.
We propose a novel particle-based variational inference method designed to work with multimodal distributions. Our approach, referred to as Branching Stein Variational Gradient Descent (BSVGD), extends the classical Stein Variational Gradient Descent (SVGD) algorithm by incorporating a random branching mechanism that encourages exploration of the state space. We present a theoretical guarantee of convergence in distribution, together with numerical experiments validating the suitability of our algorithm. Performance comparisons between BSVGD and SVGD are reported using the Wasserstein distance between samples and the corresponding computation times.
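To make the idea concrete, below is a minimal sketch of the classical SVGD update that BSVGD builds on, plus a stand-in branching step. The SVGD update rule (RBF kernel, kernelized gradient of the log-target plus a repulsion term) is standard; the `branch` function here is purely illustrative, since the abstract does not specify the actual branching mechanism. The target (a 1D standard Gaussian), the step size, the kernel bandwidth, and the branching probability are all assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_log_p(x):
    # Score function of a 1D standard Gaussian target (illustrative choice).
    return -x

def rbf_kernel(x, h=1.0):
    # Pairwise RBF kernel k(x_i, x_j) and its gradient with respect to x_i.
    diff = x[:, None] - x[None, :]          # shape (n, n)
    K = np.exp(-diff**2 / (2 * h**2))
    dK = -diff / h**2 * K                   # d k(x_i, x_j) / d x_i
    return K, dK

def svgd_step(x, eps=0.1):
    # Classical SVGD direction:
    # phi(x_j) = (1/n) sum_i [ k(x_i, x_j) grad_log_p(x_i) + d/dx_i k(x_i, x_j) ]
    K, dK = rbf_kernel(x)
    phi = (K * grad_log_p(x)[:, None]).mean(axis=0) + dK.mean(axis=0)
    return x + eps * phi

def branch(x, p=0.05):
    # Hypothetical branching step: each particle duplicates with probability p,
    # and offspring are jittered to explore new regions. This is only a
    # placeholder for BSVGD's actual random branching mechanism.
    mask = rng.random(x.shape[0]) < p
    offspring = x[mask] + rng.normal(scale=0.5, size=int(mask.sum()))
    return np.concatenate([x, offspring])

# Particles initialized away from the target mode, then evolved with
# interleaved SVGD updates and occasional branching.
x = rng.normal(loc=5.0, scale=0.5, size=50)
for t in range(200):
    x = svgd_step(x)
    if t % 50 == 0:
        x = branch(x)
```

After the loop, the particle cloud has drifted toward the target mode at 0 while the kernel repulsion keeps the particles spread out; branching lets the population grow and probe regions the deterministic flow alone might miss.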
Similar Papers
Towards understanding Accelerated Stein Variational Gradient Flow -- Analysis of Generalized Bilinear Kernels for Gaussian target distributions
Optimization and Control
Makes computer learning faster and better.
Adaptive Kernel Selection for Stein Variational Gradient Descent
Machine Learning (Stat)
Improves computer learning by smarter guessing.
Stochastic gradient descent based variational inference for infinite-dimensional inverse problems
Numerical Analysis
Helps computers solve tricky math problems faster.