Uncertainty quantification of neural network models of evolving processes via Langevin sampling

Published: April 21, 2025 | arXiv ID: 2504.14854v2

By: Cosmin Safta, Reese E. Jones, Ravi G. Patel, and more

Potential Business Impact:

Helps predict how history-dependent processes, such as chemical reactions and material behavior, evolve over time, with quantified uncertainty in the predictions.

Business Areas:
Simulation Software

We propose a scalable, approximate inference hypernetwork framework for a general model of history-dependent processes. The flexible data model is based on a neural ordinary differential equation (NODE) representing the evolution of internal states, together with a trainable observation model subcomponent. The posterior distribution over the data model parameters (weights and biases) follows a stochastic differential equation with a drift term related to the score of the posterior, which is learned jointly with the data model parameters. This Langevin sampling approach offers flexibility in balancing the computational budget between the evaluation cost of the data model and the approximation of the posterior density of its parameters. We demonstrate the performance of the ensemble sampling hypernetwork on chemical reaction and material physics data and compare it to standard variational inference.
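The core idea of sampling a parameter posterior via Langevin dynamics can be illustrated with a minimal sketch. This is not the paper's hypernetwork or NODE framework: it uses a hypothetical linear-in-parameters surrogate model and an analytically known score (gradient of the log posterior), whereas the paper learns the score jointly with the data model. All variable names and hyperparameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy observation data standing in for an evolving-process dataset.
x = np.linspace(0.0, np.pi, 50)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)

# Hypothetical linear-in-parameters surrogate (not the paper's NODE):
#   y ≈ theta[0]*x + theta[1]*x**2 + theta[2]
F = np.stack([x, x**2, np.ones_like(x)], axis=1)

def grad_log_posterior(theta, sigma2=0.01, prior_var=10.0):
    """Score of the posterior: Gaussian likelihood plus Gaussian prior."""
    resid = y - F @ theta
    return F.T @ resid / sigma2 - theta / prior_var

# Unadjusted Langevin dynamics on the parameters:
#   theta_{k+1} = theta_k + eps * score(theta_k) + sqrt(2*eps) * xi_k
eps = 5e-6                       # step size chosen for stability of this toy problem
theta = np.zeros(3)
samples = []
for k in range(20_000):
    theta = (theta
             + eps * grad_log_posterior(theta)
             + np.sqrt(2.0 * eps) * rng.standard_normal(3))
    if k >= 10_000:              # discard burn-in
        samples.append(theta.copy())

samples = np.asarray(samples)
post_mean = samples.mean(axis=0)   # ensemble estimate of the parameters
post_std = samples.std(axis=0)     # parameter uncertainty from the samples
```

The retained samples approximate draws from the posterior, so parameter uncertainty propagates directly to predictive uncertainty by evaluating the model over the ensemble; the paper's contribution is making this tractable for NODE weights by learning the score term rather than computing it exactly.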

Page Count
23 pages

Category
Computer Science:
Machine Learning (CS)