Conditional Normalizing Flows for Forward and Backward Joint State and Parameter Estimation
By: Luke S. Lagunowich, Guoxiang Grayson Tong, Daniele E. Schiavazzi
Potential Business Impact:
Helps self-driving cars and doctors make better predictions under uncertainty.
Traditional filtering algorithms for state estimation, such as classical Kalman filtering, unscented Kalman filtering, and particle filters, show degraded performance when applied to nonlinear systems whose uncertainty follows arbitrary non-Gaussian and potentially multi-modal distributions. This study reviews recent approaches to state estimation via nonlinear filtering based on conditional normalizing flows, where the conditional embedding is generated by standard MLP architectures, transformers, or selective state-space models such as Mamba-SSM. In addition, we test the effectiveness of an optimal-transport-inspired kinetic loss term in mitigating overparameterization in flows consisting of a large collection of transformations. We investigate the performance of these approaches on applications relevant to autonomous driving and patient population dynamics, paying special attention to how they handle time inversion and chained predictions. Finally, we assess the performance of various conditioning strategies on a real-world COVID-19 application involving joint forecasting and parameter estimation for an SIR system.
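To make the general idea concrete, below is a minimal illustrative sketch, not the authors' implementation, of a conditional normalizing flow in PyTorch: a stack of affine layers whose scale and shift are produced by an MLP embedding of the observation (the abstract notes this conditioner could equally be a transformer or Mamba-SSM encoder), trained with a negative log-likelihood plus a kinetic, transport-cost-style penalty on the displacement each layer introduces, in the spirit of the optimal-transport-inspired loss term mentioned above. All names here (ConditionalAffineFlow, kinetic_weight, loss_fn) are hypothetical, and the exact form of the paper's kinetic term is not reproduced.

```python
# Sketch of a conditional affine flow with a kinetic regularization term.
# Assumptions: PyTorch is available; names and the penalty form are illustrative.
import torch
import torch.nn as nn


class ConditionalAffineFlow(nn.Module):
    """Stack of conditional affine layers z -> x, conditioned on an observation y."""

    def __init__(self, dim, cond_dim, n_layers=4, hidden=64):
        super().__init__()
        # One conditioner MLP per layer: maps the observation embedding
        # to a per-dimension log-scale and shift.
        self.conditioners = nn.ModuleList(
            nn.Sequential(
                nn.Linear(cond_dim, hidden),
                nn.ReLU(),
                nn.Linear(hidden, 2 * dim),
            )
            for _ in range(n_layers)
        )

    def forward(self, z, y):
        """Map base samples z to targets x given condition y.

        Returns x, the accumulated log|det J|, and a kinetic term summing the
        squared displacement introduced by each intermediate transformation.
        """
        x, log_det, kinetic = z, 0.0, 0.0
        for cond in self.conditioners:
            log_s, t = cond(y).chunk(2, dim=-1)
            log_s = torch.tanh(log_s)            # keep scales numerically tame
            x_new = x * torch.exp(log_s) + t
            kinetic = kinetic + ((x_new - x) ** 2).sum(dim=-1)
            log_det = log_det + log_s.sum(dim=-1)
            x = x_new
        return x, log_det, kinetic


def loss_fn(flow, x, y, kinetic_weight=1e-2):
    """NLL of x under the conditional flow plus the kinetic penalty.

    The affine layers above have closed-form inverses, so we invert them
    layer by layer to evaluate the base density.
    """
    base = torch.distributions.Normal(0.0, 1.0)
    z, log_det, kinetic = x, 0.0, 0.0
    for cond in reversed(list(flow.conditioners)):
        log_s, t = cond(y).chunk(2, dim=-1)
        log_s = torch.tanh(log_s)
        z_new = (z - t) * torch.exp(-log_s)
        kinetic = kinetic + ((z - z_new) ** 2).sum(dim=-1)
        log_det = log_det + log_s.sum(dim=-1)    # log|det| of the forward map
        z = z_new
    nll = -(base.log_prob(z).sum(dim=-1) - log_det)
    return (nll + kinetic_weight * kinetic).mean()


if __name__ == "__main__":
    dim, cond_dim = 3, 5            # e.g. an SIR-like state/parameter vector and an observation embedding
    flow = ConditionalAffineFlow(dim, cond_dim)
    x = torch.randn(128, dim)       # synthetic target states/parameters
    y = torch.randn(128, cond_dim)  # synthetic observation embeddings
    print(loss_fn(flow, x, y).item())
```

In this sketch the kinetic weight trades likelihood fit against the total displacement across layers, which is one simple way a transport-cost term can discourage an overparameterized stack of transformations from moving samples more than necessary.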
Similar Papers
Contrastive Normalizing Flows for Uncertainty-Aware Parameter Estimation
Data Analysis, Statistics and Probability
Finds hidden clues in science data.
Conditional Normalizing Flow Surrogate for Monte Carlo Prediction of Radiative Properties in Nanoparticle-Embedded Layers
Machine Learning (Stat)
Predicts how light bends and bounces through materials.
Bidirectional Normalizing Flow: From Data to Noise and Back
Machine Learning (CS)
Makes AI create better pictures faster.