Quantifying model prediction sensitivity to model-form uncertainty
By: Teresa Portone, Rebekah D. White, Joseph L. Hart
Potential Business Impact:
Measures how much modeling assumptions affect a model's predictions.
Model-form uncertainty (MFU) in assumptions made during physics-based model development is widely considered a significant source of uncertainty; however, few approaches exist that can quantify MFU in predictions that extrapolate beyond available data. As a result, it is challenging to know how important MFU is in practice, especially relative to other sources of uncertainty in a model, making it difficult to prioritize resources and efforts to drive down error in model predictions. To address these challenges, we present a novel method to quantify the importance of uncertainties associated with model assumptions. We combine parameterized modifications to assumptions (called MFU representations) with grouped variance-based sensitivity analysis to measure the importance of assumptions. We demonstrate how, in contrast to existing methods addressing MFU, our approach can be applied without access to calibration data. However, if calibration data are available, we demonstrate how they can be used to inform the MFU representation, and how variance-based sensitivity analysis can be meaningfully applied even in the presence of dependence between parameters (a common byproduct of calibration).
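The core idea described above, grouped variance-based sensitivity analysis applied to a parameterized MFU representation, can be illustrated with a minimal numerical sketch. The toy model, the form of the assumption modification, and the input distributions below are hypothetical stand-ins and are not the paper's implementation; the sketch uses a standard pick-freeze estimator of grouped first-order Sobol' indices to compare the importance of a group of physical parameters against a group of MFU-representation parameters.

```python
# Minimal sketch (hypothetical model and distributions, not the paper's code):
# grouped first-order Sobol' indices via a pick-freeze Monte Carlo estimator.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000  # Monte Carlo sample size


def model(theta, delta):
    """Toy response: nominal physics term plus a parameterized MFU perturbation."""
    nominal = np.sin(theta[:, 0]) + theta[:, 1] ** 2
    # Hypothetical MFU representation: a parameterized modification to an assumption.
    mfu = delta[:, 0] + delta[:, 1] * np.cos(theta[:, 0])
    return nominal + mfu


def sample_theta(n):
    # Hypothetical distributions for the physical parameters.
    return rng.normal([0.5, 1.0], [0.2, 0.1], size=(n, 2))


def sample_delta(n):
    # Hypothetical distribution for the MFU-representation parameters.
    return rng.uniform(-0.5, 0.5, size=(n, 2))


def grouped_sobol_index(freeze="theta"):
    """Pick-freeze estimate of the first-order Sobol' index of one input group.

    The frozen group is shared between two model evaluations while the
    complementary group is redrawn independently.
    """
    th1, th2 = sample_theta(n), sample_theta(n)
    de1, de2 = sample_delta(n), sample_delta(n)
    if freeze == "theta":
        y1, y2 = model(th1, de1), model(th1, de2)
    else:  # freeze the MFU-representation group
        y1, y2 = model(th1, de1), model(th2, de1)
    y = np.concatenate([y1, y2])
    return (np.mean(y1 * y2) - np.mean(y) ** 2) / np.var(y)


print("S_theta (physical parameters):", grouped_sobol_index("theta"))
print("S_delta (MFU representation): ", grouped_sobol_index("delta"))
```

Each printed index estimates the fraction of output variance attributable to its group alone; comparing them is one way to rank the MFU representation against other uncertainty sources, under the stated hypothetical setup.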
Similar Papers
Uncertainty Quantification for Data-Driven Machine Learning Models in Nuclear Engineering Applications: Where We Are and What Do We Need?
Systems and Control
Shows how sure computers are about their answers.
Uncertainty Quantification in Probabilistic Machine Learning Models: Theory, Methods, and Insights
Machine Learning (Stat)
Helps computers know when they're unsure.