When three experiments are better than two: Avoiding intractable correlated aleatoric uncertainty by leveraging a novel bias--variance tradeoff
By: Paul Scherer, Andreas Kirsch, Jake P. Taylor-King
Potential Business Impact:
Helps computers learn faster with noisy data.
Real-world experimental scenarios are characterized by the presence of heteroskedastic aleatoric uncertainty, and this uncertainty can be correlated in batched settings. The bias--variance tradeoff can be used to write the expected mean squared error between a model distribution and a ground-truth random variable as the sum of an epistemic uncertainty term (the model variance), the bias squared, and an aleatoric uncertainty term (the noise variance). We exploit this relationship to propose novel active learning strategies that directly reduce the bias between experimental rounds, considering model systems both with and without noise. Finally, we investigate methods that use historical data in a quadratic manner through a novel cobias--covariance relationship, which naturally suggests a mechanism for batching via an eigendecomposition strategy. When our difference-based method built on the cobias--covariance relationship is used in a batched setting (with a quadratic estimator), we outperform a number of canonical methods including BALD and Least Confidence.
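The decomposition the abstract relies on can be checked numerically. The sketch below is illustrative only (it is not the paper's method): at a single fixed input, with a Gaussian model distribution and Gaussian observation noise, the empirical mean squared error between independent model and ground-truth samples matches model variance + bias squared + noise variance. All parameter values (`f_true`, `sigma`, `mu_model`, `tau`) are assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground truth at a fixed input x: y ~ N(f_true, sigma^2).
# sigma^2 is the aleatoric (noise) variance; homoskedastic here for simplicity.
f_true = 2.0
sigma = 0.5

# Model distribution at the same input: y_hat ~ N(mu_model, tau^2).
# tau^2 plays the role of the epistemic (model variance) term,
# and (mu_model - f_true) is the bias.
mu_model = 2.3
tau = 0.2

n = 1_000_000
y = f_true + sigma * rng.standard_normal(n)        # ground-truth samples
y_hat = mu_model + tau * rng.standard_normal(n)    # independent model samples

mse = np.mean((y_hat - y) ** 2)
decomposition = tau**2 + (mu_model - f_true) ** 2 + sigma**2

print(f"empirical MSE:           {mse:.4f}")
print(f"tau^2 + bias^2 + sigma^2: {decomposition:.4f}")
```

With these values the analytic sum is 0.04 + 0.09 + 0.25 = 0.38, and the Monte Carlo estimate agrees to within sampling error. The correlated-noise and batched settings the paper studies generalize this scalar identity to covariance and cobias terms.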
Similar Papers
Cooperative Bayesian and variance networks disentangle aleatoric and epistemic uncertainties
Machine Learning (CS)
Helps computers guess better when they're unsure.
The Bias-Variance Tradeoff in Long-Term Experimentation
Methodology
Improves long-term decisions by balancing precision and bias.
Uncertainty Estimation using Variance-Gated Distributions
Machine Learning (CS)
Makes AI more sure about its answers.