Assessing the performance of correlation-based multi-fidelity neural emulators
By: Cristian J. Villatoro, Gianluca Geraci, Daniele E. Schiavazzi
Potential Business Impact:
Builds fast, accurate stand-ins for expensive computer simulations.
Outer-loop tasks such as optimization, uncertainty quantification, and inference can easily become intractable when the underlying high-fidelity model is computationally expensive. Similarly, data-driven architectures typically require large datasets to perform predictive tasks with sufficient accuracy. One approach to mitigating these challenges is the development of multi-fidelity emulators, which leverage inexpensive but potentially biased low-fidelity information while correcting and refining predictions with scarce, accurate high-fidelity data. This study investigates the performance of multi-fidelity neural emulators: neural networks designed to learn an input-to-output mapping by integrating limited high-fidelity data with abundant low-fidelity model solutions. We assess these emulators on low- and high-dimensional functions, functions with oscillatory character or discontinuities, collections of models with equal or dissimilar parametrizations, and possibly large numbers of potentially corrupted low-fidelity sources. In doing so, we consider a wide range of architectural, hyperparameter, and dataset configurations, including networks with differing degrees of spectral bias (Multi-Layer Perceptron, SIREN, and Kolmogorov-Arnold Network), various mechanisms for coordinate encoding, exact or learnable low-fidelity information, and varying training dataset sizes. We further quantify the added value of the multi-fidelity approach by conducting equivalent single-fidelity tests for each case, measuring the performance gains achieved by fusing multiple sources of information.
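To make the idea concrete, here is a minimal sketch of a correlation-based multi-fidelity emulator in PyTorch. It uses the common two-stage design implied by the abstract: a surrogate trained on abundant low-fidelity data, followed by a correction network that receives both the input and the surrogate's output so it can learn the low-to-high-fidelity correlation. The Forrester-style benchmark functions, network sizes, and training settings below are illustrative assumptions, not the paper's actual configuration.

```python
# Minimal sketch of a correlation-based multi-fidelity neural emulator.
# The benchmark functions and hyperparameters are illustrative stand-ins.
import torch
import torch.nn as nn

def f_high(x):
    # Scarce, accurate model (Forrester benchmark, a standard MF test case).
    return (6.0 * x - 2.0) ** 2 * torch.sin(12.0 * x - 4.0)

def f_low(x):
    # Abundant but biased low-fidelity approximation of f_high.
    return 0.5 * f_high(x) + 10.0 * (x - 0.5) - 5.0

def mlp(in_dim, width=64, depth=3, out_dim=1):
    # Plain fully connected network; the paper also studies SIREN and KAN.
    layers, d = [], in_dim
    for _ in range(depth):
        layers += [nn.Linear(d, width), nn.Tanh()]
        d = width
    layers.append(nn.Linear(d, out_dim))
    return nn.Sequential(*layers)

torch.manual_seed(0)

# Stage 1: fit a low-fidelity surrogate on abundant LF samples.
x_lf = torch.rand(200, 1)
net_lf = mlp(1)
opt_lf = torch.optim.Adam(net_lf.parameters(), lr=1e-3)
for _ in range(2000):
    opt_lf.zero_grad()
    loss = nn.functional.mse_loss(net_lf(x_lf), f_low(x_lf))
    loss.backward()
    opt_lf.step()

# Stage 2: learn the LF-to-HF correlation from scarce HF samples.
# The correction network sees (x, LF(x)), so it can capture both
# linear and nonlinear correlations between the two fidelities.
x_hf = torch.linspace(0.0, 1.0, 8).unsqueeze(1)
net_mf = mlp(2)
opt_mf = torch.optim.Adam(net_mf.parameters(), lr=1e-3)
for _ in range(2000):
    opt_mf.zero_grad()
    feats = torch.cat([x_hf, net_lf(x_hf).detach()], dim=1)
    loss = nn.functional.mse_loss(net_mf(feats), f_high(x_hf))
    loss.backward()
    opt_mf.step()

def emulate(x):
    # Multi-fidelity prediction: x -> (x, LF surrogate output) -> HF estimate.
    with torch.no_grad():
        return net_mf(torch.cat([x, net_lf(x)], dim=1))

print(emulate(torch.tensor([[0.3], [0.7]])))
```

Freezing the low-fidelity surrogate (via `detach()`) before training the correction network keeps the two fidelity levels decoupled; replacing the surrogate with exact low-fidelity evaluations, or prepending a coordinate encoding to counter spectral bias, are variations of the kind the abstract describes.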
Similar Papers
Progressive multi-fidelity learning for physical system predictions
Machine Learning (CS)
Makes computer guesses better with mixed data.
On the performance of multi-fidelity and reduced-dimensional neural emulators for inference of physiologic boundary conditions
Machine Learning (Stat)
Makes heart models run faster for better health.
On Some Tunable Multi-fidelity Bayesian Optimization Frameworks
Machine Learning (CS)
Finds best designs using less expensive tests.