Progressive multi-fidelity learning for physical system predictions
By: Paolo Conti, Mengwu Guo, Attilio Frangi, and more
Potential Business Impact:
Makes computer predictions more accurate by combining cheap and precise data.
Highly accurate datasets from numerical or physical experiments are often expensive and time-consuming to acquire, posing a significant challenge for applications that require precise evaluations, potentially across multiple scenarios and in real time. Even building sufficiently accurate surrogate models can be extremely challenging with limited high-fidelity data. Conversely, less expensive, low-fidelity data can be computed more easily and encompass a broader range of scenarios. Leveraging multi-fidelity information can therefore improve the prediction capabilities of surrogate models. However, in practical situations, data may be of different types, come from sources of different modalities, and not be available concurrently, further complicating the modeling process. To address these challenges, we introduce a progressive multi-fidelity surrogate model. This model can sequentially incorporate diverse data types using tailored encoders. Multi-fidelity regression from the encoded inputs to the target quantities of interest is then performed using neural networks. Input information progressively flows from lower to higher fidelity levels through two sets of connections: concatenations among all the encoded inputs, and additive connections among the final outputs. This dual-connection scheme enables the model to exploit correlations among different datasets while ensuring that each level makes an additive correction to the previous level without altering it. This approach prevents performance degradation as new input data are integrated into the model and automatically adapts predictions based on the available inputs. We demonstrate the effectiveness of the approach on numerical benchmarks and a real-world case study, showing that it reliably integrates multi-modal data and provides accurate predictions, maintaining performance when generalizing across time and parameter variations.
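To make the dual-connection idea concrete, here is a minimal sketch in PyTorch of how such a progressive multi-fidelity network could be wired: one tailored encoder per fidelity level, a regressor at each level that sees the concatenation of all encodings available so far, and an additive correction to the previous level's output. The class names, MLP encoders, and dimensions are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch of the progressive multi-fidelity idea from the abstract.
# All names, sizes, and the MLP choice are hypothetical, for illustration only.
import torch
import torch.nn as nn


def mlp(in_dim, out_dim, hidden=64):
    """Small fully connected block used as a placeholder encoder/regressor."""
    return nn.Sequential(
        nn.Linear(in_dim, hidden), nn.Tanh(),
        nn.Linear(hidden, out_dim),
    )


class ProgressiveMultiFidelity(nn.Module):
    """Each fidelity level encodes its own input, regresses from the
    concatenation of all encodings seen so far, and adds a correction to
    the previous level's prediction (additive output connections)."""

    def __init__(self, input_dims, latent_dim, output_dim):
        super().__init__()
        # One tailored encoder per fidelity level / data modality.
        self.encoders = nn.ModuleList(
            mlp(d, latent_dim) for d in input_dims
        )
        # The regressor at level k sees the (k + 1) concatenated encodings.
        self.regressors = nn.ModuleList(
            mlp((k + 1) * latent_dim, output_dim)
            for k in range(len(input_dims))
        )

    def forward(self, inputs):
        """`inputs` lists the available fidelity levels, lowest first;
        the prediction adapts to however many levels are provided."""
        encoded, prediction = [], 0.0
        for level, x in enumerate(inputs):
            encoded.append(self.encoders[level](x))
            # Concatenation connection: reuse all lower-fidelity encodings.
            joint = torch.cat(encoded, dim=-1)
            # Additive connection: correct, but do not overwrite, the
            # previous level's output.
            prediction = prediction + self.regressors[level](joint)
        return prediction


# Usage: two fidelity levels with different input dimensionalities.
model = ProgressiveMultiFidelity(input_dims=[4, 10], latent_dim=16, output_dim=1)
x_low = torch.randn(32, 4)        # cheap, low-fidelity inputs
x_high = torch.randn(32, 10)      # scarce, high-fidelity inputs
y_low_only = model([x_low])            # prediction from low fidelity alone
y_both = model([x_low, x_high])        # refined prediction using both levels
```

Because each higher level only adds a correction on top of the frozen lower-level output, predictions made from low-fidelity data alone are unchanged when a new fidelity level is attached, which is the property the abstract highlights.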
Similar Papers
Assessing the performance of correlation-based multi-fidelity neural emulators
Machine Learning (CS)
Makes slow computer models faster and smarter.
Projection-based multifidelity linear regression for data-scarce applications
Machine Learning (Stat)
Makes complex computer models faster and cheaper.
On Some Tunable Multi-fidelity Bayesian Optimization Frameworks
Machine Learning (CS)
Finds best designs using less expensive tests.