Distribution of Deep Gaussian Process Gradients and Sequential Design for Simulators with Sharp Variations
By: Yiming Yang, Deyu Ming, Serge Guillas
Potential Business Impact:
Finds sudden changes in computer models.
Deep Gaussian Processes (DGPs), multi-layered extensions of GPs, emulate simulators with regime transitions or sharp changes better than standard GPs. Gradient information is crucial for tasks such as sensitivity analysis and dimension reduction. Although gradient posteriors are well defined for GPs, extending them to DGPs is challenging due to their hierarchical structure. We propose a novel method to approximate the gradient distribution of a DGP emulator, enabling efficient gradient computation with uncertainty quantification (UQ). Our approach derives the analytical gradient mean and covariance. Numerical results show that our method outperforms GPs and DGPs with finite-difference gradients in gradient accuracy, while offering the unique additional benefit of UQ. Building on this gradient information, we further propose a sequential design criterion to identify regions of sharp variation efficiently, with the gradient norm as a key indicator whose distribution can be readily evaluated within our framework. We evaluate the proposed sequential design on synthetic examples and empirical applications, demonstrating its superior performance in emulating functions with sharp changes compared to existing design methods. The DGP gradient computation, along with the proposed sequential design, is integrated into the Python package dgpsi for DGP emulation, available at https://github.com/yyimingucl/DGP.
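The abstract notes that the gradient posterior is already well defined for a standard GP; as a point of reference for the quantities the paper extends to DGPs (an analytical gradient mean and covariance, plus a gradient-norm indicator for sequential design), here is a minimal sketch of those quantities for a single-layer, zero-mean GP with an RBF kernel. The function names, hyperparameter values, and toy data below are illustrative assumptions, not the authors' method or the dgpsi API.

```python
import numpy as np

def rbf(X1, X2, sigma2=1.0, ell=1.0):
    # Squared-exponential kernel matrix between two row-wise input sets.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return sigma2 * np.exp(-0.5 * d2 / ell**2)

def gp_gradient_posterior(x_star, X, y, sigma2=1.0, ell=1.0, nugget=1e-8):
    # Analytic posterior mean and covariance of grad f(x_star) for a
    # zero-mean GP with an RBF kernel; both exist in closed form because
    # differentiation is a linear operation on the GP.
    n, d = X.shape
    K = rbf(X, X, sigma2, ell) + nugget * np.eye(n)
    k_star = rbf(x_star[None, :], X, sigma2, ell).ravel()   # (n,)
    J = -(x_star[None, :] - X).T * k_star / ell**2          # (d, n) Jacobian of k(x*, x_i)
    alpha = np.linalg.solve(K, y)
    grad_mean = J @ alpha                                   # E[grad f(x*)]
    # Prior gradient covariance of the RBF kernel at coincident inputs
    # is (sigma2 / ell^2) * I; subtract the data-explained part.
    grad_cov = (sigma2 / ell**2) * np.eye(d) - J @ np.linalg.solve(K, J.T)
    return grad_mean, grad_cov

def expected_grad_norm_sq(grad_mean, grad_cov):
    # E[||grad f(x*)||^2] = ||mean||^2 + trace(cov): one simple way a
    # gradient-norm indicator could flag sharp-variation regions.
    return float(grad_mean @ grad_mean + np.trace(grad_cov))

# Toy usage: a 1D step-like function with a sharp transition at x = 0.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(30, 1))
y = np.tanh(20.0 * X[:, 0])
for x in (-0.8, 0.0, 0.8):
    m, C = gp_gradient_posterior(np.array([x]), X, y, ell=0.2)
    print(x, expected_grad_norm_sq(m, C))  # largest near the sharp change
```

The paper's contribution is to propagate closed-form quantities of this kind through the hierarchical layers of a DGP, which the single-layer sketch above does not attempt.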
Similar Papers
Evaluating Uncertainty in Deep Gaussian Processes
Machine Learning (Stat)
Makes AI better at guessing when it's wrong.
Fast and Accurate Emulation of Complex Dynamic Simulators
Computation
Makes computer models of real things run faster.
Robust, Online, and Adaptive Decentralized Gaussian Processes
Machine Learning (Stat)
Makes computer models work better with messy data.