Initial Model Incorporation for Deep Learning FWI: Pretraining or Denormalization?
By: Ruihua Chen, Bangyu Wu, Meng Li, and more
Potential Business Impact:
Helps map the underground more accurately with smarter computer learning.
Subsurface property neural network reparameterized full waveform inversion (FWI) has emerged as an effective unsupervised learning framework that can invert stably even from an inaccurate starting model. It updates the trainable neural network parameters instead of updating the subsurface model directly. There are primarily two ways to embed the prior knowledge of the initial model into the neural network: pretraining and denormalization. Pretraining first regulates the network parameters by fitting the initial velocity model; denormalization directly adds the network outputs to the initial model without any pretraining. In this letter, we systematically investigate the influence of these two ways of incorporating the initial model into neural network reparameterized FWI. We demonstrate that pretraining requires inverting the model perturbation around a constant velocity value (the mean) in a two-stage implementation. This leads to a complex workflow and to inconsistent objective functions across the two stages, causing the network parameters to become inactive and lose plasticity. Experimental results demonstrate that, compared with pretraining, denormalization simplifies the workflow, accelerates convergence, and enhances inversion accuracy.
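To make the two incorporation schemes concrete, here is a minimal sketch under assumed, simplified conditions: a toy 1-D velocity profile, a tiny one-hidden-layer network with a fixed latent input, and (for brevity) pretraining gradients taken only with respect to the output layer. All names and numbers (v0, z, layer sizes, learning rate) are illustrative, not from the paper, and the FWI misfit stage itself is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D initial velocity model (km/s), standing in for the
# subsurface prior that both schemes must incorporate.
v0 = np.linspace(1.5, 4.0, 64)

# Tiny network: fixed latent vector z -> hidden tanh layer -> linear output.
z = rng.standard_normal(16)
W1 = rng.standard_normal((32, 16)) * 0.1
b1 = np.zeros(32)
W2 = np.zeros((64, 32))  # zero-initialized output layer
b2 = np.zeros(64)

def net(W2, b2):
    """Network output as a function of the (trainable) output layer."""
    h = np.tanh(W1 @ z + b1)
    return W2 @ h + b2

# --- Scheme 1: pretraining --------------------------------------------------
# Stage one fits the network output to v0 before any FWI misfit is used;
# stage two (not shown) would then switch to the waveform misfit, which is
# the objective-function inconsistency the letter points out.
W2p, b2p = W2.copy(), b2.copy()
h = np.tanh(W1 @ z + b1)
for _ in range(500):
    err = (W2p @ h + b2p) - v0
    # Gradient descent on 0.5 * ||net - v0||^2, output layer only.
    W2p -= 0.01 * np.outer(err, h)
    b2p -= 0.01 * err
pretrain_fit = net(W2p, b2p)  # close to v0 after stage one

# --- Scheme 2: denormalization ----------------------------------------------
# The network output is added to v0 directly. With a zero-initialized output
# layer, the reparameterized model starts exactly at the initial model and
# FWI training can begin in a single stage, with no pretraining.
v_denorm = v0 + net(W2, b2)
```

The design point the sketch isolates: denormalization gets the initial model "for free" from the additive structure (a zero network output reproduces v0 exactly), whereas pretraining must spend an extra optimization stage just to encode v0 into the weights.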
Similar Papers
Physics-informed waveform inversion using pretrained wavefield neural operators
Geophysics
Makes underground maps clearer and faster.
Leveraging Deep Operator Networks (DeepONet) for Acoustic Full Waveform Inversion (FWI)
Machine Learning (CS)
Finds hidden underground things faster and better.
Full waveform inversion with CNN-based velocity representation extension
Geophysics
Improves earthquake maps by cleaning up noisy data.