Multi-Preconditioned LBFGS for Training Finite-Basis PINNs
By: Marc Salvadó-Benasco, Aymane Kssim, Alexander Heinlein, et al.
A multi-preconditioned LBFGS (MP-LBFGS) algorithm is introduced for training finite-basis physics-informed neural networks (FBPINNs). The algorithm is motivated by the nonlinear additive Schwarz method and exploits the domain-decomposition-inspired additive architecture of FBPINNs, in which local neural networks are defined on subdomains, thereby localizing the network representation. Parallel, subdomain-local quasi-Newton corrections are then constructed on the corresponding local parts of the architecture. A key feature is a novel nonlinear multi-preconditioning mechanism, in which the subdomain corrections are optimally combined by solving a low-dimensional subspace minimization problem. Numerical experiments indicate that MP-LBFGS can improve both convergence speed and model accuracy over standard LBFGS while incurring lower communication overhead.
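As a rough illustration of the multi-preconditioning step described above, the sketch below builds one correction per subdomain by taking a few quasi-Newton steps over that subdomain's parameter block only, and then combines the corrections by minimizing the loss over their span. The toy quadratic loss, the disjoint parameter blocks, and the use of scipy's L-BFGS-B for the local solves are assumptions made for illustration; this is not the authors' implementation of MP-LBFGS for FBPINNs.

```python
# Illustrative sketch only (assumptions: toy quadratic loss, disjoint parameter
# blocks per "subdomain", scipy L-BFGS-B for the local solves).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, J = 40, 4                                  # total parameters, number of subdomains
blocks = np.array_split(np.arange(n), J)      # one parameter block per subdomain

A = rng.standard_normal((n, n)); A = A @ A.T + n * np.eye(n)
b = rng.standard_normal(n)
loss = lambda th: 0.5 * th @ A @ th - b @ th  # stand-in for the PINN training loss
grad = lambda th: A @ th - b

def local_correction(theta, idx, iters=5):
    """A few L-BFGS iterations over one subdomain's parameters, others frozen."""
    def f(x):
        t = theta.copy(); t[idx] = x
        return loss(t), grad(t)[idx]
    res = minimize(f, theta[idx], jac=True, method="L-BFGS-B",
                   options={"maxiter": iters})
    d = np.zeros_like(theta); d[idx] = res.x - theta[idx]
    return d

theta = rng.standard_normal(n)
for it in range(10):
    # Subdomain corrections are independent, hence parallelizable.
    D = np.stack([local_correction(theta, idx) for idx in blocks], axis=1)
    # Multi-preconditioning: pick coefficients alpha minimizing loss(theta + D @ alpha),
    # a J-dimensional subspace minimization problem.
    phi = lambda a: loss(theta + D @ a)
    dphi = lambda a: D.T @ grad(theta + D @ a)
    alpha = minimize(phi, np.ones(J), jac=dphi, method="BFGS").x
    theta = theta + D @ alpha
    print(it, loss(theta))
```

Starting the subspace minimization from all-ones coefficients corresponds to the plain additive combination of the subdomain corrections; the optimized coefficients then weight each correction according to how much it actually reduces the global loss.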