Deep Neural Operator Learning for Probabilistic Models
By: Erhan Bayraktar, Qi Feng, Zecheng Zhang and more
Potential Business Impact:
Helps computers price complex financial options faster.
We propose a deep neural-operator framework for a general class of probabilistic models. Under global Lipschitz conditions on the operator over the entire Euclidean space, and for a broad class of probabilistic models, we establish a universal approximation theorem with explicit network-size bounds for the proposed architecture. The underlying stochastic processes are required only to satisfy integrability and general tail-probability conditions. We verify these assumptions for both European and American option-pricing problems within the forward-backward SDE (FBSDE) framework, which in turn covers a broad class of operators arising from parabolic PDEs, with or without free boundaries. Finally, we present a numerical example for a basket of American options, demonstrating that the learned model produces optimal stopping boundaries for new strike prices without retraining.
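To make the operator-learning idea concrete, here is a minimal sketch of a DeepONet-style branch/trunk architecture, a common choice for neural operators. This is an illustrative stand-in, not the paper's exact architecture: the network shapes, the use of untrained random weights, and the interpretation of inputs (strike prices as the branch input, time-asset query points as the trunk input) are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(widths):
    """Random-weight MLP with tanh activations (stand-in for a trained net)."""
    params = [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
              for m, n in zip(widths[:-1], widths[1:])]
    def forward(x):
        for i, (W, b) in enumerate(params):
            x = x @ W + b
            if i < len(params) - 1:
                x = np.tanh(x)
        return x
    return forward

p = 16                      # latent dimension shared by branch and trunk
branch = mlp([3, 32, p])    # branch input: 3 strikes of a hypothetical basket
trunk = mlp([2, 32, p])     # trunk input: a (t, x) query point

def operator(strikes, queries):
    """G(strikes)(t, x) ~ <branch(strikes), trunk(t, x)>.

    The same trained branch/trunk pair evaluates the operator at any
    new strike vector, which is why no retraining is needed.
    """
    b = branch(strikes[None, :])   # shape (1, p)
    t = trunk(queries)             # shape (N, p)
    return (t * b).sum(axis=1)     # shape (N,)

strikes = np.array([0.9, 1.0, 1.1])   # a new strike vector at inference time
queries = np.stack([np.linspace(0.0, 1.0, 5), np.full(5, 1.0)], axis=1)
vals = operator(strikes, queries)
print(vals.shape)  # (5,)
```

After training on many strike vectors, evaluating a new basket amounts to one forward pass of the branch net, which is what makes the "new strikes without retraining" claim in the abstract plausible at the architectural level.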
Similar Papers
One model to solve them all: 2BSDE families via neural operators
Machine Learning (CS)
Solves hard math problems with smart computer programs.
Physics-Informed Latent Neural Operator for Real-time Predictions of time-dependent parametric PDEs
Machine Learning (CS)
Makes computer models of physics faster and smarter.
A Deep Learning Framework for Multi-Operator Learning: Architectures and Approximation Theory
Machine Learning (CS)
Teaches computers to solve many math problems.