Kantorovich-Type Stochastic Neural Network Operators for the Mean-Square Approximation of Certain Second-Order Stochastic Processes
By: Sachin Saini, Uaday Singh
Potential Business Impact:
Makes computers learn from unpredictable events.
Artificial neural network operators (ANNOs) have been widely used for approximating deterministic input-output functions; however, their extension to random dynamics remains comparatively unexplored. In this paper, we construct a new class of Kantorovich-type Stochastic Neural Network Operators (K-SNNOs) in which randomness is incorporated not at the coefficient level, but through stochastic neurons driven by stochastic integrators. This framework enables the operator to inherit the probabilistic structure of the underlying process, making it suitable for modeling and approximating stochastic signals. We establish mean-square convergence of K-SNNOs to the target stochastic process and derive quantitative error estimates expressing the rate of approximation in terms of the modulus of continuity. Numerical simulations further validate the theoretical results by demonstrating accurate reconstruction of sample paths and rapid decay of the mean-square error (MSE). Graphical results, including sample-wise approximations and empirical MSE behaviour, illustrate the robustness and effectiveness of the proposed stochastic-neuron-based operator.
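The flavour of the construction can be illustrated with a minimal path-wise sketch. The code below is an assumption-laden toy, not the authors' operator: it uses the classical sigmoid-based density phi(x) = (sigma(x+1) - sigma(x-1))/2, applies a Kantorovich-type neural network operator to simulated Brownian sample paths (a standard second-order process), and estimates the empirical mean-square error for several values of n. All function names and parameter choices here are illustrative.

```python
import numpy as np

def sigma(t):
    # Logistic sigmoid activation
    return 1.0 / (1.0 + np.exp(-t))

def phi(x):
    # Bell-shaped density phi(x) = (sigma(x+1) - sigma(x-1)) / 2,
    # which satisfies the partition-of-unity property sum_k phi(x - k) = 1
    return 0.5 * (sigma(x + 1.0) - sigma(x - 1.0))

def kantorovich_op(path, grid, n, x):
    # Path-wise Kantorovich-type operator on [0, 1]:
    #   K_n(f)(x) = sum_{k=0}^{n-1} ( n * int_{k/n}^{(k+1)/n} f(t) dt ) * phi(n x - k)
    # The cell integral is approximated by the mean of the sampled path values.
    out = np.zeros_like(x)
    for k in range(n):
        in_cell = (grid >= k / n) & (grid < (k + 1) / n)
        out += path[in_cell].mean() * phi(n * x - k)
    return out

rng = np.random.default_rng(0)
G, M = 1024, 200                                 # grid resolution, number of sample paths
grid = np.arange(G) / G
x_eval = grid[(grid >= 0.1) & (grid <= 0.9)]     # stay away from the boundary cells

mse = {}
for n in (4, 8, 32):
    errs = []
    for _ in range(M):
        # Standard Brownian path sampled on the grid
        path = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(1.0 / G), G - 1))])
        approx = kantorovich_op(path, grid, n, x_eval)
        target = np.interp(x_eval, grid, path)
        errs.append(np.mean((approx - target) ** 2))
    mse[n] = float(np.mean(errs))

print(mse)   # empirical MSE shrinks as n grows
```

For Brownian motion the local increments have variance of order 1/n over cells of width 1/n, so under this toy setup the empirical MSE should decay as n increases, mirroring the modulus-of-continuity rate discussed in the abstract.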
Similar Papers
Constructive Approximation of Random Process via Stochastic Interpolation Neural Network Operators
Machine Learning (Stat)
Helps predict disease spread using smart math.
Fourier Neural Operators for Non-Markovian Processes: Approximation Theorems and Experiments
Machine Learning (CS)
Learns how random things change much faster.
Learning Solution Operators for Partial Differential Equations via Monte Carlo-Type Approximation
Machine Learning (CS)
Makes computer models solve problems faster and cheaper.