Convergence Analysis of Max-Min Exponential Neural Network Operators in Orlicz Space
By: Satyaranjan Pradhan, Madan Mohan Soren
Potential Business Impact:
Sharper guarantees on how accurately neural network operators approximate functions can inform the design and error analysis of machine learning models.
In this work, we propose a Max-Min approach for approximating functions using exponential neural network operators. We extend this framework to develop Max-Min Kantorovich-type exponential neural network operators and investigate their approximation properties. We study both pointwise and uniform convergence for univariate functions. To analyze the order of convergence, we use the logarithmic modulus of continuity and estimate the corresponding rate of convergence. Furthermore, we examine the convergence behavior of the Max-Min Kantorovich-type exponential neural network operators in the Orlicz space setting. We provide graphical representations to illustrate the approximation error for suitable kernels and sigmoidal activation functions.
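To make the construction concrete, here is one plausible reading of the operators, based on the exponential-sampling literature; the paper's exact kernel, nodes, and normalization may differ. Writing \Phi for a bell-shaped kernel built from a sigmoidal activation, a Max-Min exponential neural network operator on [1, e] may take the form

    M_n^{(\vee,\wedge)}(f)(x) = \bigvee_{k=0}^{n} \Big[ f\big(e^{k/n}\big) \wedge \Phi\big(n \log x - k\big) \Big], \qquad x \in [1, e],

with sums replaced by maxima (\vee) and products by minima (\wedge), and with the rate of convergence measured by the logarithmic modulus of continuity

    \omega_{\log}(f, \delta) = \sup\big\{ |f(x) - f(y)| : |\log x - \log y| \le \delta \big\}.

The Python sketch below evaluates such an operator numerically and reports the empirical sup-error; the logistic activation, the steepness parameter BETA, and the test function are illustrative assumptions, not choices taken from the paper.

    import numpy as np

    BETA = 10.0  # hypothetical steepness of the sigmoid; not from the paper

    def sigma(t):
        # Logistic sigmoid, one common sigmoidal activation.
        return 1.0 / (1.0 + np.exp(-BETA * t))

    def kernel(t):
        # Bell-shaped kernel from sigmoid differences, scaled so kernel(0) = 1.
        raw = sigma(t + 1.0) - sigma(t - 1.0)
        peak = 2.0 / (1.0 + np.exp(-BETA)) - 1.0  # raw value at t = 0
        return raw / peak

    def max_min_operator(f, x, n):
        # Assumed Max-Min exponential NN operator on [1, e]:
        # samples f at the nodes e^(k/n); max replaces the sum, min the product.
        k = np.arange(0, n + 1)
        w = kernel(n * np.log(x) - k)  # exponential-sampling argument n*log(x) - k
        return np.max(np.minimum(f(np.exp(k / n)), w))

    # Usage: a test function with values in [0, 1], as max-min calculus expects.
    f = lambda x: np.sin(np.pi * np.log(x))
    xs = np.linspace(1.0, np.e, 400)
    for n in (10, 40, 160):
        err = max(abs(max_min_operator(f, x, n) - f(x)) for x in xs)
        print(f"n = {n:4d}   empirical sup-error ~ {err:.4f}")

A Kantorovich-type variant would replace the point samples f(e^{k/n}) with local averages of f over the cell between consecutive nodes, which is what typically allows convergence statements in Orlicz spaces for functions that need not be continuous.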
Similar Papers
Max-Min Neural Network Operators For Approximation of Multivariate Functions
Machine Learning (CS)
Extends max-min neural network approximation to functions of several variables.
Minimax rates for learning kernels in operators
Statistics Theory
Establishes optimal statistical rates for learning the kernels inside operator models from data.
Towards Sharp Minimax Risk Bounds for Operator Learning
Statistics Theory
Derives sharper bounds on the risk of learning operators from limited, noisy data.