Convergence Analysis of Max-Min Exponential Neural Network Operators in Orlicz Space

Published: August 14, 2025 | arXiv ID: 2508.10248v1

By: Satyaranjan Pradhan, Madan Mohan Soren

Potential Business Impact:

Provides convergence guarantees and error rates for neural-network-based function approximation, which could support more reliable approximation methods in machine learning.

In this work, we propose a Max-Min approach to approximating functions using exponential neural network operators. We extend this framework to develop Max-Min Kantorovich-type exponential neural network operators and investigate their approximation properties. We study both pointwise and uniform convergence for univariate functions. To analyze the order of convergence, we use the logarithmic modulus of continuity and estimate the corresponding rate of convergence. Furthermore, we examine the convergence behavior of the Max-Min Kantorovich-type exponential neural network operators in the Orlicz space setting. We also provide graphical representations that illustrate the approximation error for suitable kernels and sigmoidal activation functions.
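The listing does not reproduce the operator definitions, so the following is a minimal numerical sketch of the underlying idea only: a max-min (pseudo-linear) neural network sampling operator built from a sigmoidal kernel, in which the sums and products of the classical operator are replaced by maxima and minima. It omits the exponential sampling nodes and the Kantorovich-type averaging studied in the paper, assumes functions with values in [0, 1], and all names (phi, max_min_operator) and the kernel normalization are illustrative assumptions rather than the authors' exact construction.

```python
import numpy as np


def sigmoid(t):
    """Logistic sigmoidal activation."""
    return 1.0 / (1.0 + np.exp(-t))


def phi(t):
    """Bell-shaped kernel built from the sigmoid, a standard construction in
    neural-network-operator papers (assumed here, not quoted from the paper)."""
    return 0.5 * (sigmoid(t + 1.0) - sigmoid(t - 1.0))


def max_min_operator(f, x, n):
    """Illustrative max-min (pseudo-linear) sampling operator on [0, 1]:
    normalize the kernel so its largest value at x equals 1, then replace the
    classical sum/product by max/min over the samples f(k/n), k = 0, ..., n.
    Assumes f maps [0, 1] into [0, 1]."""
    k = np.arange(0, n + 1)
    weights = phi(n * x - k)
    weights = weights / weights.max()  # peak value 1 at the node nearest x
    return np.max(np.minimum(weights, f(k / n)))


if __name__ == "__main__":
    # Smooth test function with values in (0, 1).
    f = lambda t: 0.5 + 0.4 * np.sin(3.0 * t)
    xs = np.linspace(0.1, 0.9, 17)
    for n in (10, 50, 200):
        err = max(abs(max_min_operator(f, x, n) - f(x)) for x in xs)
        print(f"n = {n:4d}   max abs error ~ {err:.4f}")
```

As n grows, the printed maximum error over the sample grid should shrink, loosely mirroring the pointwise and uniform convergence results the abstract describes for the authors' operators.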

Page Count
35 pages

Category
Computer Science:
Machine Learning (CS)