Max-Min Neural Network Operators for Approximation of Multivariate Functions
By: Abhishek Yadav, Uaday Singh, Feng Dai
In this paper, we develop a multivariate framework for approximation by max-min neural network operators. Building on recent advances in approximation theory via neural network operators, in particular the univariate max-min operators, we propose and analyze new multivariate operators activated by sigmoidal functions. We establish pointwise and uniform convergence theorems and derive quantitative estimates for the order of approximation via the modulus of continuity and the multivariate generalized absolute moment. Our results demonstrate that the multivariate max-min structure of these operators, beyond its algebraic elegance, provides efficient and stable approximation tools in both theoretical and applied settings.
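To make the max-min construction concrete, the sketch below is a minimal univariate illustration in Python, not the paper's definition: the sum and product of classical neural network operators are replaced by maximum and minimum. The logistic sigmoidal, the generated density phi, the kernel scaling n*phi(nx - k), and the domain [0, 1] are all assumptions chosen for illustration; the paper's multivariate operators, activation, and normalization may differ.

```python
import numpy as np

def sigma(t):
    """Logistic sigmoidal function."""
    return 1.0 / (1.0 + np.exp(-t))

def phi(u):
    """Density generated by the sigmoidal: phi(u) = (sigma(u+1) - sigma(u-1)) / 2."""
    return 0.5 * (sigma(u + 1.0) - sigma(u - 1.0))

def max_min_nn(f, n, x, a=0.0, b=1.0):
    """Hypothetical univariate max-min neural network operator:
    M_n(f)(x) = max_k min( n * phi(n*x - k), f(k/n) ),
    where k runs over the integers with a <= k/n <= b.
    The kernel scaling n * phi(.) is an illustrative assumption."""
    k = np.arange(np.ceil(n * a), np.floor(n * b) + 1.0)
    kernel = n * phi(n * x - k)     # large only near the nodes closest to x
    samples = f(k / n)              # samples of f on the grid k/n
    return np.max(np.minimum(kernel, samples))

if __name__ == "__main__":
    f = lambda t: np.sin(np.pi * t)  # nonnegative continuous test function on [0, 1]
    x = 0.37
    for n in (10, 100, 1000):
        err = abs(max_min_nn(f, n, x) - f(x))
        print(f"n = {n:4d}   |M_n(f)(x) - f(x)| = {err:.4f}")
```

Under these assumptions, the printed error shrinks as n grows, illustrating the pointwise convergence for a nonnegative continuous test function: the kernel blows up only at nodes k/n near x, so the max-min "convolution" is dominated by samples of f taken close to x.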