From Universal Approximation Theorem to Tropical Geometry of Multi-Layer Perceptrons

Published: October 16, 2025 | arXiv ID: 2510.15012v1

By: Yi-Shan Chu, Yueh-Cheng Kuo

Potential Business Impact:

Lets neural networks start with prescribed decision-boundary shapes before any training.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

We revisit the Universal Approximation Theorem (UAT) through the lens of the tropical geometry of neural networks and introduce a constructive, geometry-aware initialization for sigmoidal multi-layer perceptrons (MLPs). Tropical geometry shows that Rectified Linear Unit (ReLU) networks admit decision functions with a combinatorial structure often described as tropical rational functions, namely differences of tropical polynomials. Focusing on planar binary classification, we design purely sigmoidal MLPs that adhere to the finite-sum format of the UAT: a finite linear combination of shifted and scaled sigmoids of affine functions. The resulting models yield decision boundaries that already align with prescribed shapes at initialization and can be refined by standard training if desired. This provides a practical bridge between the tropical perspective and smooth MLPs, enabling interpretable, shape-driven initialization without resorting to ReLU architectures. We focus on the construction and empirical demonstrations in two dimensions; theoretical analysis and higher-dimensional extensions are left for future work.
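The finite-sum format mentioned in the abstract lends itself to a small demonstration. The sketch below is illustrative only and is not the authors' construction: it builds a one-hidden-layer sigmoidal network in the UAT form f(x) = Σᵢ cᵢ·σ(wᵢ·x + bᵢ), with weights initialized from the edges of a prescribed triangle so that the decision boundary already matches the shape before any training. The specific triangle, steepness value, and thresholding rule are assumptions made for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Target shape (assumed for the demo): a triangle with vertices (-1,0), (1,0), (0,1),
# written as an intersection of half-planes n_i . x <= d_i.
normals = np.array([[ 0.0, -1.0],   # below:  -y <= 0  (i.e. y >= 0)
                    [ 1.0,  1.0],   # right edge:  x + y <= 1
                    [-1.0,  1.0]])  # left edge:  -x + y <= 1
offsets = np.array([0.0, 1.0, 1.0])

steepness = 25.0                 # sharper sigmoids -> boundary hugs the polygon more closely
W = -steepness * normals         # first-layer weights, one hidden unit per polygon edge
b =  steepness * offsets         # first-layer biases
c = np.ones(len(offsets))        # output weights c_i = 1 (shifted/scaled sigmoid sum)
threshold = len(offsets) - 0.5   # inside iff every edge constraint is (softly) satisfied

def decision_function(x):
    """f(x) - threshold: positive inside the prescribed triangle, negative outside."""
    return c @ sigmoid(W @ x + b) - threshold

print(decision_function(np.array([0.0, 0.5])) > 0)  # interior point -> True
print(decision_function(np.array([2.0, 2.0])) > 0)  # far outside    -> False
```

Because the network is already in the finite-sum sigmoidal form, these weights can serve directly as an initialization for standard gradient-based training, which is the sense in which the paper's decision boundaries "align with prescribed shapes at initialization."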

Country of Origin
🇹🇼 Taiwan, Province of China

Page Count
26 pages

Category
Statistics: Machine Learning (stat.ML)