Initialization of a Polyharmonic Cascade, Launch and Testing
By: Yuriy N. Bakhvalov
This paper concludes a series of studies on the polyharmonic cascade, a deep machine learning architecture theoretically derived from indifference principles and the theory of random functions. A universal initialization procedure is proposed, based on symmetric constellations shaped as hyperoctahedra with a central point. This initialization not only ensures stable training of cascades with tens or hundreds of layers (up to 500 layers without skip connections), but also radically simplifies the computations. Scalability and robustness are demonstrated on MNIST (98.3% accuracy without convolutions or augmentation), HIGGS (AUC approximately 0.885 on 11M examples), and Epsilon (AUC approximately 0.963 with 2000 features). All linear algebra is reduced to 2D operations and executes efficiently on GPUs. A public repository and an archived snapshot are provided for full reproducibility.
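The hyperoctahedron-with-center constellation mentioned above has a simple geometric form: in d dimensions, the cross-polytope's 2d vertices ±r·e_i together with the origin. A minimal sketch of generating such a point set follows; the function name, the `radius` parameter, and the point ordering are illustrative assumptions, not the paper's actual initialization API.

```python
import numpy as np

def hyperoctahedron_with_center(d, radius=1.0):
    """Return the 2d + 1 points of a d-dimensional hyperoctahedron
    (cross-polytope) plus its central point.

    Illustrative sketch only: the name, `radius` parameter, and row
    ordering are assumptions, not taken from the paper.
    """
    eye = np.eye(d)
    # 2d vertices: +r*e_i and -r*e_i for each coordinate axis
    vertices = np.concatenate([radius * eye, -radius * eye], axis=0)
    # one central point at the origin
    center = np.zeros((1, d))
    return np.concatenate([vertices, center], axis=0)  # shape (2d + 1, d)

points = hyperoctahedron_with_center(3)
print(points.shape)  # (7, 3)
```

By symmetry, the points sum to zero and the non-central vertices all lie at the same distance from the center, which is the kind of balanced constellation the abstract's initialization relies on.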