Polyharmonic Cascade
By: Yuriy N. Bakhvalov
This paper presents a deep machine learning architecture, the "polyharmonic cascade": a sequence of packages of polyharmonic splines in which each layer is rigorously derived from the theory of random functions and the principle of indifference. This makes it possible to approximate nonlinear functions of arbitrary complexity while preserving global smoothness and a probabilistic interpretation. For the polyharmonic cascade, an alternative to gradient-descent training is proposed: instead of directly optimizing the coefficients, a single global linear system is solved on each batch with respect to the function values at fixed "constellations" of nodes. This yields synchronized updates of all layers, preserves the probabilistic interpretation of the individual layers and theoretical consistency with the original model, and scales well: all computations reduce to 2D matrix operations that run efficiently on a GPU. Fast learning without overfitting is demonstrated on MNIST.
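To make the abstract's central contrast concrete, here is a minimal sketch of a single polyharmonic-spline layer whose coefficients come from a regularized linear solve on a batch rather than from gradient descent. This is an illustration under stated assumptions, not the paper's implementation: the names (PolyharmonicLayer, centers, reg) are hypothetical, the solve uses Tikhonov-regularized normal equations, and it fits one layer in isolation, whereas the paper solves one global system that synchronizes all layers of the cascade.

```python
# Minimal sketch (assumptions noted above), not the author's method.
import numpy as np

def phi(r, k=2):
    """Polyharmonic radial basis: r^k for odd k, r^k * log(r) for even k."""
    if k % 2 == 1:
        return r ** k
    # Guard r = 0, where the even-order kernel's limiting value is 0.
    return np.where(r > 0.0, r ** k * np.log(np.maximum(r, 1e-12)), 0.0)

class PolyharmonicLayer:
    def __init__(self, centers, k=2, reg=1e-6):
        self.centers = centers   # fixed "constellation" of nodes
        self.k = k
        self.reg = reg           # Tikhonov term keeps the solve well-posed
        self.w = None

    def _design(self, X):
        # Pairwise distances from batch inputs to the fixed nodes: a 2D matrix.
        d = np.linalg.norm(X[:, None, :] - self.centers[None, :, :], axis=-1)
        return phi(d, self.k)

    def fit(self, X, Y):
        # One batch update = one linear solve (normal equations), i.e. pure
        # 2D matrix algebra of the kind that maps well onto a GPU.
        A = self._design(X)
        G = A.T @ A + self.reg * np.eye(A.shape[1])
        self.w = np.linalg.solve(G, A.T @ Y)
        return self

    def __call__(self, X):
        return self._design(X) @ self.w
```

A cascade would chain such layers, each layer's output becoming the next layer's input; what the paper adds on top of this per-layer picture is the single global system solved jointly for the function values at all constellations at once.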