Polyharmonic Spline Packages: Composition, Efficient Procedures for Computation and Differentiation
By: Yuriy N. Bakhvalov
In a previous paper it was shown that the machine learning regression problem can be solved within the framework of random function theory, with the optimal kernel derived analytically from symmetry and indifference principles and shown to coincide with a polyharmonic spline. However, direct application of that solution is limited by its O(N^3) computational cost and by the breakdown of the original theoretical assumptions when the input space is of excessively high dimensionality. This paper proposes a cascade architecture built from packages of polyharmonic splines that addresses scalability and, at the same time, is theoretically justified for problems whose intrinsic dimensionality is low but unknown. Efficient matrix procedures are presented for forward computation through the cascade and for end-to-end differentiation.
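To make the computational bottleneck concrete, the sketch below fits a single polyharmonic spline by solving the dense augmented interpolation system directly; the linear solve on an (N + d + 1) x (N + d + 1) matrix is the O(N^3) step the abstract refers to. This is a minimal baseline illustration in NumPy, not the cascade of spline packages proposed in the paper; the function name `fit_polyharmonic_spline`, the choice of exponent k = 2 (the thin-plate case), and the linear polynomial tail are assumptions made for the example.

```python
import numpy as np

def fit_polyharmonic_spline(X, y, k=2):
    """Fit s(x) = sum_i w_i * phi(|x - x_i|) + p(x), where
    phi(r) = r^k for odd k, phi(r) = r^k * log(r) for even k,
    and p is a linear polynomial. The dense solve below is the
    O(N^3) step that limits direct application to large N."""
    N, d = X.shape
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    if k % 2 == 0:
        # r^k * log(r), with the value at r = 0 taken as 0
        with np.errstate(divide="ignore", invalid="ignore"):
            Phi = np.where(r > 0, r**k * np.log(r), 0.0)
    else:
        Phi = r**k
    P = np.hstack([np.ones((N, 1)), X])          # linear polynomial tail
    A = np.block([[Phi, P], [P.T, np.zeros((d + 1, d + 1))]])
    rhs = np.concatenate([y, np.zeros(d + 1)])
    coef = np.linalg.solve(A, rhs)               # dense O(N^3) solve
    w, c = coef[:N], coef[N:]

    def predict(Xq):
        rq = np.linalg.norm(Xq[:, None, :] - X[None, :, :], axis=-1)
        if k % 2 == 0:
            with np.errstate(divide="ignore", invalid="ignore"):
                Phiq = np.where(rq > 0, rq**k * np.log(rq), 0.0)
        else:
            Phiq = rq**k
        Pq = np.hstack([np.ones((len(Xq), 1)), Xq])
        return Phiq @ w + Pq @ c

    return predict


# Example usage (synthetic data, assumed for illustration only)
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.sin(3.0 * X[:, 0]) + X[:, 1] ** 2
s = fit_polyharmonic_spline(X, y, k=2)           # thin-plate spline in 2-D
print(s(X[:5]))                                  # reproduces training targets up to solver precision
```

Because the interpolation matrix is dense and unstructured, both its storage and its factorization grow rapidly with N, which is the scalability issue the proposed cascade of spline packages is meant to address.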