Local EGOP for Continuous Index Learning
By: Alex Kokot, Anand Hemmady, Vydhourie Thiyageswaran, and more
Potential Business Impact:
Teaches computers to find hidden patterns faster.
We introduce the setting of continuous index learning, in which a function of many variables varies only along a small number of directions at each point. For efficient estimation, it is beneficial for a learning algorithm to adapt, near each point $x$, to the subspace that captures the local variability of the function $f$. We pose this task as kernel adaptation along a manifold with noise and introduce Local EGOP learning, a recursive algorithm that uses the Expected Gradient Outer Product (EGOP) quadratic form as both a metric and the inverse covariance of the target distribution. We prove that Local EGOP learning adapts to the regularity of the function of interest: under a supervised noisy manifold hypothesis, it achieves intrinsic-dimensional learning rates even for arbitrarily high-dimensional noise. Empirically, we compare our algorithm's feature learning to that of deep networks, and we demonstrate improved regression quality over two-layer neural networks in the continuous single-index setting.
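For context on the central object: the (global) EGOP of $f$ under a distribution $\rho$ is $\mathbb{E}_{x \sim \rho}[\nabla f(x)\, \nabla f(x)^\top]$, whose leading eigenvectors span the directions along which $f$ varies, and a local version restricts the average to a neighborhood of $x$. The sketch below illustrates that idea only; it is not the authors' recursive algorithm, and the function names, the local-linear gradient estimate, and all parameters are illustrative assumptions.

```python
import numpy as np

def grad_estimate(X, y, x0, h):
    """Slope of a Gaussian-weighted local-linear fit: a stand-in for grad f(x0)."""
    w = np.sqrt(np.exp(-np.sum((X - x0) ** 2, axis=1) / (2 * h ** 2)))
    A = np.hstack([np.ones((len(X), 1)), X - x0])  # intercept + linear term
    coef, *_ = np.linalg.lstsq(A * w[:, None], y * w, rcond=None)
    return coef[1:]  # the slope approximates the gradient at x0

def local_egop(X, y, x0, h=1.0, k=50):
    """Average gradient outer products over the k nearest neighbors of x0."""
    idx = np.argsort(np.sum((X - x0) ** 2, axis=1))[:k]
    grads = np.stack([grad_estimate(X, y, X[i], h) for i in idx])
    return grads.T @ grads / k  # (1/k) * sum_i grad_i grad_i^T

def egop_kernel(x, z, G, sigma=1.0):
    """Gaussian kernel under the Mahalanobis metric induced by G."""
    d = x - z
    return np.exp(-(d @ G @ d) / (2 * sigma ** 2))
```

A one-step use would reweight the neighbors of a query point by `egop_kernel` with $G$ set to `local_egop(X, y, x0)`, stretching distances along the directions where $f$ varies; a recursive scheme like the one described in the abstract would then re-estimate gradients under the adapted metric.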
Similar Papers
Faster Adaptive Optimization via Expected Gradient Outer Product Reparameterization
Machine Learning (CS)
Makes computer learning faster and more reliable.
Neural Networks Learn Generic Multi-Index Models Near Information-Theoretic Limit
Machine Learning (Stat)
Teaches computers to learn hidden patterns faster.
Evolution Strategies at the Hyperscale
Machine Learning (CS)
Makes AI learn faster and use less computer power.