Score: 1

A Compositional Kernel Model for Feature Learning

Published: September 17, 2025 | arXiv ID: 2509.14158v1

By: Feng Ruan, Keli Liu, Michael Jordan

BigTech Affiliations: University of California, Berkeley

Potential Business Impact:

Identifies the input variables that actually matter for prediction and discards noise variables.

Business Areas:
Image Recognition, Data and Analytics, Software

We study a compositional variant of kernel ridge regression in which the predictor is applied to a coordinate-wise reweighting of the inputs. Formulated as a variational problem, this model provides a simple testbed for feature learning in compositional architectures. From the perspective of variable selection, we show how relevant variables are recovered while noise variables are eliminated. We establish guarantees showing that both global minimizers and stationary points discard noise coordinates when the noise variables are Gaussian distributed. A central finding is that $\ell_1$-type kernels, such as the Laplace kernel, recover features contributing to nonlinear effects at stationary points, whereas Gaussian kernels recover only features with linear effects.
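The sketch below illustrates the general idea of the model described in the abstract: inputs are reweighted coordinate-wise by a vector theta, and a kernel ridge regression is fit on the reweighted inputs, so the ridge objective becomes a function of theta. Everything here is an assumption for illustration, including the multiplicative form of the reweighting, the specific kernel bandwidths, the regularization constant, the toy data, and all function names; it is a minimal sketch, not the paper's actual variational formulation or guarantees.

```python
import numpy as np

def laplace_kernel(X, Z, bandwidth=1.0):
    # ell_1-type kernel: K(x, z) = exp(-||x - z||_1 / bandwidth)
    dists = np.abs(X[:, None, :] - Z[None, :, :]).sum(axis=-1)
    return np.exp(-dists / bandwidth)

def gaussian_kernel(X, Z, bandwidth=1.0):
    # Gaussian kernel: K(x, z) = exp(-||x - z||_2^2 / (2 * bandwidth^2))
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

def reweighted_krr_objective(theta, X, y, kernel, lam=1e-2):
    # Coordinate-wise reweighting x -> theta * x, followed by kernel ridge
    # regression on the reweighted inputs; returns the optimal ridge
    # objective value as a function of theta (hypothetical formulation).
    n = len(y)
    K = kernel(X * theta, X * theta)
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    fit = K @ alpha
    return np.mean((y - fit) ** 2) + lam * alpha @ K @ alpha

# Toy data: the response depends nonlinearly on x1 and x2 only;
# the remaining coordinates are pure Gaussian noise.
rng = np.random.default_rng(0)
n, d = 200, 6
X = rng.standard_normal((n, d))
y = np.abs(X[:, 0] - X[:, 1]) + 0.1 * rng.standard_normal(n)

# Compare the objective with all coordinates kept vs. noise coordinates zeroed.
theta_full = np.ones(d)
theta_sparse = np.array([1.0, 1.0, 0.0, 0.0, 0.0, 0.0])
for name, kernel in [("Laplace", laplace_kernel), ("Gaussian", gaussian_kernel)]:
    full = reweighted_krr_objective(theta_full, X, y, kernel)
    sparse = reweighted_krr_objective(theta_sparse, X, y, kernel)
    print(f"{name}: all coordinates={full:.4f}, noise zeroed={sparse:.4f}")
```

In the paper's setting the weights are optimized rather than fixed by hand; the comparison above only illustrates why zeroing out noise coordinates can lower the objective, which is the variable-selection behavior the abstract attributes to global minimizers and stationary points.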

Country of Origin
🇺🇸 United States

Page Count
43 pages

Category
Computer Science:
Machine Learning (CS)