Tensor Sketch: Fast and Scalable Polynomial Kernel Approximation
By: Ninh Pham, Rasmus Pagh
Potential Business Impact:
Speeds up computer learning with complex math.
Approximation of non-linear kernels using random feature maps has become a powerful technique for scaling kernel methods to large datasets. We propose $\textit{Tensor Sketch}$, an efficient random feature map for approximating polynomial kernels. Given $n$ training samples in $\mathbb{R}^d$, Tensor Sketch computes low-dimensional embeddings in $\mathbb{R}^D$ in time $\mathcal{O}\left( n(d+D \log{D}) \right)$, making it well suited for high-dimensional and large-scale settings. We provide theoretical guarantees on the approximation error, ensuring the fidelity of the resulting kernel function estimates. We also discuss extensions and highlight applications where Tensor Sketch serves as a central computational tool.
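The core idea can be illustrated in a few lines: a polynomial kernel $(x \cdot y)^p$ corresponds to an inner product of $p$-fold tensor powers, and the Count Sketch of a tensor product equals the circular convolution of the factors' Count Sketches, which the FFT computes in $\mathcal{O}(D \log D)$ time. The following is a minimal illustrative sketch (not the authors' reference code); the function name and parameters are chosen for exposition.

```python
import numpy as np

def tensor_sketch(X, D, degree=2, seed=0):
    """Approximate the polynomial kernel (x . y)^degree via Tensor Sketch.

    X: (n, d) data matrix. Returns (n, D) embeddings Z such that
    Z(x) . Z(y) ~ (x . y)^degree in expectation.
    """
    n, d = X.shape
    rng = np.random.default_rng(seed)
    # Running elementwise product of the factors' DFTs.
    prod = np.ones((n, D), dtype=complex)
    for _ in range(degree):
        # One independent Count Sketch per tensor factor:
        # hash bucket h(j) in {0..D-1} and random sign s(j) in {-1,+1}.
        h = rng.integers(0, D, size=d)
        s = rng.choice([-1.0, 1.0], size=d)
        # Count Sketch of each row: scatter signed coordinates into D buckets.
        cs = np.zeros((n, D))
        np.add.at(cs.T, h, (X * s).T)  # cs[:, h[j]] += s[j] * X[:, j]
        # Multiply in the frequency domain (convolution theorem):
        # circular convolution of sketches = sketch of the tensor product.
        prod *= np.fft.fft(cs, axis=1)
    return np.real(np.fft.ifft(prod, axis=1))
```

Inner products of the returned embeddings are unbiased estimates of the degree-$p$ polynomial kernel, with variance shrinking as $D$ grows; averaging over independent sketches reduces it further.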
Similar Papers
Norming Sets for Tensor and Polynomial Sketching
Numerical Analysis
Makes complex math shapes easier for computers.
Random feature approximation for general spectral methods
Machine Learning (Stat)
Makes AI learn better and faster.
Sublinear Sketches for Approximate Nearest Neighbor and Kernel Density Estimation
Machine Learning (CS)
Finds important patterns in fast-changing data.