Randomized Quasi-Monte Carlo Features for Kernel Approximation
By: Yian Huang, Zhen Huang
Potential Business Impact:
Makes machine learning faster and more accurate.
We investigate the application of randomized quasi-Monte Carlo (RQMC) methods in random feature approximations for kernel-based learning. Compared with the classical Monte Carlo (MC) approach \citep{rahimi2007random}, whose approximation error is $O_P(1/\sqrt{M})$ in the number of features $M$, RQMC achieves a deterministic error bound of $O(1/M)$ (up to logarithmic factors), matching the rate achieved by quasi-Monte Carlo (QMC) methods \citep{huangquasi}. Beyond this deterministic guarantee, we further establish additional average error bounds for RQMC features: some require weaker assumptions, and others significantly reduce the exponent of the logarithmic factor. In the context of kernel ridge regression, we show that RQMC features offer computational advantages over MC features while preserving the same statistical error rate. Empirical results further show that RQMC methods maintain stable performance in both low- and moderately high-dimensional settings, unlike QMC methods, which suffer significant performance degradation as the dimension increases.
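To make the construction concrete, below is a minimal sketch (not the authors' implementation) of random Fourier features for the Gaussian kernel, comparing MC frequencies with RQMC frequencies obtained from a scrambled Sobol' sequence via SciPy. The helper name `rff`, the feature count `M`, the bandwidth `sigma`, and the synthetic data are illustrative assumptions.

```python
# Sketch: MC vs. RQMC random Fourier features for the Gaussian kernel
# k(x, y) = exp(-||x - y||^2 / (2 * sigma^2)); assumed parameters below.
import numpy as np
from scipy.stats import norm
from scipy.stats.qmc import Sobol

def rff(X, W):
    """Real random Fourier features [cos(XW^T), sin(XW^T)] / sqrt(M)."""
    Z = X @ W.T
    M = W.shape[0]
    return np.hstack([np.cos(Z), np.sin(Z)]) / np.sqrt(M)

d, M, sigma = 5, 1024, 1.0  # M a power of 2, as Sobol' sequences prefer
rng = np.random.default_rng(0)

# MC frequencies: i.i.d. Gaussian draws.
W_mc = rng.standard_normal((M, d)) / sigma

# RQMC frequencies: scrambled Sobol' points pushed through the Gaussian
# inverse CDF; scrambling randomizes the sequence while preserving its
# low discrepancy. Clipping guards norm.ppf against endpoint values.
U = Sobol(d=d, scramble=True, seed=0).random(M)
W_rqmc = norm.ppf(np.clip(U, 1e-12, 1 - 1e-12)) / sigma

# Compare worst-case kernel approximation error on synthetic data.
X = rng.standard_normal((200, d))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_true = np.exp(-sq_dists / (2 * sigma**2))
for name, W in [("MC", W_mc), ("RQMC", W_rqmc)]:
    Phi = rff(X, W)
    err = np.abs(Phi @ Phi.T - K_true).max()
    print(f"{name}: max kernel approximation error = {err:.4f}")
```

In kernel ridge regression, the feature matrix `Phi` replaces the exact kernel matrix, reducing the $O(n^3)$ solve to roughly $O(nM^2)$; the faster RQMC approximation rate means fewer features suffice for a given accuracy, which is the computational advantage the abstract describes.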
Similar Papers
Comparative Study of Monte Carlo and Quasi-Monte Carlo Techniques for Enhanced Derivative Pricing
Pricing of Securities
Makes financial math faster and more accurate.
Quasi-Monte Carlo methods for uncertainty quantification of tumor growth modeled by a parametric semi-linear parabolic reaction-diffusion equation
Numerical Analysis
Predicts tumor growth more accurately for better treatment.
High-Dimensional Quasi-Monte Carlo via Combinatorial Discrepancy
Numerical Analysis
Makes computer math faster for hard problems.