Preconditioned Truncated Single-Sample Estimators for Scalable Stochastic Optimization
By: Tianshi Xu, Difeng Cai, Hua Huang, and more
Potential Business Impact:
Makes computer math problems faster and more accurate.
Many large-scale stochastic optimization algorithms involve repeated solutions of linear systems or evaluations of log-determinants. In these regimes, computing exact solutions is often unnecessary; it is more computationally efficient to construct unbiased stochastic estimators with controlled variance. However, classical iterative solvers incur truncation bias, whereas unbiased Krylov-based estimators typically exhibit high variance and numerical instability. To mitigate these issues, we introduce the Preconditioned Truncated Single-Sample (PTSS) estimators, a family of stochastic Krylov methods that integrate preconditioning with truncated Lanczos iterations. PTSS yields low-variance, stable estimators for linear system solutions, log-determinants, and their derivatives. We establish theoretical results on their mean, variance, and concentration properties, explicitly quantifying the variance reduction induced by preconditioning. Numerical experiments confirm that PTSS achieves superior stability and variance control compared with existing unbiased and biased alternatives, providing an efficient framework for stochastic optimization.
Similar Papers
Optimal Krylov On Average
Numerical Analysis
Makes computer math problems solve faster.
Private Statistical Estimation via Truncation
Machine Learning (CS)
Protects private data while learning from it.
High Probability Complexity Bounds of Trust-Region Stochastic Sequential Quadratic Programming with Heavy-Tailed Noise
Optimization and Control
Solves hard math problems with noisy guesses.