Reusing Samples in Variance Reduction
By: Yujia Jin, Ishani Karmarkar, Aaron Sidford, and more
Potential Business Impact:
Cuts the cost of solving large optimization problems by reusing random samples across steps.
We provide a general framework to improve trade-offs between the number of full batch and sample queries used to solve structured optimization problems. Our results apply to a broad class of randomized optimization algorithms that iteratively solve sub-problems to high accuracy. We show that such algorithms can be modified to reuse the randomness used to query the input across sub-problems. Consequently, we improve the trade-off between the number of gradient (full batch) and individual function (sample) queries for finite sum minimization, the number of matrix-vector multiplies (full batch) and random row (sample) queries for top-eigenvector computation, and the number of matrix-vector multiplies with the transition matrix (full batch) and generative model (sample) queries for optimizing Markov Decision Processes. To facilitate our analysis, we introduce the notion of pseudo-independent algorithms, a generalization of pseudo-deterministic algorithms [Gat and Goldwasser 2011], which quantifies how independent the output of a randomized algorithm is from a randomness source.
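To make the sample-reuse idea concrete, below is a minimal SVRG-style sketch for finite sum minimization in which the sample indices drawn for the inner loop are fixed once and then reused across every sub-problem (epoch), so sample queries are shared rather than redrawn. This is only an illustrative sketch of the general principle, not the paper's algorithm; the function names and the least-squares test problem are hypothetical.

```python
import numpy as np

def svrg_with_reused_samples(grad_i, n, x0, epochs=10, inner_steps=100,
                             step=0.1, rng=None):
    """Sketch: SVRG-style loop that reuses one batch of sample indices.

    grad_i(i, x) returns the gradient of the i-th component function at x.
    Hypothetical helper for illustration only.
    """
    rng = np.random.default_rng(rng)
    x = x0.copy()
    # Draw the inner-loop sample indices ONCE and reuse them in every epoch,
    # instead of resampling fresh indices per sub-problem.
    reused_idx = rng.integers(0, n, size=inner_steps)
    for _ in range(epochs):
        # One full-batch query per epoch: gradient of the whole finite sum.
        anchor = x.copy()
        full_grad = np.mean([grad_i(i, anchor) for i in range(n)], axis=0)
        for i in reused_idx:  # sample queries reuse the same indices
            # Standard variance-reduced gradient estimate.
            g = grad_i(i, x) - grad_i(i, anchor) + full_grad
            x = x - step * g
    return x

if __name__ == "__main__":
    # Hypothetical test problem: least squares, f_i(x) = 0.5 * (a_i^T x - b_i)^2.
    rng = np.random.default_rng(0)
    n, d = 200, 10
    A, b = rng.standard_normal((n, d)), rng.standard_normal(n)
    grad_i = lambda i, x: (A[i] @ x - b[i]) * A[i]
    x_hat = svrg_with_reused_samples(grad_i, n, np.zeros(d), rng=1)
    x_star = np.linalg.lstsq(A, b, rcond=None)[0]
    print("distance to least-squares solution:", np.linalg.norm(x_hat - x_star))
```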
Similar Papers
A simple analysis of a quantum-inspired algorithm for solving low-rank linear systems
Data Structures and Algorithms
Finds answers to math problems much faster.
Stochastic Optimization with Random Search
Optimization and Control
Improves computer guessing for tricky problems.
Sublinear Algorithms for Wasserstein and Total Variation Distances: Applications to Fairness and Privacy Auditing
Machine Learning (CS)
Learns how data is spread without seeing all of it.