Score: 2

Learning Rate Scheduling with Matrix Factorization for Private Training

Published: November 22, 2025 | arXiv ID: 2511.17994v1

By: Nikita P. Kalinin, Joel Daniel Andersson

Potential Business Impact:

Improves the accuracy of machine-learning models trained under differential privacy.

Business Areas:
Scheduling, Information Technology, Software

We study differentially private model training with stochastic gradient descent under learning rate scheduling and correlated noise. Although correlated noise, in particular via matrix factorizations, has been shown to improve accuracy, prior theoretical work focused primarily on the prefix-sum workload. That workload assumes a constant learning rate, whereas in practice learning rate schedules are widely used to accelerate training and improve convergence. We close this gap by deriving general upper and lower bounds for a broad class of learning rate schedules in both single- and multi-epoch settings. Building on these results, we propose a learning-rate-aware factorization that achieves improvements over prefix-sum factorizations under both MaxSE and MeanSE error metrics. Our theoretical analysis yields memory-efficient constructions suitable for practical deployment, and experiments on CIFAR-10 and IMDB datasets confirm that schedule-aware factorizations improve accuracy in private training.
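To make the setting concrete, below is a minimal numerical sketch of the matrix-factorization view the abstract describes: with a learning rate schedule, step t of SGD releases a schedule-weighted prefix sum of gradients, and the error of a factorization A = BC depends on the row norms of B and the column-norm sensitivity of C. This is not the authors' construction; the helper names `schedule_workload` and `factorization_errors`, the cosine schedule, and the two baseline factorizations are illustrative assumptions.

```python
# Illustrative sketch (not the paper's factorization) of evaluating
# MaxSE / MeanSE for a schedule-weighted workload in the matrix
# mechanism, where one releases B @ (C @ X + Z) with Gaussian noise Z
# and A = B @ C is the workload matrix.
import numpy as np
from scipy.linalg import sqrtm

def schedule_workload(etas):
    """Lower-triangular workload: entry (t, i) = eta_i for i <= t."""
    n = len(etas)
    return np.tril(np.tile(np.asarray(etas, dtype=float), (n, 1)))

def factorization_errors(B, C):
    """MaxSE and MeanSE of A = B @ C under single participation.

    sens(C) is the largest column 2-norm of C; the noise seen at
    step t scales with the 2-norm of row t of B.
    """
    sens = np.linalg.norm(C, axis=0).max()
    row = np.linalg.norm(B, axis=1)
    return sens * row.max(), sens * np.sqrt((row ** 2).mean())

n = 64
etas = 0.5 * (1.0 + np.cos(np.pi * np.arange(n) / n))  # cosine decay
A = schedule_workload(etas)

# Baseline 1: add independent noise to each release (B = A, C = I).
max_id, mean_id = factorization_errors(A, np.eye(n))

# Baseline 2: symmetric square-root factorization B = C = A^{1/2}
# (real-valued, since A is triangular with a positive diagonal).
R = np.real(sqrtm(A))
max_sq, mean_sq = factorization_errors(R, R)

print(f"identity     MaxSE={max_id:.2f}  MeanSE={mean_id:.2f}")
print(f"square root  MaxSE={max_sq:.2f}  MeanSE={mean_sq:.2f}")
```

The sketch only evaluates two fixed baselines on the weighted workload; the paper's schedule-aware factorizations instead choose B and C to reduce these error metrics, with memory-efficient constructions for multi-epoch training.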

Page Count
37 pages

Category
Computer Science: Machine Learning (CS)