Online differentially private inference in stochastic gradient descent
By: Jinhan Xie, Enze Shi, Bei Jiang, and more
Potential Business Impact:
Keeps individuals' personal data private while a model learns from it in real time.
We propose a general privacy-preserving, optimization-based framework for real-time environments that does not require a trusted data curator. In particular, we introduce a noisy stochastic gradient descent algorithm for online statistical inference with streaming data under local differential privacy constraints. Unlike existing methods that either disregard privacy protection or require full access to the entire dataset, our proposed algorithm provides rigorous local privacy guarantees for individual-level data. It operates as a one-pass algorithm without re-accessing historical data, thereby significantly reducing both time and space complexity. We also enable online private statistical inference by developing two procedures for constructing valid private confidence intervals. We formally establish the convergence rates of the proposed estimators and present a functional central limit theorem showing that the averaged solution path of these estimators converges weakly to a rescaled Brownian motion, providing a theoretical foundation for our online inference tool. Numerical simulation experiments demonstrate the finite-sample performance of the proposed procedure, underscoring its efficacy and reliability. Furthermore, we illustrate our method with an analysis of two datasets, a ride-sharing dataset and a US insurance dataset, showcasing its practical utility.
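To make the one-pass mechanism concrete, the following is a minimal, hypothetical Python (NumPy) sketch of locally private SGD for a streaming linear regression: each record's gradient is clipped and perturbed with Gaussian noise before it is used, and the solution path is averaged online in the Polyak-Ruppert fashion. The clipping bound, noise scale, step-size schedule, and the linear model are illustrative assumptions, not the paper's exact specification, and the paper's two confidence-interval constructions are not shown.

```python
# Hypothetical sketch (not the authors' exact algorithm): one-pass noisy SGD
# under local differential privacy with online Polyak-Ruppert averaging.
import numpy as np

rng = np.random.default_rng(0)

def ldp_sgd(stream, dim, clip=1.0, sigma=2.0, lr0=0.5, alpha=0.51):
    """Process each (x, y) record once; only a clipped, noise-perturbed gradient is used."""
    theta = np.zeros(dim)          # current iterate
    theta_bar = np.zeros(dim)      # running average of the solution path
    for t, (x, y) in enumerate(stream, start=1):
        grad = (x @ theta - y) * x                 # per-record least-squares gradient
        norm = np.linalg.norm(grad)
        if norm > clip:                            # clip to bound sensitivity
            grad *= clip / norm
        grad += sigma * clip * rng.standard_normal(dim)   # Gaussian noise for local privacy
        theta -= lr0 * t ** (-alpha) * grad        # decaying step size
        theta_bar += (theta - theta_bar) / t       # online averaging, no data re-access
    return theta_bar

# Toy streaming data: y = x' beta + noise, observed one record at a time.
beta = np.array([1.0, -2.0, 0.5])
stream = ((x, x @ beta + 0.1 * rng.standard_normal())
          for x in rng.standard_normal((20000, 3)))
print(ldp_sgd(stream, dim=3))
```

Because each record contributes only a single perturbed gradient and the average is updated recursively, the procedure needs constant memory per step, which is the point of the one-pass design described above.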
Similar Papers
Local Differential Privacy for Distributed Stochastic Aggregative Optimization with Guaranteed Optimality
Systems and Control
Lets computers learn together privately and accurately.
Federated Learning of Quantile Inference under Local Differential Privacy
Methodology
Helps computers learn from private data safely.
Online federated learning framework for classification
Machine Learning (Stat)
Teaches computers to learn from private data.