Error Analysis of Generalized Langevin Equations with Approximated Memory Kernels

Published: December 11, 2025 | arXiv ID: 2512.10256v1

By: Quanjun Lang, Jianfeng Lu

Potential Business Impact:

Sharper estimates of a system's memory kernel translate directly into better trajectory predictions for processes whose future depends on their past.

Business Areas:
A/B Testing, Data and Analytics

We analyze prediction error in stochastic dynamical systems with memory, focusing on generalized Langevin equations (GLEs) formulated as stochastic Volterra equations. We establish that, under a strongly convex potential, trajectory discrepancies decay at a rate determined by the decay of the memory kernel and are quantitatively bounded by the estimation error of the kernel in a weighted norm. Our analysis integrates synchronized noise coupling with a Volterra comparison theorem, encompassing both subexponential and exponential kernel classes. For first-order models, we derive moment and perturbation bounds using resolvent estimates in weighted spaces. For second-order models with confining potentials, we prove contraction and stability under kernel perturbations using a hypocoercive Lyapunov-type distance. This framework accommodates non-translation-invariant kernels and white-noise forcing, explicitly linking improved kernel estimation to enhanced trajectory prediction. Numerical examples validate these theoretical findings.
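To make the setup concrete, the following is a minimal sketch (not the authors' code) of the first-order scenario the abstract describes: a stochastic Volterra equation driven by the gradient of a strongly convex potential, simulated twice with synchronized noise — once with a "true" exponential memory kernel and once with a small perturbation of it. The specific kernels, potential, step size, and perturbation magnitude are illustrative assumptions; the point is that with the same Brownian increments, the trajectory gap stays controlled by the kernel error.

```python
import numpy as np

def simulate_gle(kernel, x0, dt, n_steps, noise, grad_V):
    """Euler scheme for a first-order stochastic Volterra equation:
       dx(t) = -[ integral_0^t K(t - s) V'(x(s)) ds ] dt + dW(t)."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    drift_hist = np.empty(n_steps)  # V'(x(s)) at past grid points
    for n in range(n_steps):
        drift_hist[n] = grad_V(x[n])
        # discretized memory integral: integral_0^{t_n} K(t_n - s) V'(x(s)) ds
        mem = dt * np.sum(kernel((n - np.arange(n + 1)) * dt) * drift_hist[:n + 1])
        x[n + 1] = x[n] - mem * dt + noise[n]
    return x

rng = np.random.default_rng(0)
dt, n_steps = 0.01, 2000
# synchronized noise coupling: both simulations use the SAME Brownian increments
noise = np.sqrt(dt) * rng.standard_normal(n_steps)

K_true = lambda t: np.exp(-t)                              # exponential kernel (assumed)
K_approx = lambda t: np.exp(-t) * (1 + 0.05 * np.exp(-t))  # small kernel perturbation

grad_V = lambda x: x  # strongly convex potential V(x) = x^2 / 2

x_true = simulate_gle(K_true, 1.0, dt, n_steps, noise, grad_V)
x_approx = simulate_gle(K_approx, 1.0, dt, n_steps, noise, grad_V)

# With synchronized noise, the pathwise gap reflects only the kernel mismatch.
t_grid = np.arange(n_steps) * dt
gap = np.max(np.abs(x_true - x_approx))
kernel_err = np.max(np.abs(K_true(t_grid) - K_approx(t_grid)))
print(f"max trajectory gap: {gap:.4f}, sup-norm kernel error: {kernel_err:.4f}")
```

Here the sup-norm kernel error is 0.05 (attained at t = 0), and the resulting trajectory discrepancy is of comparable or smaller order — a toy illustration of the paper's claim that trajectory error is quantitatively bounded by kernel estimation error in a weighted norm.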

Country of Origin
🇺🇸 United States

Page Count
24 pages

Category
Statistics - Machine Learning (stat.ML)