Error Analysis of Generalized Langevin Equations with Approximated Memory Kernels
By: Quanjun Lang, Jianfeng Lu
Potential Business Impact:
Improves predictions by understanding how past events affect future ones.
We analyze prediction error in stochastic dynamical systems with memory, focusing on generalized Langevin equations (GLEs) formulated as stochastic Volterra equations. We establish that, under a strongly convex potential, trajectory discrepancies decay at a rate determined by the decay of the memory kernel and are quantitatively bounded by the estimation error of the kernel in a weighted norm. Our analysis integrates synchronized noise coupling with a Volterra comparison theorem, encompassing both subexponential and exponential kernel classes. For first-order models, we derive moment and perturbation bounds using resolvent estimates in weighted spaces. For second-order models with confining potentials, we prove contraction and stability under kernel perturbations using a hypocoercive Lyapunov-type distance. This framework accommodates non-translation-invariant kernels and white-noise forcing, explicitly linking improved kernel estimation to enhanced trajectory prediction. Numerical examples validate these theoretical findings.
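The abstract's central claim, that trajectory discrepancy is controlled by the kernel estimation error when both paths are driven by the same noise, can be illustrated numerically. Below is a minimal sketch (not the authors' code) of a first-order GLE with a strongly convex potential V(x) = x²/2 and an exponentially decaying kernel, discretized by Euler's method; the 5% kernel perturbation and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, n_steps = 0.01, 2000
t = np.arange(n_steps) * dt

grad_V = lambda x: x                  # strongly convex potential V(x) = x^2 / 2
K_true = lambda s: np.exp(-s)         # exponentially decaying memory kernel
K_est  = lambda s: 1.05 * np.exp(-s)  # hypothetical estimated (perturbed) kernel

# Synchronized noise coupling: both trajectories share the same increments.
noise = rng.standard_normal(n_steps) * np.sqrt(dt)

def simulate(K):
    x = np.zeros(n_steps)
    g = np.zeros(n_steps)             # history of grad_V(x_m)
    for n in range(n_steps - 1):
        g[n] = grad_V(x[n])
        # Left-point quadrature of the memory integral
        #   \int_0^{t_n} K(t_n - s) grad_V(X_s) ds
        mem = np.dot(K(t[n] - t[:n + 1]), g[:n + 1]) * dt
        x[n + 1] = x[n] - mem * dt + noise[n]
    return x

x_true = simulate(K_true)
x_est = simulate(K_est)
discrepancy = np.max(np.abs(x_true - x_est))
print(f"max trajectory discrepancy: {discrepancy:.4f}")
```

With synchronized noise the discrepancy stays on the order of the kernel perturbation rather than the trajectory scale, consistent with the perturbation bounds described in the abstract.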
Similar Papers
Moment Estimate and Variational Approach for Learning Generalized Diffusion with Non-gradient Structures
Computational Physics
Finds hidden rules in how things move.
Numerical analysis of a particle system for the calibrated Heston-type local stochastic volatility model
Computational Finance
Makes computer models of money more accurate.
Kalman-Langevin dynamics: exponential convergence, particle approximation and numerical approximation
Probability
Makes computer models learn faster and more accurately.