Learning Generalized Hamiltonian Dynamics with Stability from Noisy Trajectory Data
By: Luke McLennan, Yi Wang, Ryan Farell, and more
Potential Business Impact:
Teaches computers to predict how things move.
We introduce a robust framework, based on variational Bayesian inference, for learning generalized Hamiltonian dynamics from noisy, sparse phase-space data in an unsupervised manner. Although conservative, dissipative, and port-Hamiltonian systems may share the same initial total energy in a closed system, it is challenging for a single Hamiltonian network model to capture their distinctive and varying motion dynamics and phase-space physics from sampled observational trajectories. To address this Hamiltonian manifold learning challenge, we extend sparse symplectic, random Fourier Gaussian process learning with predictive successive numerical estimation of the Hamiltonian landscape, using a generalized form of Hamiltonian dynamics over states and conjugate momenta that accommodates the different classes of conservative, dissipative, and port-Hamiltonian physical systems. In addition to a kernelized evidence lower bound (ELBO) loss for data fidelity, we incorporate stability and conservation constraints as hyperparameter-balanced loss terms that regularize the model's multi-gradients, enforcing physical correctness for improved prediction accuracy with bounded uncertainty.
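The three system classes named in the abstract share one generalized vector field: conservative flow dx/dt = J ∇H(x), dissipative flow dx/dt = (J − R) ∇H(x), and port-Hamiltonian flow dx/dt = (J − R) ∇H(x) + G u(t). The following is a minimal illustrative sketch of these forms (not the paper's learning framework); the harmonic-oscillator Hamiltonian, the damping matrix R, and the RK4 integrator are all illustrative assumptions.

```python
import numpy as np

def hamiltonian_vector_field(grad_H, R=None, G=None, u=None):
    """Generalized Hamiltonian dynamics on 2D phase space x = (q, p):
      conservative:     dx/dt = J grad_H(x)
      dissipative:      dx/dt = (J - R) grad_H(x)
      port-Hamiltonian: dx/dt = (J - R) grad_H(x) + G u(t)
    """
    J = np.array([[0.0, 1.0], [-1.0, 0.0]])  # canonical symplectic matrix
    A = J if R is None else J - R             # dissipation enters via -R
    def f(t, x):
        dx = A @ grad_H(x)
        if G is not None and u is not None:   # external port/input term
            dx = dx + G @ np.atleast_1d(u(t))
        return dx
    return f

def rk4_integrate(f, x0, dt=1e-3, steps=5000):
    """Plain fixed-step RK4, for illustration only."""
    x = np.array(x0, dtype=float)
    for k in range(steps):
        t = k * dt
        k1 = f(t, x)
        k2 = f(t + dt / 2, x + dt / 2 * k1)
        k3 = f(t + dt / 2, x + dt / 2 * k2)
        k4 = f(t + dt, x + dt * k3)
        x = x + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return x

# Illustrative system: harmonic oscillator H(q, p) = 0.5 * (q^2 + p^2).
H = lambda x: 0.5 * np.dot(x, x)
grad_H = lambda x: x            # gradient of H is simply (q, p)
x0 = [1.0, 0.0]
E0 = H(np.array(x0))

# Conservative flow: total energy stays (numerically) constant.
f_cons = hamiltonian_vector_field(grad_H)
E_cons = H(rk4_integrate(f_cons, x0))

# Dissipative flow: Rayleigh-type damping on the momentum drains energy.
R = np.array([[0.0, 0.0], [0.0, 0.1]])
f_diss = hamiltonian_vector_field(grad_H, R=R)
E_diss = H(rk4_integrate(f_diss, x0))
```

Integrating both flows to t = 5 shows the qualitative split the abstract describes: the conservative trajectory preserves H while the dissipative one loses energy, even though both start from the same initial total energy.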
Similar Papers
Learning Dynamics from Input-Output Data with Hamiltonian Gaussian Processes
Machine Learning (CS)
Teaches robots physics from just watching.
Learning Passive Continuous-Time Dynamics with Multistep Port-Hamiltonian Gaussian Processes
Machine Learning (CS)
Learns how things move, even with bad data.