Learning Dynamics from Input-Output Data with Hamiltonian Gaussian Processes
By: Jan-Hendrik Ewering, Robin E. Herrmann, Niklas Wahlström, and more
Potential Business Impact:
Lets machines learn physically consistent models of how systems move from ordinary position measurements, without needing specialized velocity or momentum sensors.
Embedding non-restrictive prior knowledge, such as energy conservation laws, in learning-based approaches is a key motivation for constructing physically consistent models from limited data, which is relevant, e.g., for model-based control. Recent work incorporates Hamiltonian dynamics into Gaussian Process (GP) regression to obtain uncertainty-quantifying models that adhere to the underlying physical principles. However, these works rely on velocity or momentum data, which is rarely available in practice. In this paper, we consider dynamics learning with non-conservative Hamiltonian GPs and address the more realistic problem setting of learning from input-output data. We provide a fully Bayesian scheme for estimating probability densities of the unknown hidden states, the GP hyperparameters, and structural hyperparameters such as damping coefficients. To address the computational complexity of GPs, we take advantage of a reduced-rank GP approximation and leverage its properties for computationally efficient prediction and training. The proposed method is evaluated in a nonlinear simulation case study and compared to a state-of-the-art approach that relies on momentum measurements.
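To make two of the abstract's ingredients concrete, the sketch below shows a reduced-rank (Hilbert-space basis) GP prior over a Hamiltonian H(q, p) and the resulting non-conservative dynamics x_dot = (J - R) grad H(x) with a damping coefficient. This is a minimal sketch, not the authors' implementation: the single-degree-of-freedom setup, the squared-exponential kernel, the domain size, the number of basis functions, and the damping value are all illustrative assumptions, and the paper's Bayesian inference of hidden states and hyperparameters from input-output data is not included.

```python
# Minimal sketch (not the paper's code): a reduced-rank GP prior over a
# Hamiltonian H(q, p) and the induced non-conservative dynamics
#   x_dot = (J - R) * grad H(x),  J = [[0, 1], [-1, 0]],  R = diag(0, c).
# Basis: Laplacian eigenfunctions on [-L, L]^2 (Hilbert-space GP approximation).
# All parameter values below are assumptions for illustration only.
import numpy as np

L = 4.0            # domain half-width (assumed)
m_per_dim = 8      # basis functions per dimension -> 64 features in total
ell, sf = 1.0, 1.0 # SE kernel lengthscale and signal std (assumed)
c = 0.1            # damping coefficient (structural hyperparameter, assumed)

# Multi-indices (j1, j2) for the 2-D eigenfunctions and their eigenvalues.
J_idx = np.array([(j1, j2) for j1 in range(1, m_per_dim + 1)
                           for j2 in range(1, m_per_dim + 1)])
lam = (np.pi * J_idx / (2.0 * L)) ** 2            # eigenvalues per dimension
S = (sf**2 * 2 * np.pi * ell**2                   # SE spectral density (d = 2)
     * np.exp(-0.5 * ell**2 * lam.sum(axis=1)))   # prior variance of each weight

def phi(x):
    """Laplacian eigenfunctions on [-L, L]^2 evaluated at x = (q, p)."""
    return np.prod(np.sin(np.sqrt(lam) * (x + L)) / np.sqrt(L), axis=1)

def grad_phi(x):
    """Gradient of each basis function w.r.t. (q, p), shape (n_basis, 2)."""
    s = np.sin(np.sqrt(lam) * (x + L)) / np.sqrt(L)
    ds = np.sqrt(lam) * np.cos(np.sqrt(lam) * (x + L)) / np.sqrt(L)
    return np.stack([ds[:, 0] * s[:, 1], s[:, 0] * ds[:, 1]], axis=1)

rng = np.random.default_rng(0)
w = rng.normal(0.0, np.sqrt(S))                   # one sample from the GP prior

def vector_field(x):
    """Non-conservative Hamiltonian dynamics x_dot = (J - R) grad H(x)."""
    gH = grad_phi(x).T @ w                        # grad H(x) under the sample
    JR = np.array([[0.0, 1.0], [-1.0, -c]])       # (J - R) with damping on p
    return JR @ gH

# Roll out a trajectory with 4th-order Runge-Kutta from an initial state.
x0 = np.array([1.0, 0.0])
x, dt, traj = x0.copy(), 0.01, []
for _ in range(500):
    k1 = vector_field(x); k2 = vector_field(x + 0.5 * dt * k1)
    k3 = vector_field(x + 0.5 * dt * k2); k4 = vector_field(x + dt * k3)
    x = x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
    traj.append(x.copy())

# Damping makes H non-increasing along trajectories: dH/dt = -c (dH/dp)^2 <= 0.
print("H(start) = %.3f, H(end) = %.3f" % (phi(x0) @ w, phi(traj[-1]) @ w))
```

In the paper's setting, the basis weights and the damping coefficient would be inferred from input-output data rather than sampled from the prior; the point of the basis-function form is that H and its gradient, and hence the dynamics, stay cheap to evaluate.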
Similar Papers
Learning Generalized Hamiltonian Dynamics with Stability from Noisy Trajectory Data
Machine Learning (CS)
Teaches computers to predict motion stably, even from noisy trajectory data.
Neural non-canonical Hamiltonian dynamics for long-time simulations
Machine Learning (CS)
Teaches computers to predict motion accurately over long simulations.
Learning Passive Continuous-Time Dynamics with Multistep Port-Hamiltonian Gaussian Processes
Machine Learning (CS)
Learns how things move, even from imperfect measurements.