Laplace Approximation For Tensor Train Kernel Machines In System Identification
By: Albert Saiapin, Kim Batselier
Potential Business Impact:
Enables up to 65x faster training of probabilistic prediction models, demonstrated on an inverse dynamics problem.
To address the scalability limitations of Gaussian process (GP) regression, several approximation techniques have been proposed. One such method, based on tensor networks, uses an exponential number of basis functions without incurring exponential computational cost. However, extending this model to a fully probabilistic formulation introduces several design challenges. In particular, for tensor train (TT) models, it is unclear which TT-core should be treated in a Bayesian manner. We introduce a Bayesian tensor train kernel machine that applies a Laplace approximation to estimate the posterior distribution over a selected TT-core and employs variational inference (VI) for the precision hyperparameters. Experiments show that core selection is largely independent of the TT-ranks and feature structure, and that VI replaces cross-validation while offering up to 65x faster training. The method's effectiveness is demonstrated on an inverse dynamics problem.
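The abstract does not give the paper's algorithm in detail, but the key idea of a Gaussian (Laplace) posterior over a single selected TT-core can be sketched as follows. Assuming a Gaussian likelihood and a Gaussian prior, and with all other TT-cores held fixed, the model output is linear in the selected core's vectorized parameters, so the Laplace approximation reduces to the familiar Bayesian linear regression posterior. The function name, design matrix `Phi`, and the precision parameters `alpha` and `beta` below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def laplace_core_posterior(Phi, y, alpha, beta):
    """Gaussian (Laplace) posterior over one vectorized TT-core w.

    Assumption-based sketch: with the other cores fixed, predictions are
    Phi @ w, so the posterior is Gaussian with the mode as its mean.

    Phi   : (N, D) design matrix built from the features and the fixed cores
    y     : (N,)   observed targets
    alpha : prior precision on w
    beta  : noise precision of the Gaussian likelihood
    """
    D = Phi.shape[1]
    # Posterior precision = negative Hessian of the log-posterior at the mode
    A = alpha * np.eye(D) + beta * Phi.T @ Phi
    Sigma = np.linalg.inv(A)          # posterior covariance
    mu = beta * Sigma @ (Phi.T @ y)   # posterior mode = mean (Gaussian case)
    return mu, Sigma

# Usage example with synthetic data (hypothetical sizes)
rng = np.random.default_rng(0)
Phi = rng.standard_normal((200, 8))
w_true = rng.standard_normal(8)
y = Phi @ w_true + 0.1 * rng.standard_normal(200)
mu, Sigma = laplace_core_posterior(Phi, y, alpha=1.0, beta=100.0)
```

In the paper, the precision hyperparameters corresponding to `alpha` and `beta` are inferred with variational inference rather than set by cross-validation, which is where the reported training speed-up comes from.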
Similar Papers
A Fully Probabilistic Tensor Network for Regularized Volterra System Identification
Machine Learning (Stat)
Makes complex computer models simpler and smarter.
Time Extrapolation with Graph Convolutional Autoencoder and Tensor Train Decomposition
Numerical Analysis (Math)
Predicts how things change over time, even in new situations.
Structured covariance estimation via tensor-train decomposition
Statistics Theory
Finds patterns in complex data faster.