Laplace Approximation For Tensor Train Kernel Machines In System Identification

Published: December 2, 2025 | arXiv ID: 2512.02532v1

By: Albert Saiapin, Kim Batselier

Potential Business Impact:

Enables faster training of uncertainty-aware prediction models, with up to 65x speedups reported by replacing cross-validation with variational inference.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

To address the scalability limitations of Gaussian process (GP) regression, several approximation techniques have been proposed. One such method is based on tensor networks and uses an exponential number of basis functions without incurring an exponential computational cost. However, extending this model to a fully probabilistic formulation introduces several design challenges. In particular, for tensor train (TT) models, it is unclear which TT-core should be treated in a Bayesian manner. We introduce a Bayesian tensor train kernel machine that applies the Laplace approximation to estimate the posterior distribution over a selected TT-core and employs variational inference (VI) for the precision hyperparameters. Experiments show that core selection is largely independent of TT-ranks and feature structure, and that VI replaces cross-validation while offering up to 65x faster training. The method's effectiveness is demonstrated on an inverse dynamics problem.
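
To make the core idea concrete, here is a minimal NumPy sketch (not the authors' code) of a TT kernel machine with a Gaussian posterior over one selected TT-core. Because the model output is linear in any single core, a Gaussian likelihood makes the conditional posterior over that core Gaussian in closed form, which is exactly what a Laplace approximation recovers at that point. The function names (`features`, `core_design_matrix`), the toy data, the choice of core `k`, and the fixed precisions `tau` and `lam` (which the paper instead learns via VI) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

D, M, R = 3, 4, 2          # input dims, basis functions per dim, TT-rank
N = 200                    # training points

def features(X):
    """Per-dimension polynomial feature maps phi_d(x_d) in R^M."""
    return np.stack([X ** p for p in range(M)], axis=-1)   # (N, D, M)

def core_design_matrix(Phi, cores, k):
    """Design matrix A with f(x) = A @ vec(G_k), other cores held fixed."""
    n_pts, n_dims = Phi.shape[0], len(cores)
    A = np.empty((n_pts, cores[k].size))
    for n in range(n_pts):
        left = np.ones((1, 1))
        for d in range(k):                       # contract cores left of k
            left = left @ np.einsum('m,imj->ij', Phi[n, d], cores[d])
        right = np.ones((1, 1))
        for d in range(n_dims - 1, k, -1):       # contract cores right of k
            right = np.einsum('imj,m->ij', cores[d], Phi[n, d]) @ right
        # f = left @ (sum_m phi_m G_k[:, m, :]) @ right, linear in vec(G_k)
        A[n] = np.einsum('ai,m,jb->imj', left, Phi[n, k], right).ravel()
    return A

# Toy regression data and randomly initialized TT cores (boundary ranks 1).
X = rng.uniform(-1, 1, (N, D))
y = np.sin(X.sum(axis=1)) + 0.1 * rng.standard_normal(N)
shapes = [(1, M, R)] + [(R, M, R)] * (D - 2) + [(R, M, 1)]
cores = [0.3 * rng.standard_normal(s) for s in shapes]

k = 1                      # the core treated in a Bayesian manner
tau, lam = 100.0, 1.0      # noise/weight precisions (learned via VI in the paper)
A = core_design_matrix(features(X), cores, k)

# Gaussian posterior over vec(G_k): precision S_inv, mean mu.
S_inv = tau * A.T @ A + lam * np.eye(A.shape[1])
mu = tau * np.linalg.solve(S_inv, A.T @ y)
cores[k] = mu.reshape(cores[k].shape)

# Predictive mean and variance at test inputs.
Xt = rng.uniform(-1, 1, (50, D))
At = core_design_matrix(features(Xt), cores, k)
mean = At @ mu
var = 1.0 / tau + np.einsum('ni,ni->n', At, np.linalg.solve(S_inv, At.T).T)
print("predictive mean/var of first 3 test points:", mean[:3], var[:3])
```

Note the design choice this sketch highlights: only one core carries a posterior distribution, so the uncertainty estimate stays cheap, and the abstract's finding that core selection is largely independent of TT-ranks and feature structure suggests the choice of `k` is not critical in practice.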

Country of Origin
🇳🇱 Netherlands

Page Count
6 pages

Category
Statistics: Machine Learning (stat.ML)