A Fully Probabilistic Tensor Network for Regularized Volterra System Identification
By: Afra Kilic, Kim Batselier
Potential Business Impact:
Makes models of complex nonlinear systems cheaper to build and tells you how much to trust their predictions.
Modeling nonlinear systems with Volterra series is challenging because the number of kernel coefficients grows exponentially with the model order. This work introduces Bayesian Tensor Network Volterra kernel machines (BTN-V), extending the Bayesian Tensor Network framework to Volterra system identification. BTN-V represents Volterra kernels using canonical polyadic decomposition, reducing model complexity from O(I^D) to O(DIR). By treating all tensor components and hyperparameters as random variables, BTN-V provides predictive uncertainty estimation at no additional computational cost. Sparsity-inducing hierarchical priors enable automatic rank determination and the learning of fading-memory behavior directly from data, improving interpretability and preventing overfitting. Empirical results demonstrate competitive accuracy, enhanced uncertainty quantification, and reduced computational cost.
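The efficiency claim hinges on storing the Volterra kernel tensor in canonical polyadic (CP) form. The minimal NumPy sketch below (an illustration under assumed notation, not the authors' code; names such as `predict` and `factors` are invented for clarity) shows why storage and per-sample prediction then scale as O(DIR) instead of O(I^D).

```python
import numpy as np

# Sketch only: a D-th order Volterra model whose kernel tensor is kept in
# CP (canonical polyadic) form. The full kernel would need I**D
# coefficients; the CP factors need only D*I*R values, and each prediction
# costs O(D*I*R) operations.

rng = np.random.default_rng(0)

D, I, R = 3, 10, 4  # model order, memory length (incl. constant term), CP rank
factors = [rng.standard_normal((I, R)) for _ in range(D)]  # CP factor matrices

def predict(u_window, factors):
    """Evaluate y(t) = sum_r prod_d (factors[d][:, r] @ u_window).

    u_window is the feature vector [1, u(t-1), ..., u(t-I+1)]; the leading
    1 lets a single tensor capture all Volterra orders up to D.
    """
    # One row of inner products per CP mode: shape (D, R)
    projections = np.stack([f.T @ u_window for f in factors])
    # Multiply across modes, then sum over the R rank-one terms
    return np.sum(np.prod(projections, axis=0))

# Toy input signal and one sliding-window prediction
u = rng.standard_normal(100)
t = 50
u_window = np.concatenate(([1.0], u[t - 1 : t - I : -1]))  # length I
y_hat = predict(u_window, factors)
print(f"predicted output at t={t}: {y_hat:.4f}")
```

In the Bayesian setting described above, the factor matrices would additionally carry posterior distributions, which is what provides predictive uncertainty without extra cost; this sketch only illustrates the deterministic CP evaluation.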
Similar Papers
Laplace Approximation For Tensor Train Kernel Machines In System Identification
Machine Learning (Stat)
Trains computers faster for complex predictions.
Efficient Approximation of Volterra Series for High-Dimensional Systems
Machine Learning (CS)
Lets computers understand complicated systems faster.
Variational Bayesian Logistic Tensor Regression with Application to Image Recognition
Methodology
Helps computers recognize pictures with less data.