When Bayesian Tensor Completion Meets Multioutput Gaussian Processes: Functional Universality and Rank Learning
By: Siyuan Li, Shikai Fang, Lei Cheng, et al.
Functional tensor decomposition can analyze multi-dimensional data with real-valued indices, paving the way for applications in machine learning and signal processing. A limitation of existing approaches is the assumption that the tensor rank, a critical parameter governing model complexity, is known. However, determining the optimal rank is a non-deterministic polynomial-time hard (NP-hard) task, and the expressive power of functional low-rank tensor models for continuous signals remains poorly understood. We propose a rank-revealing functional Bayesian tensor completion (RR-FBTC) method. By modeling the latent functions through carefully designed multioutput Gaussian processes, RR-FBTC handles tensors with real-valued indices while enabling automatic tensor rank determination during inference. We establish the universal approximation property of the model for continuous multi-dimensional signals, demonstrating its expressive power in a concise form. To learn this model, we employ the variational inference framework and derive an efficient algorithm with closed-form updates. Experiments on both synthetic and real-world datasets demonstrate the effectiveness and superiority of RR-FBTC over state-of-the-art approaches. The code is available at https://github.com/OceanSTARLab/RR-FBTC.
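To illustrate the modeling idea the abstract describes, the sketch below builds a functional low-rank (CP-style) tensor whose factor functions are drawn from Gaussian process priors over real-valued indices. This is only a minimal illustration under assumed choices, not the RR-FBTC method itself: the rank `R`, the squared-exponential kernel, the `lengthscale`, and the two-mode setup are all assumptions for the example, and the actual paper uses carefully designed multioutput GPs with automatic rank determination.

```python
import numpy as np

def rbf_kernel(x, y, lengthscale=0.3):
    # Squared-exponential kernel on real-valued indices (assumed choice).
    d = x[:, None] - y[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
R = 3                                    # assumed rank for illustration
grids = [np.linspace(0.0, 1.0, 50) for _ in range(2)]  # two modes

# Draw R latent factor functions per mode from a GP prior
# (a stand-in for the multioutput GP priors used in RR-FBTC).
factors = []
for g in grids:
    K = rbf_kernel(g, g) + 1e-6 * np.eye(len(g))  # jitter for stability
    L = np.linalg.cholesky(K)
    factors.append(L @ rng.standard_normal((len(g), R)))

# Functional CP model: f(x1, x2) = sum_r u_r^(1)(x1) * u_r^(2)(x2),
# evaluated here on a 50 x 50 grid of real-valued indices.
tensor = np.einsum('ir,jr->ij', factors[0], factors[1])
print(tensor.shape)
```

Because the factor functions are GP samples, the model can be evaluated at any real-valued index rather than only on a fixed grid, which is the property that distinguishes functional tensor decomposition from the discrete-index case.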