Fourier Low-rank and Sparse Tensor for Efficient Tensor Completion
By: Jingyang Li, Jiuqian Shang, Yang Chen
Potential Business Impact:
Fills in missing scientific data faster.
Tensor completion is crucial in many scientific domains with missing data problems. Traditional low-rank tensor models, including CP, Tucker, and Tensor-Train, exploit low-dimensional structures to recover missing data. However, these methods often treat all tensor modes symmetrically, failing to capture the unique spatiotemporal patterns inherent in scientific data, where the temporal component exhibits both low-frequency stability and high-frequency variations. To address this, we propose a novel model, Fourier Low-rank and Sparse Tensor (FLoST), which decomposes the tensor along the temporal dimension using a Fourier transform. This approach captures low-frequency components with low-rank matrices and high-frequency fluctuations with sparsity, resulting in a hybrid structure that efficiently models both smooth and localized variations. Compared to the well-known tubal-rank model, which assumes low-rankness across all frequency components, FLoST requires significantly fewer parameters, making it computationally more efficient, particularly when the time dimension is large. Through theoretical analysis and empirical experiments, we demonstrate that FLoST outperforms existing tensor completion models in terms of both accuracy and computational efficiency, offering a more interpretable solution for spatiotemporal data reconstruction.
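To make the hybrid structure concrete, here is a minimal NumPy sketch of a FLoST-style completion loop: the tensor is Fourier-transformed along its temporal (last) mode, the first few frequency slices are projected onto a low-rank approximation, the remaining high-frequency slices are thresholded to a sparse set of entries, and observed entries are re-imposed each iteration. The function name, the frequency cutoff `n_low`, the thresholding rule, and the project-then-impute loop are illustrative assumptions, not the paper's actual estimator or solver.

```python
import numpy as np

def flost_complete_sketch(T_obs, mask, rank=5, n_low=8, sparse_frac=0.05, n_iters=100):
    """Hypothetical FLoST-style completion sketch (not the authors' algorithm).

    T_obs : (n1, n2, n3) array holding the observed entries (values elsewhere ignored)
    mask  : boolean (n1, n2, n3) array, True where entries are observed
    The temporal mode is assumed to be the last axis.
    """
    X = np.where(mask, T_obs, 0.0)
    for _ in range(n_iters):
        # Frequency slices along the temporal mode (real FFT keeps n3//2 + 1 slices).
        F = np.fft.rfft(X, axis=2)
        for k in range(F.shape[2]):
            if k < n_low:
                # Low-frequency slice: truncated SVD gives the low-rank component.
                U, s, Vh = np.linalg.svd(F[:, :, k], full_matrices=False)
                F[:, :, k] = (U[:, :rank] * s[:rank]) @ Vh[:rank, :]
            else:
                # High-frequency slice: keep only the largest entries (sparse component).
                S = F[:, :, k]
                thresh = np.quantile(np.abs(S), 1.0 - sparse_frac)
                F[:, :, k] = np.where(np.abs(S) >= thresh, S, 0.0)
        # Back to the time domain, then re-impose the observed entries.
        X = np.fft.irfft(F, n=T_obs.shape[2], axis=2)
        X = np.where(mask, T_obs, X)
    return X
```

As a usage note, the cutoff `n_low` controls how many frequency slices are modeled as low-rank versus sparse; setting it to all slices would recover a tubal-rank-style model, which is the parameter-heavy baseline the abstract contrasts against.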
Similar Papers
Traffic Flow Data Completion and Anomaly Diagnosis via Sparse and Low-Rank Tensor Optimization
Optimization and Control
Fixes broken traffic data to find problems.
Global and Local Structure Learning for Sparse Tensor Completion
Machine Learning (CS)
Fills in missing data by learning patterns.
Tensor Train Completion from Fiberwise Observations Along a Single Mode
Numerical Analysis
Reconstructs missing data using patterns.