CoVariance Filters and Neural Networks over Hilbert Spaces
By: Claudio Battiloro, Andrea Cavallo, Elvin Isufi
Potential Business Impact:
Teaches computers to learn from complex, changing data.
CoVariance Neural Networks (VNNs) perform graph convolutions on the empirical covariance matrix of signals defined over finite-dimensional Hilbert spaces, motivated by their robustness and transferability properties. Yet, little is known about how these arguments extend to infinite-dimensional Hilbert spaces. In this work, we take a first step by introducing a novel convolutional learning framework for signals defined over infinite-dimensional Hilbert spaces, centered on the (empirical) covariance operator. We constructively define Hilbert coVariance Filters (HVFs) and design Hilbert coVariance Networks (HVNs) as stacks of HVF filterbanks with nonlinear activations. We propose a principled discretization procedure and prove that empirical HVFs can recover the Functional PCA (FPCA) of the filtered signals. We then illustrate the versatility of the framework with examples ranging from multivariate real-valued functions to reproducing kernel Hilbert spaces. Finally, we validate HVNs on both synthetic and real-world time-series classification tasks, showing robust performance compared to MLP and FPCA-based classifiers.
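To make the construction concrete, below is a minimal Python sketch of a discretized coVariance filter, assuming the standard polynomial form H(C)x = sum_k h_k C^k x used by VNNs, with C the empirical covariance matrix of signals sampled on a grid of n points; the names (empirical_covariance, hvf, filter_taps) and the toy data are illustrative, not taken from the paper.

import numpy as np

def empirical_covariance(X):
    # Empirical covariance matrix of signals X, shape (num_signals, n).
    Xc = X - X.mean(axis=0, keepdims=True)
    return Xc.T @ Xc / X.shape[0]

def hvf(C, x, filter_taps):
    # Apply the polynomial covariance filter sum_k h_k C^k to a signal x.
    out = np.zeros_like(x)
    Ck_x = x.copy()              # starts at C^0 x = x
    for h_k in filter_taps:
        out += h_k * Ck_x
        Ck_x = C @ Ck_x          # advance to the next power of C
    return out

# Toy usage: 200 functional signals sampled on a grid of n = 64 points.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 64)
X = np.sin(2 * np.pi * np.outer(rng.uniform(1, 3, size=200), t))
C = empirical_covariance(X)
y = hvf(C, X[0], filter_taps=[0.5, 0.3, 0.2])  # order-2 filter

An HVN layer, as the abstract describes, would then apply a bank of such filters followed by a pointwise nonlinearity (e.g. tanh), with layers stacked to form the full network.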
Similar Papers
Precision Neural Networks: Joint Graph And Relational Learning
Machine Learning (CS)
Helps computers learn better by understanding data connections.
Covariance Scattering Transforms
Machine Learning (CS)
Finds hidden patterns in data, even with little information.