Precision Neural Networks: Joint Graph And Relational Learning
By: Andrea Cavallo, Samuel Rey, Antonio G. Marques, and more
Potential Business Impact:
Helps computers learn better by understanding data connections.
CoVariance Neural Networks (VNNs) perform convolutions on the graph determined by the covariance matrix of the data, which enables expressive and stable covariance-based learning. However, covariance matrices are typically dense, fail to encode conditional independence, and are often precomputed in a task-agnostic way, which may hinder performance. To overcome these limitations, we study Precision Neural Networks (PNNs), i.e., VNNs on the precision matrix -- the inverse covariance. The precision matrix naturally encodes statistical independence, often exhibits sparsity, and preserves the spectral structure of the covariance. To make precision estimation task-aware, we formulate an optimization problem that jointly learns the network parameters and the precision matrix, and solve it via alternating optimization, sequentially updating the network weights and the precision estimate. We theoretically bound the distance between the estimated and true precision matrices at each iteration, and demonstrate the effectiveness of joint estimation over two-step approaches on synthetic and real-world data.
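The alternating scheme in the abstract can be sketched in a minimal NumPy example. This is a simplified illustration, not the authors' implementation: the "network" is reduced to a single polynomial graph filter sum_k h_k P^k x on the precision matrix P, the weight update is a least-squares fit (the filter is linear in h), and the precision update is an approximate first-order gradient step on the task loss that uses only the k=1 term. All variable names (`graph_filter`, `P`, `h`, etc.) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data with a sparse, tridiagonal ground-truth precision matrix
n, T = 8, 500
P_true = np.eye(n) + 0.3 * np.diag(np.ones(n - 1), 1) + 0.3 * np.diag(np.ones(n - 1), -1)
X = rng.multivariate_normal(np.zeros(n), np.linalg.inv(P_true), size=T)  # (T, n)

def graph_filter(P, h, x):
    """Polynomial graph filter sum_k h_k P^k x (the convolution used by VNNs/PNNs)."""
    out, Pk = np.zeros_like(x), np.eye(P.shape[0])
    for hk in h:
        out += hk * (x @ Pk.T)
        Pk = Pk @ P
    return out

# Targets generated by a filter on the true precision (a stand-in regression task)
h_true = np.array([0.5, -0.2, 0.1])
Y = graph_filter(P_true, h_true, X)

# Initial task-agnostic precision estimate: regularized inverse sample covariance
C_hat = X.T @ X / T
P = np.linalg.inv(C_hat + 0.1 * np.eye(n))
h = np.zeros(3)

for it in range(50):
    # Step 1: with P fixed, the filter is linear in h -> closed-form least squares
    B = np.stack([X @ np.linalg.matrix_power(P, k).T for k in range(3)], axis=-1)
    h = np.linalg.lstsq(B.reshape(-1, 3), Y.reshape(-1), rcond=None)[0]

    # Step 2: with h fixed, take a small symmetric gradient step on P
    # (first-order term only -- a crude proxy for the paper's precision update)
    residual = graph_filter(P, h, X) - Y
    grad_P = 2 * h[1] * (residual.T @ X) / T
    P = P - 0.1 * (grad_P + grad_P.T) / 2
```

The key property this mimics is that the precision estimate is no longer frozen after a preprocessing step: it keeps being refined against the same loss that trains the filter weights, which is what distinguishes the joint approach from two-step pipelines.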
Similar Papers
Covariance Density Neural Networks
Machine Learning (CS)
Finds brain signals better for mind-controlled devices.
CoVariance Filters and Neural Networks over Hilbert Spaces
Machine Learning (CS)
Teaches computers to learn from complex, changing data.