Non-negative DAG Learning from Time-Series Data
By: Samuel Rey, Gonzalo Mateos
Potential Business Impact:
Finds hidden causes in changing data.
This work aims to learn the directed acyclic graph (DAG) that captures the instantaneous dependencies underlying a multivariate time series. The observed data follow a linear structural vector autoregressive model (SVARM) with both instantaneous and time-lagged dependencies, where the instantaneous structure is modeled by a DAG to reflect potential causal relationships. While recent continuous relaxation approaches impose acyclicity through smooth constraint functions involving powers of the adjacency matrix, they lead to non-convex optimization problems that are challenging to solve. In contrast, we assume that the underlying DAG has only non-negative edge weights, and leverage this additional structure to impose acyclicity via a convex constraint. This enables us to cast the problem of non-negative DAG recovery from multivariate time-series data as a convex optimization problem in abstract form, which we solve using the method of multipliers. Crucially, the convex formulation guarantees global optimality of the solution. Finally, we assess the performance of the proposed method on synthetic time-series data, where it outperforms existing alternatives.
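To make the setup concrete, the sketch below simulates data from a linear SVARM whose instantaneous structure is a non-negative DAG, and checks acyclicity through nilpotency (for a non-negative adjacency matrix, the graph is acyclic exactly when the traces of its powers vanish, since trace(W^k) counts weighted directed cycles of length k). All variable names (`A_inst`, `B_lag`, `is_dag_nonneg`) are illustrative assumptions, not the paper's notation, and the nilpotency test is a generic check, not the convex constraint the authors propose.

```python
import numpy as np

rng = np.random.default_rng(0)
d, T = 5, 200  # number of nodes, number of time samples

# Non-negative DAG adjacency for instantaneous effects: a strictly
# upper-triangular non-negative matrix is acyclic by construction.
A_inst = np.triu(rng.uniform(0.2, 0.8, size=(d, d)), k=1)

# Time-lagged dependency matrix (a single lag, for simplicity).
B_lag = rng.uniform(-0.3, 0.3, size=(d, d))

def is_dag_nonneg(W, tol=1e-9):
    """For non-negative W: acyclic iff W is nilpotent, i.e.
    trace(W^k) = 0 for k = 1..n (no directed cycles of any length)."""
    n = W.shape[0]
    M = np.eye(n)
    total = 0.0
    for _ in range(n):
        M = M @ W
        total += np.trace(M)
    return abs(total) < tol

# Linear SVARM with instantaneous and one-lag terms:
#   x_t = A^T x_t + B^T x_{t-1} + e_t
# which solves to x_t = (I - A^T)^{-1} (B^T x_{t-1} + e_t).
X = np.zeros((T, d))
Minv = np.linalg.inv(np.eye(d) - A_inst.T)
for t in range(1, T):
    e = rng.normal(scale=0.1, size=d)
    X[t] = Minv @ (B_lag.T @ X[t - 1] + e)
```

Recovering `A_inst` and `B_lag` from `X` alone is the estimation problem the paper addresses; the point here is only that restricting the instantaneous graph to non-negative weights lets acyclicity be expressed without the non-convex matrix-power penalties used by smooth relaxation methods.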
Similar Papers
Learning Time-Varying Graphs from Incomplete Graph Signals
Machine Learning (Stat)
Fixes broken data by finding hidden connections.
On Conditional Independence Graph Learning From Multi-Attribute Gaussian Dependent Time Series
Machine Learning (Stat)
Helps computers understand complex data relationships.
Convex Mixed-Integer Programming for Causal Additive Models with Optimization and Statistical Guarantees
Methodology
Finds hidden connections between things from data.