Entropic Analysis of Time Series through Kernel Density Estimation

Published: March 24, 2025 | arXiv ID: 2503.18916v1

By: Audun Myers, Bill Kay, Iliana Alvarez, and more

Potential Business Impact:

Detects hidden changes and patterns in signals such as RF transmissions and heartbeat (ECG) recordings.

Business Areas:
Text Analytics, Data and Analytics, Software

This work presents a novel framework for time series analysis using entropic measures based on the kernel density estimate (KDE) of the time series' Takens' embeddings. Using this framework we introduce two distinct analytical tools: (1) a multi-scale KDE entropy metric, denoted $\Delta\text{KE}$, which quantifies the evolution of time series complexity across different scales by measuring changes in the embedding's KDE entropy, and (2) a sliding baseline method that employs the Kullback-Leibler (KL) divergence to detect changes in time series dynamics through changes in KDEs. The $\Delta\text{KE}$ metric offers insights into the information content and "unfolding" properties of the time series' embedding related to dynamical systems, while the KL divergence-based approach provides a noise- and outlier-robust method for identifying time series change points (e.g., injections in RF signals). We demonstrate the versatility and effectiveness of these tools through a set of experiments encompassing diverse domains. In the space of radio frequency (RF) signal processing, we achieve accurate detection of signal injections under varying noise and interference conditions. Furthermore, we apply our methodology to electrocardiography (ECG) data, successfully identifying instances of ventricular fibrillation with high accuracy. Finally, we demonstrate the potential of our tools for dynamic state detection by accurately identifying chaotic regimes within an intermittent signal. These results show the broad applicability of our framework for extracting meaningful insights from complex time series data across various scientific disciplines.
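To make the two tools concrete, the sketch below shows one plausible way to compute a KDE entropy over a Takens embedding and a KL-divergence sliding-baseline change score. It is not the authors' reference implementation: the embedding parameters, window lengths, the resubstitution entropy estimator, and the Monte Carlo KL estimate are all illustrative assumptions, and the exact $\Delta\text{KE}$ construction should be taken from the paper itself.

```python
# Illustrative sketch only (not the paper's reference code).
# Assumptions: Gaussian KDE (scipy), resubstitution entropy estimate,
# Monte Carlo KL estimate, and arbitrary embedding/window parameters.
import numpy as np
from scipy.stats import gaussian_kde


def takens_embedding(x, dim=3, tau=1):
    """Delay-embed a 1-D series into R^dim with delay tau (rows = points)."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])


def kde_entropy(points):
    """Entropy of a KDE fit to the embedded points, estimated by
    resubstitution: H ~ -mean(log p(x_i)) over the fitted points."""
    kde = gaussian_kde(points.T)
    return -np.mean(kde.logpdf(points.T))


def kl_divergence(baseline_pts, window_pts):
    """Monte Carlo estimate of KL(baseline || window), using the baseline
    embedding points as samples from the baseline KDE."""
    p = gaussian_kde(baseline_pts.T)
    q = gaussian_kde(window_pts.T)
    s = baseline_pts.T
    return np.mean(p.logpdf(s) - q.logpdf(s))


def sliding_baseline_kl(x, dim=3, tau=1, baseline_len=500, win_len=250, step=50):
    """Score each sliding window against a fixed baseline segment; a large
    KL score flags a change in the underlying dynamics."""
    base = takens_embedding(x[:baseline_len], dim, tau)
    scores = []
    for start in range(baseline_len, len(x) - win_len, step):
        win = takens_embedding(x[start : start + win_len], dim, tau)
        scores.append((start, kl_divergence(base, win)))
    return scores


if __name__ == "__main__":
    # Synthetic demo: a noisy sinusoid with a second component "injected"
    # halfway through, loosely mimicking an RF injection scenario.
    rng = np.random.default_rng(0)
    t = np.arange(4000)
    x = np.sin(0.05 * t) + 0.1 * rng.standard_normal(t.size)
    x[2500:] += np.sin(0.3 * t[2500:])
    print("baseline KDE entropy:", kde_entropy(takens_embedding(x[:500])))
    for start, kl in sliding_baseline_kl(x):
        print(start, round(float(kl), 3))
```

In this toy run the KL score stays near zero while the window matches the baseline dynamics and rises once the injected component appears; a multi-scale entropy curve in the spirit of $\Delta\text{KE}$ could be obtained by sweeping the embedding delay or dimension and tracking how `kde_entropy` changes.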

Page Count
20 pages

Category
Computer Science:
Information Theory