DeepFeature: Iterative Context-aware Feature Generation for Wearable Biosignals

Published: December 9, 2025 | arXiv ID: 2512.08379v1

By: Kaiwei Liu, Yuting He, Bufang Yang, and more

Potential Business Impact:

Makes health trackers understand your body better.

Business Areas:
Biometrics, Biotechnology, Data and Analytics, Science and Engineering

Biosignals collected from wearable devices are widely used in healthcare applications. Machine learning models in these applications often rely on features extracted from biosignals because of their effectiveness, lower data dimensionality, and wide compatibility across model architectures. However, existing feature extraction methods often lack task-specific contextual knowledge, struggle to identify optimal feature extraction settings in a high-dimensional feature space, and are prone to code generation and automation errors. In this paper, we propose DeepFeature, the first LLM-empowered, context-aware feature generation framework for wearable biosignals. DeepFeature introduces a multi-source feature generation mechanism that integrates expert knowledge with task settings. It also employs an iterative feature refinement process that uses feature assessment-based feedback for feature re-selection. Additionally, DeepFeature applies multi-layer filtering and verification for robust feature-to-code translation, ensuring that the generated extraction functions run without crashing. Experimental results show that DeepFeature achieves an average AUROC improvement of 4.21-9.67% across eight diverse tasks compared to baseline methods. It outperforms state-of-the-art approaches on five tasks while maintaining comparable performance on the remaining tasks.
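The abstract's pipeline (generate candidate features, defensively verify the generated extraction code, score features, and re-select iteratively) can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: the feature pool, the `safe_extract` guard, and the correlation-style scoring proxy are all stand-ins for the LLM-generated extractors, the multi-layer verification, and the AUROC-based assessment described in the paper.

```python
import statistics

# Stand-ins for LLM-generated feature extractors: each maps a 1-D
# biosignal window (list of floats) to a scalar feature value.
FEATURE_POOL = {
    "mean":  lambda w: statistics.fmean(w),
    "stdev": lambda w: statistics.pstdev(w),
    "max":   max,
    "min":   min,
    "range": lambda w: max(w) - min(w),
}

def safe_extract(fn, window):
    """Verification stand-in: run a generated extractor defensively so a
    buggy function cannot crash the pipeline; reject exceptions and NaN."""
    try:
        v = float(fn(window))
        return v if v == v else None  # NaN != NaN
    except Exception:
        return None

def score_feature(values, labels):
    """Assessment stand-in: standardized class-mean separation between
    binary labels (a rough proxy for AUROC; assumes both classes occur)."""
    if None in values or len(set(values)) < 2:
        return 0.0  # extraction failed or feature is constant
    mu1 = statistics.fmean(v for v, y in zip(values, labels) if y == 1)
    mu0 = statistics.fmean(v for v, y in zip(values, labels) if y == 0)
    return abs(mu1 - mu0) / (statistics.pstdev(values) + 1e-12)

def refine(windows, labels, rounds=3, keep=3):
    """Iterative refinement loop: score every surviving feature on the
    data, then re-select the top `keep` for the next round."""
    selected = dict(FEATURE_POOL)
    for _ in range(rounds):
        scores = {
            name: score_feature([safe_extract(fn, w) for w in windows], labels)
            for name, fn in selected.items()
        }
        ranked = sorted(scores, key=scores.get, reverse=True)
        selected = {name: selected[name] for name in ranked[:keep]}
    return list(selected)

# Toy usage: class-1 windows carry higher amplitude, so location
# features (mean, max, min) separate the classes; shape features do not.
windows = [[0.0, 1.0, 0.0], [0.5, 1.5, 0.5], [5.0, 6.0, 5.0], [4.0, 5.5, 4.5]]
labels = [0, 0, 1, 1]
print(refine(windows, labels))
```

In the actual framework the pool would be populated by LLM-generated extraction code grounded in expert knowledge and task context, and the feedback signal would drive feature *regeneration*, not just pruning; the loop structure, however, is the same.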

Page Count
15 pages

Category
Computer Science:
Artificial Intelligence