Multi-Frequency Federated Learning for Human Activity Recognition Using Head-Worn Sensors
By: Dario Fenoglio, Mohan Li, Davide Casnici, and more
Potential Business Impact:
Lets earbuds learn about your health privately.
Human Activity Recognition (HAR) benefits various application domains, including health and elderly care. Traditional HAR pipelines rely on centralized user data, which raises privacy concerns because raw sensor data must be uploaded to a central server. This work proposes multi-frequency Federated Learning (FL) to enable: (1) privacy-aware ML; (2) joint model learning across devices with varying sampling frequencies. We focus on head-worn devices (e.g., earbuds and smart glasses), a relatively unexplored domain compared to traditional smartwatch- or smartphone-based HAR. Results show improvements over frequency-specific approaches on two datasets, indicating a promising future for the multi-frequency FL-HAR task. The proposed network's implementation is publicly available for further research and development.
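To make the multi-frequency FL idea concrete, below is a minimal sketch of one federated round in which head-worn clients record at different sampling rates: each client resamples its raw signal to an assumed common frequency, extracts simple window features, runs a local update, and the server averages the client models (FedAvg-style). The resampler, the logistic-regression model, the 50 Hz target rate, and the 25 Hz/100 Hz client rates are illustrative assumptions, not the paper's actual architecture.

```python
# Sketch of multi-frequency federated HAR: resample -> local update -> FedAvg.
# All rates, window sizes, features, and the model are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
COMMON_HZ = 50      # assumed shared frequency after resampling
WINDOW_S = 2.0      # assumed window length in seconds

def resample(signal, src_hz, dst_hz=COMMON_HZ):
    """Linear interpolation onto a common time grid (toy stand-in for a real resampler)."""
    duration = len(signal) / src_hz
    src_t = np.arange(len(signal)) / src_hz
    dst_t = np.arange(int(duration * dst_hz)) / dst_hz
    return np.interp(dst_t, src_t, signal)

def windows_to_features(signal):
    """Split into fixed-length windows; use [mean, std, bias] as toy features."""
    step = int(WINDOW_S * COMMON_HZ)
    n = len(signal) // step
    win = signal[: n * step].reshape(n, step)
    return np.column_stack([win.mean(axis=1), win.std(axis=1), np.ones(n)])

def local_update(w, x, y, lr=0.5, epochs=20):
    """Client-side training: logistic regression on local windows only."""
    w = w.copy()
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(x @ w)))
        w -= lr * x.T @ (p - y) / len(y)
    return w

def fed_avg(weights, sizes):
    """Server aggregation: size-weighted mean of client weight vectors."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(weights, sizes))

# Two toy clients, e.g. earbuds at 25 Hz and smart glasses at 100 Hz, each
# observing "still" (low-variance) vs. "moving" (high-variance) segments.
clients = []
for src_hz in (25, 100):
    still = rng.normal(0.0, 0.1, size=int(20 * src_hz))
    moving = rng.normal(0.0, 1.0, size=int(20 * src_hz))
    x = np.vstack([windows_to_features(resample(still, src_hz)),
                   windows_to_features(resample(moving, src_hz))])
    y = np.concatenate([np.zeros(len(x) // 2), np.ones(len(x) // 2)])
    clients.append((x, y))

global_w = np.zeros(3)
for _ in range(10):  # federated rounds: raw sensor data never leaves a client
    local_ws = [local_update(global_w, x, y) for x, y in clients]
    global_w = fed_avg(local_ws, [len(y) for _, y in clients])
print("global model weights:", global_w)
```

The key point the sketch illustrates is that harmonizing sampling frequencies on-device lets a single shared model be trained jointly, while only model weights (never raw head-worn sensor data) are exchanged with the server.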
Similar Papers
MHARFedLLM: Multimodal Human Activity Recognition Using Federated Large Language Model
Machine Learning (CS)
Helps computers understand what people are doing.
GraMFedDHAR: Graph Based Multimodal Differentially Private Federated HAR
Machine Learning (CS)
Helps computers understand actions from many sensors.
A Novel Deep Hybrid Framework with Ensemble-Based Feature Optimization for Robust Real-Time Human Activity Recognition
CV and Pattern Recognition
Helps computers understand what people are doing.