DySTAN: Joint Modeling of Sedentary Activity and Social Context from Smartphone Sensors
By: Aditya Sneh, Nilesh Kumar Sahu, Snehil Gupta, and more
Potential Business Impact:
Helps phones understand what you're doing and who's with you.
Accurately recognizing human context from smartphone sensor data remains a significant challenge, especially in sedentary settings where activities such as studying, attending lectures, relaxing, and eating exhibit highly similar inertial patterns. Furthermore, social context plays a critical role in understanding user behavior, yet is often overlooked in mobile sensing research. To address these gaps, we introduce LogMe, a mobile sensing application that passively collects smartphone sensor data (accelerometer, gyroscope, magnetometer, and rotation vector) and prompts users for hourly self-reports capturing both sedentary activity and social context. Using this dual-label dataset, we propose DySTAN (Dynamic Cross-Stitch with Task Attention Network), a multi-task learning framework that jointly classifies both context dimensions from shared sensor inputs. It integrates task-specific layers with cross-task attention to model subtle distinctions effectively. DySTAN improves sedentary activity macro F1 scores by 21.8% over a single-task CNN-BiLSTM-GRU (CBG) model and by 8.2% over the strongest multi-task baseline, Sluice Network (SN). These results demonstrate the importance of modeling multiple, co-occurring context dimensions to improve the accuracy and robustness of mobile context recognition.
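The abstract names two mechanisms, cross-stitch feature sharing and cross-task attention, without giving the architecture's details. The sketch below is an illustrative assumption of how such a dual-task model could be wired in PyTorch: a shared temporal encoder over raw inertial windows, task-specific branches mixed by a learned cross-stitch matrix, and an attention step that lets each task attend to the other's features. All layer sizes, class counts, and the model name `DySTANSketch` are hypothetical, not the authors' implementation.

```python
# Hedged sketch of a DySTAN-like multi-task model. Layer shapes and class
# counts (4 sedentary activities, 3 social contexts) are illustrative only.
import torch
import torch.nn as nn

class CrossStitch(nn.Module):
    """Learned 2x2 mixing of two tasks' feature vectors (cross-stitch style)."""
    def __init__(self):
        super().__init__()
        # Initialized near identity so each task starts mostly with its own features.
        self.alpha = nn.Parameter(torch.tensor([[0.9, 0.1], [0.1, 0.9]]))

    def forward(self, fa, fb):
        ga = self.alpha[0, 0] * fa + self.alpha[0, 1] * fb
        gb = self.alpha[1, 0] * fa + self.alpha[1, 1] * fb
        return ga, gb

class DySTANSketch(nn.Module):
    def __init__(self, n_channels=10, n_activity=4, n_social=3, hidden=64):
        super().__init__()
        # Shared temporal encoder over sensor windows shaped (batch, channels, time).
        self.encoder = nn.Sequential(
            nn.Conv1d(n_channels, hidden, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # Task-specific projections for activity and social context.
        self.act_branch = nn.Linear(hidden, hidden)
        self.soc_branch = nn.Linear(hidden, hidden)
        self.stitch = CrossStitch()
        # Cross-task attention: each task attends over both tasks' features.
        self.attn = nn.MultiheadAttention(hidden, num_heads=4, batch_first=True)
        self.act_head = nn.Linear(hidden, n_activity)
        self.soc_head = nn.Linear(hidden, n_social)

    def forward(self, x):
        shared = self.encoder(x).squeeze(-1)   # (batch, hidden)
        fa = torch.relu(self.act_branch(shared))
        fb = torch.relu(self.soc_branch(shared))
        fa, fb = self.stitch(fa, fb)
        # Treat the two task features as a length-2 sequence, attend across tasks.
        seq = torch.stack([fa, fb], dim=1)     # (batch, 2, hidden)
        mixed, _ = self.attn(seq, seq, seq)
        return self.act_head(mixed[:, 0]), self.soc_head(mixed[:, 1])

model = DySTANSketch()
x = torch.randn(8, 10, 100)  # 8 windows, 10 sensor channels, 100 timesteps
act_logits, soc_logits = model(x)
print(act_logits.shape, soc_logits.shape)
```

The cross-stitch matrix starts near the identity so each task initially relies on its own branch and only learns to borrow the other task's features where that reduces joint loss, which is one plausible way to model the co-occurring context dimensions the paper emphasizes.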
Similar Papers
Facilitating Individuals' Sensemaking about Sedentary Behavior via Contextualized Data
Human-Computer Interaction
Helps people move more by showing activity info.
Human Activity Recognition from Smartphone Sensor Data for Clinical Trials
Machine Learning (CS)
Phone app spots how you move.
Robust In-the-Wild Exercise Recognition from a Single Wearable: Data-Side Fusion, Sensor Rotation, and Feature Engineering
Human-Computer Interaction
Lets one sensor track exercises accurately.