DailyLLM: Context-Aware Activity Log Generation Using Multi-Modal Sensors and LLMs
By: Ye Tian, Xiaoyuan Ren, Zihao Wang, et al.
Potential Business Impact:
Makes phones understand your daily life better.
Rich, context-aware activity logs facilitate user behavior analysis and health monitoring, making them a key research focus in ubiquitous computing. The remarkable semantic understanding and generation capabilities of Large Language Models (LLMs) have recently created new opportunities for activity log generation. However, existing methods still fall short in accuracy, efficiency, and semantic richness. To address these challenges, we propose DailyLLM. To the best of our knowledge, this is the first log generation and summarization system that comprehensively integrates contextual activity information across four dimensions: location, motion, environment, and physiology, using only sensors commonly available on smartphones and smartwatches. To achieve this, DailyLLM introduces a lightweight LLM-based framework that combines structured prompting with efficient feature extraction to enable high-level activity understanding. Extensive experiments demonstrate that DailyLLM outperforms state-of-the-art (SOTA) log generation methods and can be efficiently deployed on personal computers and Raspberry Pi. Using only a 1.5B-parameter LLM, DailyLLM achieves a 17% improvement in log generation BERTScore precision over the 70B-parameter SOTA baseline, while delivering nearly 10x faster inference.
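To illustrate the kind of pipeline the abstract describes, here is a minimal sketch of lightweight feature extraction followed by structured prompting across the four context dimensions. All function and field names are illustrative assumptions, not the paper's actual API; a real system would feed the resulting prompt to the LLM.

```python
# Hedged sketch: names and the feature set (mean/range) are assumptions,
# not DailyLLM's actual implementation.

def extract_features(sensor_window):
    """Reduce a raw sensor window to compact per-dimension statistics.

    `sensor_window` maps each context dimension to a list of numeric
    readings; keeping only mean and range mirrors the idea of
    efficient, low-cost feature extraction before prompting.
    """
    features = {}
    for dim, values in sensor_window.items():
        features[dim] = {
            "mean": round(sum(values) / len(values), 2),
            "range": round(max(values) - min(values), 2),
        }
    return features

def build_prompt(features):
    """Assemble a structured prompt covering all four context dimensions."""
    lines = [
        "You are an activity-log assistant.",
        "Summarize the user's activity from these sensor features:",
    ]
    for dim in ("location", "motion", "environment", "physiology"):
        stats = features[dim]
        lines.append(f"- {dim}: mean={stats['mean']}, range={stats['range']}")
    lines.append("Write one concise, context-aware log entry.")
    return "\n".join(lines)

# Example window with hypothetical readings per dimension.
window = {
    "location": [40.71, 40.71, 40.72],   # e.g. latitude samples
    "motion": [1.2, 3.4, 2.8],           # e.g. accelerometer magnitude
    "environment": [55.0, 57.0, 56.0],   # e.g. ambient noise (dB)
    "physiology": [72.0, 75.0, 74.0],    # e.g. heart rate (bpm)
}
prompt = build_prompt(extract_features(window))
print(prompt)
```

The structured, dimension-per-line layout is one plausible way to keep prompts short enough for a small on-device model while still exposing all four context sources.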
Similar Papers
Enhancing Smart Environments with Context-Aware Chatbots using Large Language Models
Computation and Language
Smart homes understand you and help you better.
Context-Aware Human Behavior Prediction Using Multimodal Large Language Models: Challenges and Insights
Robotics
Helps robots understand what people will do.
Using LLMs for Late Multimodal Sensor Fusion for Activity Recognition
Machine Learning (CS)
Lets computers understand actions from sound and movement.