SitLLM: Large Language Models for Sitting Posture Health Understanding via Pressure Sensor Data
By: Jian Gao, Fufangchen Zhao, Yiyang Zhang, and more
Potential Business Impact:
Helps chairs tell you how to sit better.
Poor sitting posture is a critical yet often overlooked factor contributing to long-term musculoskeletal disorders and physiological dysfunctions. Existing sitting posture monitoring systems, although leveraging visual, IMU, or pressure-based modalities, often suffer from coarse-grained recognition and lack the semantic expressiveness needed for personalized feedback. In this paper, we propose SitLLM, a lightweight multimodal framework that integrates flexible pressure sensing with large language models (LLMs) to enable fine-grained posture understanding and personalized, health-oriented response generation. SitLLM comprises three key components: (1) a Gaussian-Robust Sensor Embedding Module that partitions pressure maps into spatial patches and injects local noise perturbations for robust feature extraction; (2) a Prompt-Driven Cross-Modal Alignment Module that reprograms sensor embeddings into the LLM's semantic space via multi-head cross-attention over the pre-trained vocabulary embeddings; and (3) a Multi-Context Prompt Module that fuses feature-level, structure-level, statistical-level, and semantic-level contextual information to guide instruction comprehension.
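Since the paper's implementation is not reproduced here, the following is a minimal PyTorch sketch of how the first two components might look: patch-based embedding of a pressure map with Gaussian noise injected during training, and multi-head cross-attention that maps sensor tokens onto the LLM's vocabulary embeddings. All class names, dimensions, and the noise standard deviation are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch only; module names, sizes, and noise scale are assumed.
import torch
import torch.nn as nn


class GaussianRobustSensorEmbedding(nn.Module):
    """Partition a pressure map into spatial patches and inject local
    Gaussian noise during training for robust feature extraction."""

    def __init__(self, patch_size=8, embed_dim=256, noise_std=0.05):
        super().__init__()
        self.noise_std = noise_std
        # Non-overlapping patches via a strided convolution.
        self.patchify = nn.Conv2d(1, embed_dim,
                                  kernel_size=patch_size, stride=patch_size)

    def forward(self, pressure_map):  # (B, 1, H, W)
        if self.training:  # local noise perturbation, training only
            pressure_map = pressure_map + self.noise_std * torch.randn_like(pressure_map)
        tokens = self.patchify(pressure_map)       # (B, D, H/p, W/p)
        return tokens.flatten(2).transpose(1, 2)   # (B, N_patches, D)


class PromptDrivenAlignment(nn.Module):
    """Reprogram sensor embeddings into the LLM's semantic space via
    multi-head cross-attention over pre-trained vocabulary embeddings."""

    def __init__(self, embed_dim=256, llm_dim=4096, num_heads=8):
        super().__init__()
        self.to_llm = nn.Linear(embed_dim, llm_dim)
        self.cross_attn = nn.MultiheadAttention(llm_dim, num_heads,
                                                batch_first=True)

    def forward(self, sensor_tokens, vocab_embeddings):
        # Sensor tokens act as queries against the (frozen) vocabulary
        # embeddings, so the output lies in the LLM's token space.
        q = self.to_llm(sensor_tokens)                         # (B, N, llm_dim)
        v = vocab_embeddings.unsqueeze(0).expand(q.size(0), -1, -1)
        aligned, _ = self.cross_attn(q, v, v)
        return aligned  # ready to be prepended to the LLM prompt


if __name__ == "__main__":
    emb = GaussianRobustSensorEmbedding()
    align = PromptDrivenAlignment()
    x = torch.rand(2, 1, 32, 32)         # toy 32x32 pressure map batch
    vocab = torch.randn(1000, 4096)      # stand-in vocabulary embeddings
    out = align(emb(x), vocab)
    print(out.shape)                     # torch.Size([2, 16, 4096])
```

The third component, the Multi-Context Prompt Module, would then concatenate these aligned tokens with textual context (feature-, structure-, statistical-, and semantic-level descriptions) before they reach the LLM; its exact fusion scheme is not detailed in the abstract, so it is omitted here.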
Similar Papers
Exploring LLM-based Frameworks for Fault Diagnosis
Artificial Intelligence
AI watches machines, finds problems, explains why.
HealthSLM-Bench: Benchmarking Small Language Models for Mobile and Wearable Healthcare Monitoring
Artificial Intelligence
Lets health trackers predict problems privately.
SensorLM: Learning the Language of Wearable Sensors
Machine Learning (CS)
Lets smartwatches understand what you're doing.