SitLLM: Large Language Models for Sitting Posture Health Understanding via Pressure Sensor Data

Published: September 16, 2025 | arXiv ID: 2509.12994v1

By: Jian Gao, Fufangchen Zhao, Yiyang Zhang, and more

Potential Business Impact:

Helps chairs tell you how to sit better.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Poor sitting posture is a critical yet often overlooked factor contributing to long-term musculoskeletal disorders and physiological dysfunctions. Existing sitting posture monitoring systems, although leveraging visual, IMU, or pressure-based modalities, often suffer from coarse-grained recognition and lack the semantic expressiveness necessary for personalized feedback. In this paper, we propose SitLLM, a lightweight multimodal framework that integrates flexible pressure sensing with large language models (LLMs) to enable fine-grained posture understanding and personalized health-oriented response generation. SitLLM comprises three key components: (1) a Gaussian-Robust Sensor Embedding Module that partitions pressure maps into spatial patches and injects local noise perturbations for robust feature extraction; (2) a Prompt-Driven Cross-Modal Alignment Module that reprograms sensor embeddings into the LLM's semantic space via multi-head cross-attention using the pre-trained vocabulary embeddings; and (3) a Multi-Context Prompt Module that fuses feature-level, structure-level, statistical-level, and semantic-level contextual information to guide instruction comprehension.
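To make the first two components more concrete, below is a minimal PyTorch sketch of the ideas the abstract describes: patch-based embedding of a pressure map with Gaussian noise injection, and cross-attention that aligns sensor tokens against a frozen LLM vocabulary embedding table. All shapes, hidden sizes, and module names are illustrative assumptions, not the authors' implementation.

```python
# Sketch of two SitLLM ideas from the abstract (assumed design, not the paper's code):
# (1) Gaussian-robust patch embedding of a pressure map,
# (2) prompt-driven alignment of sensor tokens via multi-head cross-attention
#     over pre-trained LLM vocabulary embeddings.
import torch
import torch.nn as nn


class GaussianRobustPatchEmbed(nn.Module):
    """Split a pressure map into patches and add Gaussian noise during training."""

    def __init__(self, patch_size: int = 4, embed_dim: int = 256, noise_std: float = 0.05):
        super().__init__()
        self.noise_std = noise_std
        # Non-overlapping patches via a strided convolution, ViT-style.
        self.proj = nn.Conv2d(1, embed_dim, kernel_size=patch_size, stride=patch_size)

    def forward(self, pressure: torch.Tensor) -> torch.Tensor:
        # pressure: (B, 1, H, W) map from the flexible pressure sensor sheet
        if self.training and self.noise_std > 0:
            pressure = pressure + self.noise_std * torch.randn_like(pressure)
        tokens = self.proj(pressure)              # (B, D, H/ps, W/ps)
        return tokens.flatten(2).transpose(1, 2)  # (B, N_patches, D)


class PromptDrivenAligner(nn.Module):
    """Reprogram sensor tokens into the LLM space by attending over vocabulary embeddings."""

    def __init__(self, sensor_dim: int = 256, llm_dim: int = 768, num_heads: int = 8):
        super().__init__()
        self.to_llm = nn.Linear(sensor_dim, llm_dim)
        self.cross_attn = nn.MultiheadAttention(llm_dim, num_heads, batch_first=True)

    def forward(self, sensor_tokens: torch.Tensor, vocab_embeds: torch.Tensor) -> torch.Tensor:
        # sensor_tokens: (B, N, sensor_dim); vocab_embeds: (V, llm_dim) frozen LLM word embeddings
        q = self.to_llm(sensor_tokens)                          # queries from sensor patches
        kv = vocab_embeds.unsqueeze(0).expand(q.size(0), -1, -1)
        aligned, _ = self.cross_attn(q, kv, kv)                 # (B, N, llm_dim)
        return aligned                                          # ready to prepend to the text prompt


if __name__ == "__main__":
    B, H, W, V = 2, 32, 32, 1000            # toy sizes; a real LLM vocabulary is far larger
    embed = GaussianRobustPatchEmbed()
    align = PromptDrivenAligner()
    pressure = torch.rand(B, 1, H, W)
    vocab = torch.randn(V, 768)              # stand-in for frozen pre-trained vocabulary embeddings
    print(align(embed(pressure), vocab).shape)  # torch.Size([2, 64, 768])
```

In this reading, the aligned sensor tokens act like soft prompt tokens in the LLM's embedding space, which is what lets a frozen language model reason over pressure data without retraining its backbone.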

Country of Origin
🇨🇳 China

Page Count
10 pages

Category
Computer Science:
Computation and Language