
A Nutrition Multimodal Photoplethysmography Language Model

Published: November 24, 2025 | arXiv ID: 2511.19260v1

By: Kyle Verrier, Achille Nazaret, Joseph Futoma, and more

Potential Business Impact:

Tracks eating habits by combining your pulse signal with descriptions of what you eat.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Hunger and satiety dynamics shape dietary behaviors and metabolic health, yet they remain difficult to capture in everyday settings. We present the Nutrition Photoplethysmography Language Model (NPLM), which integrates continuous photoplethysmography (PPG) from wearables with meal descriptions. NPLM projects PPG signals into embeddings interpretable by language models, enabling joint reasoning over physiology and meal context. Trained on data from 19,340 participants and 1.1 million meal-PPG pairs, the model improved daily caloric intake prediction by 11% over text-only baselines, and accuracy was maintained even when 80% of the meal text was removed. In an independent validation study (n=140) with controlled dining and detailed meal information, the model replicated these findings. These results demonstrate the value of integrating physiological measurements from consumer wearables with meal information for noninvasive dietary monitoring at scale.
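The core fusion idea in the abstract, projecting PPG-derived embeddings into a space the language model can consume alongside meal-text tokens, follows a common multimodal pattern. Below is a minimal, hypothetical PyTorch sketch of that general pattern; the encoder architecture, dimensions, and prefix-token fusion strategy here are assumptions for illustration, not the paper's actual NPLM implementation.

```python
# Hypothetical sketch of "project sensor embeddings into an LM's token space".
# All module names, dimensions, and the fusion strategy are illustrative
# assumptions; the paper's NPLM architecture may differ.
import torch
import torch.nn as nn

class PPGEncoder(nn.Module):
    """Toy 1D-CNN that turns a raw PPG window into a sequence of embeddings."""
    def __init__(self, d_model: int = 512):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 64, kernel_size=7, stride=2, padding=3),
            nn.GELU(),
            nn.Conv1d(64, d_model, kernel_size=7, stride=2, padding=3),
        )

    def forward(self, ppg: torch.Tensor) -> torch.Tensor:
        # ppg: (batch, samples) -> (batch, time_steps, d_model)
        return self.conv(ppg.unsqueeze(1)).transpose(1, 2)

class PPGToLMProjector(nn.Module):
    """Linear map from the PPG encoder space into the LM embedding space."""
    def __init__(self, d_model: int = 512, d_lm: int = 768):
        super().__init__()
        self.proj = nn.Linear(d_model, d_lm)

    def forward(self, ppg_emb: torch.Tensor) -> torch.Tensor:
        return self.proj(ppg_emb)

# Fusion: prepend projected PPG "tokens" to the meal-text token embeddings,
# then feed the combined sequence to a language model.
encoder = PPGEncoder()
projector = PPGToLMProjector()
ppg = torch.randn(2, 1024)          # two 1024-sample PPG windows (illustrative)
text_emb = torch.randn(2, 16, 768)  # stand-in for meal-text token embeddings
fused = torch.cat([projector(encoder(ppg)), text_emb], dim=1)
print(fused.shape)                  # (2, ppg_tokens + 16, 768)
```

Prepending projected sensor "tokens" to the text embedding sequence lets an off-the-shelf language model attend jointly over physiology and meal context without changing the model's architecture, which is one plausible way to realize the joint reasoning the abstract describes.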

Page Count
21 pages

Category
Computer Science: Machine Learning (CS)