Towards Attention-Aware Large Language Models: Integrating Real-Time Eye-Tracking and EEG for Adaptive AI Responses
By: Dan Zhang
Potential Business Impact:
Helps computers know when you're not paying attention.
This project proposes an attention-aware LLM system that integrates real-time EEG and eye-tracking data to monitor user attention dynamically. The system streams both signals into an LLM-based interactive application and classifies the user's attention state on the fly into one of five categories: High Attention, Stable Attention, Dropping Attention, Cognitive Overload, and Distraction. It then adapts its responses to each state, with particular focus on decreased attention, distraction, and cognitive overload, in order to improve user engagement and reduce cognitive load. A sketch of how such a classify-then-adapt loop could look is given below.
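To make the pipeline concrete, here is a minimal Python sketch of the classify-then-adapt idea: rule-based fusion of a few EEG and gaze features into the five attention states, and a mapping from state to a prompt adaptation for the LLM. Everything in it is an assumption for illustration; the feature set (theta/alpha ratio, beta power, gaze-on-screen fraction, blink rate), the thresholds, and names such as SignalWindow, classify_attention, and ADAPTATION_PROMPTS are hypothetical and not taken from the proposal, which does not specify its classification method.

```python
from dataclasses import dataclass
from enum import Enum, auto
from collections import deque

class AttentionState(Enum):
    HIGH_ATTENTION = auto()
    STABLE_ATTENTION = auto()
    DROPPING_ATTENTION = auto()
    COGNITIVE_OVERLOAD = auto()
    DISTRACTION = auto()

@dataclass
class SignalWindow:
    """Features extracted from one sliding window of EEG + eye-tracking data.

    All features and thresholds below are illustrative placeholders; a real
    system would calibrate them per user during a baseline session.
    """
    theta_alpha_ratio: float   # EEG theta/alpha band-power ratio (workload proxy)
    beta_power: float          # EEG beta band power (engagement proxy), ~0..1
    gaze_on_screen: float      # fraction of gaze samples on the task area, 0..1
    blink_rate: float          # blinks per minute (fatigue proxy)

def classify_attention(win: SignalWindow, history: deque) -> AttentionState:
    """Rule-based fusion of the current window into one of five states."""
    if win.gaze_on_screen < 0.5:
        return AttentionState.DISTRACTION          # eyes are mostly off the task
    if win.theta_alpha_ratio > 1.5 and win.blink_rate > 25:
        return AttentionState.COGNITIVE_OVERLOAD   # high workload + fatigue markers
    engagement = win.beta_power * win.gaze_on_screen
    history.append(engagement)
    if len(history) >= 3 and history[-1] < history[0] * 0.7:
        return AttentionState.DROPPING_ATTENTION   # sustained downward trend
    if engagement > 0.8:
        return AttentionState.HIGH_ATTENTION
    return AttentionState.STABLE_ATTENTION

# Each state maps to an instruction prepended to the LLM prompt, so the
# model's next response adapts to the user's current attention state.
ADAPTATION_PROMPTS = {
    AttentionState.HIGH_ATTENTION: "Continue at the current depth and pace.",
    AttentionState.STABLE_ATTENTION: "Maintain the current explanation style.",
    AttentionState.DROPPING_ATTENTION: "Re-engage the user: ask a short question or vary the format.",
    AttentionState.COGNITIVE_OVERLOAD: "Simplify: shorter sentences, one idea at a time.",
    AttentionState.DISTRACTION: "Pause new content and briefly recap the last point.",
}

def adapt_prompt(user_message: str, state: AttentionState) -> str:
    """Wrap the user's message with the state-specific adaptation instruction."""
    return f"[Attention: {state.name}] {ADAPTATION_PROMPTS[state]}\n\nUser: {user_message}"

# Example: one window suggesting overload, then the adapted prompt.
history = deque(maxlen=5)
win = SignalWindow(theta_alpha_ratio=1.8, beta_power=0.4,
                   gaze_on_screen=0.9, blink_rate=30)
print(adapt_prompt("Explain transformers.", classify_attention(win, history)))
```

A deployed version would likely replace the hand-set rules with a classifier trained on labeled EEG and gaze windows, but the structure stays the same: per-window feature extraction, state classification, and a state-conditioned adjustment to the LLM's response.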
Similar Papers
Multimodal Behavioral Patterns Analysis with Eye-Tracking and LLM-Based Reasoning
Human-Computer Interaction
Helps computers understand how people look at things.
Fine-Tuning Large Language Models Using EEG Microstate Features for Mental Workload Assessment
Human-Computer Interaction
Helps computers understand how much you're thinking.
GazeLLM: Multimodal LLMs incorporating Human Visual Attention
Human-Computer Interaction
Lets computers understand videos by watching eyes.