Decoding Workload and Agreement From EEG During Spoken Dialogue With Conversational AI
By: Lucija Mihić Zidar, Philipp Wicke, Praneel Bhatia, and more
Potential Business Impact:
Lets computers read your mental state from brain signals while you talk.
Passive brain-computer interfaces offer a potential source of implicit feedback for alignment of large language models, but most mental state decoding has been done in controlled tasks. This paper investigates whether established EEG classifiers for mental workload and implicit agreement can be transferred to spoken human-AI dialogue. We introduce two conversational paradigms, a Spelling Bee task and a sentence completion task, and an end-to-end pipeline for transcribing, annotating, and aligning word-level conversational events with continuous EEG classifier output. In a pilot study, workload decoding showed interpretable trends during spoken interaction, supporting cross-paradigm transfer. For implicit agreement, we demonstrate continuous application and precise temporal alignment to conversational events, while identifying limitations related to construct transfer and asynchronous application of event-based classifiers. Overall, the results establish feasibility and constraints for integrating passive BCI signals into conversational AI systems.
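The alignment step the abstract describes, mapping word-level conversational events onto a continuous EEG classifier output, can be sketched as follows. This is a minimal illustration, not the paper's pipeline: the function name, the fixed classifier sampling rate, and the one-second averaging window are all assumptions made for the example.

```python
def align_events_to_scores(word_events, classifier_scores,
                           sample_rate_hz=4.0, window_s=1.0):
    """Hypothetical alignment sketch: for each (word, onset_s) event,
    average the continuous classifier scores falling in a window that
    starts at the word onset. Sampling rate and window are assumed."""
    aligned = []
    for word, onset_s in word_events:
        # Convert event time (seconds) to classifier sample indices
        start = int(onset_s * sample_rate_hz)
        stop = int((onset_s + window_s) * sample_rate_hz)
        window = classifier_scores[start:stop]
        # Use None when the event falls outside the recorded score stream
        score = sum(window) / len(window) if window else None
        aligned.append((word, score))
    return aligned

# Toy usage: classifier output at 4 Hz, two word onsets one second apart
scores = [0.2, 0.2, 0.8, 0.8, 0.5, 0.5, 0.5, 0.5]
events = [("hello", 0.0), ("world", 1.0)]
print(align_events_to_scores(events, scores))
# → [('hello', 0.5), ('world', 0.5)]
```

In practice such an alignment would also need to handle the asynchrony the authors flag: an event-based classifier trained on stimulus-locked epochs has no guarantee that its continuous, sliding-window output is meaningful at arbitrary word onsets.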
Similar Papers
Neural Decoding of Overt Speech from ECoG Using Vision Transformers and Contrastive Representation Learning
Artificial Intelligence
Lets paralyzed people talk by reading brain signals.
On Creating A Brain-To-Text Decoder
Machine Learning (CS)
Lets computers understand what you're thinking.
EEG-to-Voice Decoding of Spoken and Imagined Speech Using Non-Invasive EEG
Signal Processing
Lets people talk by reading their brainwaves.