DIVER-1: Deep Integration of Vast Electrophysiological Recordings at Scale
By: Danny Dongyeop Han, Yonghyeon Gwon, Ahhyun Lucy Lee, et al.
Electrophysiology signals such as EEG and iEEG are central to neuroscience, brain-computer interfaces, and clinical applications, yet existing foundation models remain limited in scale despite clear evidence that scaling improves performance. We introduce DIVER-1, a family of EEG and iEEG foundation models trained on the largest and most diverse corpus to date (5.3k hours of iEEG and 54k hours of EEG, totaling 1.6M channel-hours from over 17.7k subjects) and scaled up to 1.82B parameters. We present the first systematic scaling law analysis for this domain, showing that these models follow data-constrained scaling laws: for a given amount of data and compute, smaller models trained for extended epochs consistently outperform larger models trained briefly. This behavior contrasts with prior electrophysiology foundation models, which emphasized model size over training duration. To achieve strong performance, we also introduce architectural innovations, including any-variate attention, sliding temporal conditional positional encoding, and multi-domain reconstruction. The DIVER-1 iEEG and EEG models each achieve state-of-the-art performance on their respective benchmarks, establishing concrete guidelines for efficient scaling and resource allocation in electrophysiology foundation model development.
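To make the data-constrained scaling claim concrete, here is a minimal sketch of the kind of loss model such an analysis typically fits, following the formulation of Muennighoff et al. (2023), which the abstract's terminology suggests: repeated epochs contribute "effective" data with exponentially diminishing returns. Everything numeric below is an illustrative placeholder, not DIVER-1's fitted values; the coefficients, the unique-data size, and the compute budget are assumptions, and the sketch simplifies by applying the repetition correction to the data term only.

```python
import math

# Illustrative Chinchilla-style coefficients; DIVER-1's fitted values
# are not reported in this abstract.
A, B, E = 406.4, 410.7, 1.69      # parameter term, data term, irreducible loss
ALPHA, BETA = 0.34, 0.28          # scaling exponents
R_D_STAR = 15.4                   # decay constant: how fast repeated data loses value


def effective_data(unique_data: float, epochs: float) -> float:
    """Effective dataset size D' after `epochs` passes over `unique_data` units.

    Repetitions beyond the first epoch add value with exponentially
    diminishing returns (Muennighoff et al., 2023).
    """
    repetitions = max(epochs - 1.0, 0.0)
    return unique_data * (1.0 + R_D_STAR * (1.0 - math.exp(-repetitions / R_D_STAR)))


def predicted_loss(n_params: float, unique_data: float, epochs: float) -> float:
    """Chinchilla-style loss with the raw data term replaced by effective data."""
    d_eff = effective_data(unique_data, epochs)
    return E + A / n_params**ALPHA + B / d_eff**BETA


# At a fixed compute budget C ~ 6 * N * D, a smaller model trained for more
# epochs can beat a larger model trained briefly when unique data is scarce.
UNIQUE = 1e10                      # hypothetical unique data units available
BUDGET = 6 * 1.82e9 * UNIQUE       # budget giving the 1.82B model exactly one epoch
for n in (0.3e9, 0.9e9, 1.82e9):
    seen = BUDGET / (6 * n)        # total data units processed by a model of size n
    print(f"N={n:.2e}  epochs={seen / UNIQUE:4.1f}  "
          f"loss={predicted_loss(n, UNIQUE, seen / UNIQUE):.4f}")
```

With these placeholder coefficients the mid-sized model attains the lowest predicted loss: because effective data saturates as epochs accumulate, the optimum shifts toward smaller models trained longer, which is the resource-allocation pattern the abstract describes.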
Similar Papers
Foundation Models for Cross-Domain EEG Analysis Application: A Survey
Human-Computer Interaction
Organizes brain-reading AI for better understanding.
REVE: A Foundation Model for EEG -- Adapting to Any Setup with Large-Scale Pretraining on 25,000 Subjects
Machine Learning (CS)
Helps AI understand brain signals from different tests.