Symbiotic Brain-Machine Drawing via Visual Brain-Computer Interfaces
By: Gao Wang, Yingying Huang, Lars Muckli, and more
Potential Business Impact:
Draw pictures with your thoughts, then AI makes them real.
Brain-computer interfaces (BCIs) are evolving from research prototypes into clinical, assistive, and performance-enhancement technologies. Despite the rapid rise and promise of implantable technologies, there remains a need for more capable wearable, non-invasive approaches that also minimise hardware requirements. We present a non-invasive BCI for mind-drawing that iteratively infers a subject's internal visual intent by adaptively presenting visual stimuli (probes) on a screen, each encoded at a different flicker frequency, and analysing the resulting steady-state visual evoked potentials (SSVEPs). Gabor-inspired or machine-learned policies dynamically update the spatial placement of the probes to explore the image space, reconstructing simple imagined shapes in approximately two minutes or less using only single-channel EEG data. Additionally, by leveraging Stable Diffusion models, the reconstructed mental images can be transformed into realistic, detailed visual representations. Whilst we expect that similar results might be achievable with, e.g., eye-tracking techniques, our work shows that symbiotic human-AI interaction can increase BCI bit-rates by more than a factor of five, providing a platform for the future development of AI-augmented BCIs.
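The abstract doesn't include an implementation, but the closed-loop idea is concrete enough to sketch. The Python snippet below is a minimal, hypothetical illustration of the adaptive probing loop: each on-screen probe flickers at its own tag frequency, the single-channel EEG is scored for SSVEP power at each frequency, and a simple placement policy resamples probes around the best-scoring location. All names, frequencies, and the scoring and placement rules here are assumptions for illustration, not the authors' method (which uses Gabor-inspired or machine-learned policies).

```python
# Hypothetical sketch of the adaptive SSVEP probing loop; all parameters
# (sampling rate, flicker frequencies, placement rule) are illustrative
# assumptions, not the paper's actual implementation.
import numpy as np

FS = 250                          # assumed EEG sampling rate (Hz)
FREQS = [8.0, 10.0, 12.0, 15.0]   # assumed flicker tag frequencies (Hz)

def ssvep_power(eeg: np.ndarray, freq: float, fs: int = FS) -> float:
    """Spectral power of a single-channel EEG trace at a tag frequency."""
    windowed = eeg * np.hanning(len(eeg))
    spectrum = np.abs(np.fft.rfft(windowed)) ** 2
    bins = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    return float(spectrum[np.argmin(np.abs(bins - freq))])

def probing_round(eeg: np.ndarray, n_probes: int) -> list[float]:
    """Score each probe by SSVEP power at its flicker frequency: probes the
    subject attends to (on the imagined shape) evoke stronger responses."""
    return [ssvep_power(eeg, f) for f in FREQS[:n_probes]]

def update_probes(probes, scores, hits, step=0.1, rng=None):
    """Toy placement policy: record the best-scoring probe location as part
    of the drawing, then resample all probes near it to explore locally."""
    rng = rng or np.random.default_rng()
    best = np.array(probes[int(np.argmax(scores))])
    hits.append(tuple(best))
    return [tuple(np.clip(best + rng.normal(0.0, step, 2), 0.0, 1.0))
            for _ in probes]

# One simulated round: random noise stands in for a real EEG recording.
rng = np.random.default_rng(0)
probes = [(0.2, 0.2), (0.8, 0.2), (0.2, 0.8), (0.8, 0.8)]  # normalised x, y
hits: list[tuple[float, float]] = []
eeg = rng.normal(size=int(2.0 * FS))  # 2 s of stand-in single-channel EEG
probes = update_probes(probes, probing_round(eeg, len(probes)), hits, rng=rng)
```

In the system the paper describes, the coordinates accumulated over many such rounds form the reconstructed sketch, which is then passed through a Stable Diffusion model to render a detailed image; that refinement stage is omitted from this sketch.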
Similar Papers
High-Density EEG Enables the Fastest Visual Brain-Computer Interfaces
Human-Computer Interaction
Lets brains control computers much faster.
Functional connectivity guided deep neural network for decoding high-level visual imagery
Human-Computer Interaction
Lets you control robot arms with your thoughts.
Efficient Transformer-Integrated Deep Neural Architectures for Robust EEG Decoding of Complex Visual Imagery
Human-Computer Interaction
Lets people control robot arms with their thoughts.