Symbiotic Brain-Machine Drawing via Visual Brain-Computer Interfaces

Published: November 25, 2025 | arXiv ID: 2511.20835v1

By: Gao Wang, Yingying Huang, Lars Muckli, and more

Potential Business Impact:

Draw pictures with your thoughts, then AI makes them real.

Business Areas:
Intelligent Systems, Artificial Intelligence, Data and Analytics, Science and Engineering

Brain-computer interfaces (BCIs) are evolving from research prototypes into clinical, assistive, and performance-enhancement technologies. Despite the rapid rise and promise of implantable technologies, there is a need for more capable wearable and non-invasive approaches that also minimise hardware requirements. We present a non-invasive BCI for mind-drawing that iteratively infers a subject's internal visual intent by adaptively presenting visual stimuli (probes) on a screen, encoded at different flicker frequencies, and analysing the resulting steady-state visual evoked potentials (SSVEPs). Gabor-inspired or machine-learned policies dynamically update the spatial placement of the visual probes on the screen to explore the image space and reconstruct simple imagined shapes within approximately two minutes or less, using just single-channel EEG data. Additionally, by leveraging Stable Diffusion models, reconstructed mental images can be transformed into realistic and detailed visual representations. Whilst we expect that similar results might be achievable with, e.g., eye-tracking techniques, our work shows that symbiotic human-AI interaction can increase BCI bit-rates by more than a factor of five, providing a platform for the future development of AI-augmented BCIs.
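The abstract does not give implementation details, but the core SSVEP decoding step it describes can be illustrated with a minimal sketch: given single-channel EEG and a set of candidate probe flicker frequencies, pick the frequency with the strongest narrow-band power in the spectrum. The sampling rate, probe frequencies, and function names below are assumptions for illustration, not taken from the paper.

```python
# Minimal sketch of single-channel SSVEP probe decoding.
# Assumed values: 250 Hz sampling rate, four candidate flicker frequencies.
import numpy as np
from scipy.signal import welch

FS = 250.0                              # EEG sampling rate in Hz (assumed)
PROBE_FREQS = [8.0, 10.0, 12.0, 15.0]   # probe flicker frequencies in Hz (assumed)

def band_power(freqs, psd, target, bw=0.5):
    """Mean spectral power in a narrow band around a target frequency."""
    mask = (freqs >= target - bw) & (freqs <= target + bw)
    return psd[mask].mean()

def decode_attended_probe(eeg, fs=FS, probe_freqs=PROBE_FREQS):
    """Return the index of the probe whose flicker frequency dominates the EEG spectrum."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    scores = [band_power(freqs, psd, f) for f in probe_freqs]
    return int(np.argmax(scores)), scores

# Example with synthetic data: 4 s of noise plus a weak 12 Hz SSVEP component.
t = np.arange(0, 4, 1 / FS)
eeg = 0.5 * np.sin(2 * np.pi * 12.0 * t) + np.random.randn(t.size)
idx, scores = decode_attended_probe(eeg)
print("attended probe:", idx, "band powers:", np.round(scores, 3))
```

In the paper's closed-loop setting, the decoded probe would then inform how the adaptive (Gabor-inspired or machine-learned) policy repositions the next set of probes on screen; that outer loop is not shown here.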

Page Count
16 pages

Category
Quantitative Biology:
Neurons and Cognition