Control of a commercial vehicle by a tetraplegic human using a bimanual brain-computer interface
By: Xinyun Zou, Jorge Gamez, Meghna Menon, and more
Potential Business Impact:
Lets paralyzed people drive cars with their minds.
Brain-computer interfaces (BCIs) read neural signals directly from the brain to infer motor planning and execution. However, this technology has largely been confined to laboratory settings, with few real-world applications. We developed a bimanual BCI system to drive a vehicle in both simulated and real-world environments. We demonstrate that an individual with tetraplegia, implanted with intracortical BCI electrodes in the posterior parietal cortex (PPC) and the hand knob region of the motor cortex (MC), reacts at least as quickly and precisely as motor-intact participants and drives a simulated vehicle as proficiently as the same control group. This BCI participant, living in California, could also remotely drive a Ford Mustang Mach-E vehicle in Michigan. Our first teledriving task relied on cursor control for speed and steering in a closed urban test facility. The final BCI system added click control for full-stop braking, enabling bimanual cursor-and-click control both for simulated driving through a virtual town with traffic and for teledriving through a real-world obstacle course without traffic. We also demonstrate the safety and feasibility of BCI-controlled driving. This first-of-its-kind implantable BCI application not only highlights the versatility and innovative potential of BCIs but also points to a promising future for life-changing solutions that restore independence to people who have suffered catastrophic neurological injury.
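The abstract describes a control scheme in which a decoded cursor drives speed and steering while a decoded click triggers a full-stop brake. The sketch below illustrates one plausible mapping of that kind; the class names, signal ranges, and the proportional-braking rule are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch (not the paper's code): mapping a decoded 2-D cursor
# state plus a click signal to vehicle commands, mirroring the abstract's
# description of cursor control for speed/steering and click for full-stop braking.
from dataclasses import dataclass


@dataclass
class BCIDecoderOutput:
    """Assumed decoder outputs; names and ranges are illustrative."""
    cursor_x: float  # lateral cursor position in [-1, 1], mapped to steering
    cursor_y: float  # vertical cursor position in [-1, 1], mapped to speed
    click: bool      # decoded click, used here for full-stop braking


@dataclass
class VehicleCommand:
    steering: float  # normalized steering in [-1, 1]
    throttle: float  # normalized throttle in [0, 1]
    brake: float     # normalized brake in [0, 1]


def cursor_to_command(out: BCIDecoderOutput,
                      max_throttle: float = 0.5) -> VehicleCommand:
    """Translate bimanual cursor-and-click control into a drive command.

    A decoded click requests a full stop; otherwise cursor_y above neutral
    requests throttle and below neutral requests proportional braking.
    """
    if out.click:
        return VehicleCommand(steering=0.0, throttle=0.0, brake=1.0)
    steering = max(-1.0, min(1.0, out.cursor_x))
    if out.cursor_y >= 0.0:
        throttle = min(out.cursor_y, 1.0) * max_throttle
        brake = 0.0
    else:
        throttle = 0.0
        brake = min(-out.cursor_y, 1.0)
    return VehicleCommand(steering=steering, throttle=throttle, brake=brake)


# Example: a rightward cursor push with slight forward intent.
print(cursor_to_command(BCIDecoderOutput(cursor_x=0.4, cursor_y=0.3, click=False)))
```

Capping throttle (here at 0.5) and reserving the click for an unconditional full stop reflect the kind of safety-first design choices such a teledriving system would need; the actual decoding and vehicle interface in the paper are not specified in this abstract.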
Similar Papers
Real-Time Brain-Computer Interface Control of Walking Exoskeleton with Bilateral Sensory Feedback
Neurons and Cognition
Lets paralyzed people walk and feel again.
Intuitive control of supernumerary robotic limbs through a tactile-encoded neural interface
Robotics
Lets people control extra robot arms with their minds.
EEG-based AI-BCI Wheelchair Advancement: Hybrid Deep Learning with Motor Imagery for Brain Computer Interface
Machine Learning (CS)
Lets people move wheelchairs with their thoughts.