Recently, we discussed the Wadsworth Center's development of a minimally invasive, thin-film technology to enhance electrocorticography (ECoG) recordings (read). Similar to the more common electroencephalogram (EEG) method, which uses an array of electrodes stuck to your scalp to receive electrical signals from your neurons, ECoG uses an array of electrodes placed directly on the surface of your brain, allowing a more direct electrical view of neural activity. This view still captures an averaged signal from a large number of talking neurons and still cannot resolve individual electrical signals. However, with the bony skull out of the way, the electrodes sure have a clearer shot at picking up the electric fields.
The importance of this work from Wadsworth is that the brain and its violent bodyguard, the immune system, don't really like having things around that the body didn't make on its own. So, typical implanted devices are quickly degraded by the body's immune response. Here, the specialized implanted ECoG devices are lasting six to twelve months in human patients, and the group's goal is to extend the device lifetime to five to ten years.
Through a collaboration with clinical neurologists and biomedical engineers at Washington University in St. Louis, Missouri, the Wadsworth group, led by Gerwin Schalk, is taking the technology to the next step by integrating the recordings with specialized software that maps brain activity to computer control. Because the implanted ECoG provides a more detailed map of brain activity, a specific correlation can be observed between, for example, physically clicking a computer mouse button and the resulting pattern of neural firing in the brain. The patient can then train their thoughts to reproduce similar neural activity and, with a direct connection to the computer, the mouse click appears without the click.
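To make the mapping idea concrete, here is a minimal, purely illustrative sketch of how a "thought click" detector could work in principle: measure the power of a high-frequency ECoG band during labeled "click" versus "rest" periods, pick a threshold between them, and fire a software click when later activity crosses it. Everything here (the simulated signals, the 70-110 Hz band choice, the simple threshold rule) is an invented toy for illustration, not the actual Wadsworth/Neurolutions software, which is far more sophisticated.

```python
# Toy "thought click" detector. All signals and parameters are simulated;
# this only illustrates the calibrate-then-detect idea described above.
import numpy as np

FS = 1000          # sampling rate in Hz (assumed for this sketch)
EPOCH_SEC = 0.5    # length of one analysis window
rng = np.random.default_rng(0)

def band_power(signal, fs=FS, lo=70, hi=110):
    """Mean spectral power in a high-gamma band (70-110 Hz here)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

def epoch(click):
    """Simulate one ECoG epoch: background noise, plus a 90 Hz
    burst standing in for movement-related activity when 'clicking'."""
    t = np.arange(int(FS * EPOCH_SEC)) / FS
    x = rng.normal(0.0, 1.0, t.size)
    if click:
        x += 2.0 * np.sin(2 * np.pi * 90 * t)
    return x

# "Calibration": record labeled epochs while the patient physically clicks,
# then place the threshold midway between the two mean band powers.
rest_power = np.mean([band_power(epoch(False)) for _ in range(20)])
click_power = np.mean([band_power(epoch(True)) for _ in range(20)])
threshold = (rest_power + click_power) / 2.0

def detect_click(signal):
    """Trigger a software mouse click when band power crosses threshold."""
    return band_power(signal) > threshold
```

After calibration, `detect_click` would run continuously on fresh windows of the signal, issuing a click whenever the trained pattern reappears, which is the "mouse click without the click" described above.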
The interfacing process is being licensed to a St. Louis start-up called Neurolutions, which will be working to improve the software and training process and bring it to market for applications in neuroprosthetics. The challenge for further advancement begins with the unfortunate fact that just clicking a mouse button doesn't get us very far in life. Simply moving fingers and arms requires multi-dimensional spatial control, and with that comes an unknown number of different neural patterns needed just to raise your arm and reach the mouse on top of the desk. All of the corresponding neural activity (move shoulder up, rotate elbow, lift index finger, shift arm to the right, etc.) will need to be mapped, trained, and accessed to control a prosthetic device… and each human might have different neural patterns for the same physical motion.
“Reading the Surface of the Brain” :: Technology Review :: June 3, 2009 :: [ READ ]
“Brain-Computer Interface Technology Licensed to Missouri Firm” :: NY State Dept. of Health Press Release :: March 25, 2009 :: [ READ ]