Researchers have been working for decades to develop an interface that would allow the human brain to control prosthetic limbs.
José Luis Contreras-Vidal, PhD, a professor of electrical and computer engineering at the University of Houston (UH) Cullen College of Engineering, Texas, and director of the Laboratory of Non-Invasive Brain-Machine Interface Systems, has made strides in developing a brain-machine interface that is simpler than what was once predicted. He hopes to demonstrate a human walking with an exoskeleton controlled by this brain-machine interface within a matter of months.
Contreras-Vidal published his latest findings in this arena in the March 2012 special issue of IEEE Transactions on Neural Systems and Rehabilitation Engineering.
For years, according to Contreras-Vidal, most researchers believed that decoding movement intentions in the brain would require invasive technologies, such as electrodes implanted in the skull. “But we have been going against dogma for a long time,” said Contreras-Vidal, whose earlier research demonstrated that movement intentions related to the legs (such as walk, turn, and sit down) can be decoded with high accuracy through a skullcap of electrodes that records the brain’s electrical activity via electroencephalography (EEG).
Those earlier studies used 64 electrodes to decode movement intention. In his latest paper, Contreras-Vidal demonstrated the same capability with 12 electrodes. Such a reduction, Contreras-Vidal said, could allow for the creation of an interface system that is less intrusive and easier for the user to work with than previously thought.
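The core idea, decoding a discrete movement intention from a handful of EEG channels, can be sketched very loosely as a linear classifier over per-channel features. The sketch below is an illustrative assumption for demonstration only: the synthetic signal model, the two-class setup ("walk" vs. "stop"), and the least-squares decoder are not the method used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n_channels = 12  # reduced electrode count reported in the article
n_trials = 200   # synthetic trials per class (assumption)

# Two synthetic classes of movement intention, each a Gaussian cloud
# around a class-specific mean activity pattern across the 12 channels.
mean_walk = rng.normal(0.0, 1.0, n_channels)
mean_stop = rng.normal(0.0, 1.0, n_channels)
X = np.vstack([
    rng.normal(mean_walk, 0.5, (n_trials, n_channels)),
    rng.normal(mean_stop, 0.5, (n_trials, n_channels)),
])
y = np.concatenate([np.ones(n_trials), -np.ones(n_trials)])

# Least-squares linear decoder: find weights w (plus a bias term)
# minimizing ||Xw - y||^2, then classify by the sign of the output.
Xb = np.c_[X, np.ones(len(X))]
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
pred = np.sign(Xb @ w)
accuracy = (pred == y).mean()
print(f"decoding accuracy on training data: {accuracy:.2f}")
```

On well-separated synthetic data like this, even a simple linear decoder classifies nearly every trial correctly; the real challenge in EEG work is that genuine brain signals are far noisier and non-stationary.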
The paper also showed that the research group can decode not only a movement intention itself but also how the brain integrates sensory feedback, such as visual information and proprioception during walking, with the intention that preceded it. This advance should allow Contreras-Vidal and his colleagues to devise a brain-machine interface that more closely mimics the near-automatic responses able-bodied individuals make to the unexpected, such as stumbling.
“This is an important capability for any robotics system,” he said. “If something goes wrong, if the feedback is not expected, you need to be able to make quick corrections.”