Engineering researchers at four U.S. universities are embarking on a four-year project to design a prosthetic arm that amputees can control directly with their brains and that will allow the user to feel what the prosthetic arm touches. The researchers say much of the technology has already been proven in small-scale demonstrations.
The research at Rice University (Rice), Houston, Texas; the University of Michigan (U-M), Ann Arbor; Drexel University (Drexel), Philadelphia, Pennsylvania; and the University of Maryland (UMD), College Park, is made possible by a $1.2 million grant from the National Science Foundation’s (NSF) Human-Centered Computing program.

Harsha Agashe, a doctoral student in Contreras-Vidal’s lab, wears the brain cap, a noninvasive, sensor-lined cap with neural interface software. Photograph courtesy of John Consoli, University of Maryland.
Study co-investigator José L. Contreras-Vidal, PhD, associate professor in the UMD School of Public Health Kinesiology Department, Neural Engineering and Smart Prosthetics Research Lab, and his UMD team are developing “brain-cap” technology that taps into the user’s neural activity through a noninvasive cap of electrodes that reads electrical activity on the scalp via electroencephalography (EEG) and translates that activity into movement commands for computers and other devices. This technology could soon be used to control computers, robotic prosthetic limbs, motorized wheelchairs, and even digital avatars.
Contreras-Vidal, who is also an affiliate professor in UMD’s Fischell Department of Bioengineering and the Neuroscience and Cognitive Science Program, has previously demonstrated technology that allowed test subjects to move a cursor on a computer screen simply by thinking about it and has successfully used EEG brain signals to reconstruct the 3D movements of the ankle, knee, and hip joints during human treadmill walking. Using functional near-infrared (fNIR) technology developed by Drexel’s Optical Brain Imaging Laboratory, the team plans to combine the EEG information with real-time data about blood-oxygen levels in the user’s frontal lobe.
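The article does not describe the team’s decoding pipeline in detail, but noninvasive decoders of this kind are often built as regressions from a short window of lagged sensor samples to limb kinematics. The sketch below is a minimal illustration of that idea on synthetic data; the channel counts, lag window, ridge regression, and the simple concatenation of EEG and fNIR features are all assumptions made for illustration, not the UMD/Drexel system’s actual design.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

# Illustrative dimensions only (not the actual system configuration).
N_SAMPLES = 5000  # time points
N_EEG = 32        # EEG scalp channels
N_FNIR = 8        # fNIR blood-oxygenation channels (frontal lobe)
N_LAGS = 10       # past samples per EEG channel fed to the decoder
N_JOINTS = 3      # e.g., ankle, knee, and hip angles

rng = np.random.default_rng(0)

# Synthetic stand-ins for recorded signals and measured joint kinematics.
eeg = rng.standard_normal((N_SAMPLES, N_EEG))
fnir = rng.standard_normal((N_SAMPLES, N_FNIR))
kinematics = rng.standard_normal((N_SAMPLES, N_JOINTS))

def lagged_features(signal: np.ndarray, n_lags: int) -> np.ndarray:
    """Stack the current sample and n_lags - 1 past samples of every channel."""
    n, c = signal.shape
    out = np.zeros((n, c * n_lags))
    for lag in range(n_lags):
        out[lag:, lag * c:(lag + 1) * c] = signal[: n - lag]
    return out

# Fuse lagged EEG with the slower fNIR oxygenation signal into one feature vector.
X = np.hstack([lagged_features(eeg, N_LAGS), fnir])
y = kinematics

X_train, X_test, y_train, y_test = train_test_split(X, y, shuffle=False)

decoder = Ridge(alpha=1.0).fit(X_train, y_train)
predicted = decoder.predict(X_test)  # decoded joint trajectories -> movement commands
print("decoder R^2:", decoder.score(X_test, y_test))
```

On real recordings the signals would be filtered and time-aligned to movement before fitting; with the random data above, the reported R² hovers near zero, which is the expected chance level.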
“We want to provide intuitive control over contact tasks, and we’re also interested in strengthening the motor imagery the patients are using as they think about what they want their arm to do,” said study co-investigator Patricia A. Shewokis, PhD, professor, School of Biomedical Engineering, Science & Health Systems, College of Nursing and Health Professions, Drexel. “Ideally, this tactile or haptic feedback will improve the signal from the EEG and fNIR decoder and make it easier for patients to get their prosthetic arms to do exactly what they want them to do. We are moving toward incorporating the ‘brain in the loop’ for prosthetic use and control.”
There are other brain-computer interface (BCI) technologies under development, but Contreras-Vidal noted that these competing technologies are either invasive, requiring electrodes to be implanted directly in the brain, or, if noninvasive, require far more user training than UMD’s EEG-based brain-cap technology. According to a UMD press release, Contreras-Vidal is the only researcher to have demonstrated decoding results from noninvasive neural interfaces that are comparable to those achieved with implanted electrodes.
The four co-investigators on this project, who also include Marcia K. O’Malley, PhD, associate professor of mechanical engineering and materials science at Rice, and Brent Gillespie, PhD, associate professor of mechanical engineering at U-M, have previously demonstrated technology that allows amputees to correctly perceive and manipulate objects with a prosthetic gripper based on sensory feedback delivered in a natural way to the remaining portion of their limbs.
“The investigators on this grant have already demonstrated that much of this is possible,” O’Malley said. “What remains is to bring all of it (noninvasive neural decoding, direct brain control, and tactile sensory feedback) together into one device.”
The team plans to incorporate technology that relays both tactile information from the prosthetic fingertips and grasping-force information from the prosthetic hand to the user via a robotic exoskeleton and touchpads that vibrate, stretch, and squeeze the skin where the prosthesis attaches to the body.
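The article does not specify how sensor readings would be mapped onto the vibrating, stretching, and squeezing touchpads, but one simple version of such a mapping can be sketched as a per-actuator transfer function. Everything below, including the function names, sensor ranges, and scaling, is a hypothetical illustration rather than the team’s actual interface.

```python
from dataclasses import dataclass

@dataclass
class HapticCommand:
    """Drive levels for the skin-contact touchpads, each normalized to [0, 1]."""
    vibration: float  # vibrotactile amplitude (texture / contact events)
    stretch: float    # tangential skin stretch (shear at the fingertip)
    squeeze: float    # normal squeeze (overall grasping force)

def clamp(x: float) -> float:
    """Keep a drive level within the actuator's normalized range."""
    return max(0.0, min(1.0, x))

def map_sensors_to_haptics(fingertip_pressure_kpa: float,
                           shear_force_n: float,
                           grip_force_n: float) -> HapticCommand:
    """Map prosthetic-hand sensor readings onto touchpad drive levels.

    The full-scale values (50 kPa, 5 N, 40 N) are assumed sensor ranges
    chosen purely for illustration.
    """
    return HapticCommand(
        vibration=clamp(fingertip_pressure_kpa / 50.0),
        stretch=clamp(shear_force_n / 5.0),
        squeeze=clamp(grip_force_n / 40.0),
    )

# Example: a light pinch on a firm object.
cmd = map_sensors_to_haptics(fingertip_pressure_kpa=12.0,
                             shear_force_n=0.8,
                             grip_force_n=10.0)
print(cmd)  # prints the normalized drive level for each touchpad
```

A linear mapping like this is only a starting point; in practice such channels are often shaped to match the skin’s sensitivity at each actuator site.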
O’Malley said the new technology is a big leap over what is being used in existing prosthetic devices, which don’t allow amputees to feel what they touch. Some state-of-the-art prostheses today use force-feedback systems that vibrate, much like the vibrate mode on a mobile phone, to provide limited information about objects a prosthetic hand is gripping. However, the prosthesis user must also rely on visual feedback to determine whether the object is soft or hard and how tightly they are grasping it.
“Sensory feedback, especially haptic feedback, is often overlooked, but we think it’s the key to closing the loop between the brain and motorized prosthetic devices,” Gillespie said. “These results indicate that we stand a very good chance to help amputees and also help others who may be suffering from motor impairments.”
Editor’s note: This story has been adapted from materials provided by Rice University and the University of Maryland.