José del R. Millán, PhD, a researcher at the Swiss Federal Institute of Technology, Lausanne, Switzerland, gave a presentation at the Cognitive Neuroscience Society conference in San Francisco on March 30 titled “The Rise of Neuroprosthetics: The Perception-Action Closed Loop,” which outlined the work he and his colleagues are undertaking on brain-computer interfaces (BCIs). Millán began his career designing autonomous robots that could learn from their own experiences. He then became interested in having these robots help people with disabilities in a “very natural, direct, and intuitive way,” he said. This interest led him to design devices that use people’s own brain activity to restore hand grasping and locomotion or provide mobility via wheelchairs or telepresence robots.
In their latest work, Millán and his colleagues tested a variety of brain-controlled devices on people with motor disabilities, in some cases quite severe ones. The participants successfully completed tasks ranging from writing to navigation at levels of performance comparable to those of healthy control groups. The individuals operated the devices by voluntarily and spontaneously modulating their EEG activity to deliver commands. EEG has the benefit that it can be recorded noninvasively through electrodes on the scalp, rather than requiring surgery or sophisticated machinery. “It also provides a global picture of our brain patterns, which is necessary to decode the variety of neural correlates we want to exploit,” Millán explained.
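The command-delivery scheme described here, voluntary modulation of an EEG rhythm decoded into a discrete command, can be sketched in toy form. Everything below (the mu band, sampling rate, threshold, and the rest/move mapping) is an illustrative assumption, not a detail of Millán’s actual decoder:

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Average spectral power of `signal` within [lo, hi] Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

def decode_command(window, fs=256, threshold=1000.0):
    """Map a one-second EEG window to a discrete command.

    Strong mu-band (8-12 Hz) power is read as "rest"; suppressed
    power (as during motor imagery) is read as "move".
    """
    return "rest" if band_power(window, fs, 8, 12) > threshold else "move"

# Synthetic demo: a strong 10 Hz oscillation vs. low-amplitude noise.
fs = 256
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
idle = 50 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, fs)
imagery = rng.normal(0, 1, fs)
print(decode_command(idle, fs))     # strong mu rhythm -> "rest"
print(decode_command(imagery, fs))  # suppressed mu    -> "move"
```

Real BCIs rely on per-user training, spatial filtering, and statistical classifiers rather than a single fixed threshold; the sketch only shows the core idea that a deliberate change in band power can carry a command.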
The participants needed a training period of no more than nine sessions before being able to operate the devices. And those using telepresence robots were able to successfully navigate environments they had never visited. Key to their success, Millán said, was the concept of shared control: using the robots’ sensory capabilities to interpret the users’ commands in context.
The BCI decodes users’ intentions and decisions mainly from activity in the cerebral cortex. But Millán noted that many elements of skilled movement are handled in the brainstem and spinal cord. By designing the intelligent device to control the lower-level movements in concert with the higher-level brain activity decoded by the BCI, the neuroprostheses come closer to natural motor control. “We aim to interact with these neuroprostheses as if they were our new body, using the very same neural signals and principles that control our muscles,” Millán said.
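The hierarchical shared-control idea, in which the user supplies high-level intent and the device handles low-level corrections from its own sensors, can be sketched as a simple velocity-blending rule. The sensor layout, blending weights, and avoidance behavior below are hypothetical illustrations, not Millán’s design:

```python
def shared_control(user_cmd, obstacle_dists, safe_dist=1.0):
    """Blend a coarse user command with the robot's own sensing.

    user_cmd: (forward, turn) velocities decoded from the user.
    obstacle_dists: rangefinder distances for [left, front, right].
    The closer the nearest obstacle, the more authority shifts from
    the user's command to the robot's avoidance behavior.
    """
    left, front, right = obstacle_dists
    nearest = min(obstacle_dists)
    # Robot's authority grows smoothly from 0 (path clear) to 1 (contact).
    alpha = max(0.0, min(1.0, 1.0 - nearest / safe_dist))
    # Avoidance behavior: slow down near a frontal obstacle,
    # steer toward whichever side is clearer.
    avoid_forward = min(front / safe_dist, 1.0) * user_cmd[0]
    avoid_turn = (left - right) / safe_dist
    forward = (1 - alpha) * user_cmd[0] + alpha * avoid_forward
    turn = (1 - alpha) * user_cmd[1] + alpha * avoid_turn
    return forward, turn

# Open space: the user's command passes through unchanged.
print(shared_control((1.0, 0.0), [5.0, 5.0, 5.0]))  # -> (1.0, 0.0)
# Obstacle close on the right: the robot slows and steers left.
print(shared_control((1.0, 0.0), [2.0, 0.8, 0.3]))
```

The blending weight `alpha` is what reduces cognitive workload: the user need only issue coarse, intermittent commands, while moment-to-moment corrections come from the device’s sensors.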
As an example of these new types of neuroprostheses, Millán pointed to a brain-controlled wheelchair that he and his colleagues designed and described in a paper published last year. Users can drive it reliably and safely over long periods of time because the shared control system reduces their cognitive workload. The wheelchair is currently being evaluated to make sure it will work under daily-life conditions for a large number of people with motor disabilities.
Two of the biggest challenges for neuroprosthetics are finding new physical interfaces, beyond EEG, that can operate reliably over long periods of time, and providing users with rich sensory feedback. “The third major challenge is…[that] we must decode and integrate in the prosthetic control loop information about perceptual cognitive processes of the user that are crucial for volitional interaction,” Millán said. These processes include awareness of errors made by the device, anticipation of critical decision points, and lapses of attention.
“Future neuroprostheses (robots and exoskeletons controlled via a BCI) will be tightly coupled with the user in such a way that the resulting system can replace and restore impaired limb functions, because it will be controlled by the same neural signals as their natural counterparts,” Millán said.
Editor’s note: This story was adapted from material written by Lisa M.P. Munoz with the Cognitive Neuroscience Society.