
By taking advantage of the brain's plasticity, patients may be able to integrate neuroprostheses as part of their body image. Image courtesy of the laboratory of Miguel Nicolelis, MD, PhD, Duke University.
The brain’s tactile and motor neurons, which perceive touch and control movement, may also respond to visual cues, according to researchers at Duke University School of Medicine (Duke Medicine), Durham, North Carolina. The study in monkeys, which was published online August 26 in the journal Proceedings of the National Academy of Sciences, provides new information on how different areas of the brain may work together in continuously shaping the brain’s internal image of the body, also known as the body schema. The findings have implications for individuals with paralysis using neuroprosthetic limbs.
"The study shows for the first time that the somatosensory or touch cortex may be influenced by vision, which goes against everything written in neuroscience textbooks," said senior author Miguel Nicolelis, MD, PhD, professor of neurobiology, biomedical engineering, and psychology and neuroscience, and codirector of the Center for Neuroengineering at Duke Medicine. "The findings support our theory that the cortex isn't strictly segregated into areas dealing with one function alone, like touch or vision."
Earlier research has shown that the brain maintains an internal spatial image of the body, which is continuously updated based on touch, pain, temperature, and pressure (the somatosensory system) received from the skin, joints, and muscles, as well as on visual and auditory signals. An example of this dynamic process is the "rubber hand illusion," a phenomenon in which people develop a sense of ownership of an artificial hand when they view it being touched at the same time that something touches their own hand.
In an effort to find a physiological explanation for the "rubber hand illusion," Duke researchers focused on brain activity in the somatosensory and motor cortices of monkeys. These two areas of the brain do not directly receive visual input, but previous work in rats, conducted at the Edmond and Lily Safra International Institute of Neuroscience of Natal, Brazil, suggested that the somatosensory cortex could respond to visual cues. In the Duke experiment, two monkeys observed a realistic, computer-generated image of a monkey arm on a screen being touched by a virtual ball. At the same time, the monkeys' own arms were touched, triggering a response in their somatosensory and motor cortical areas. The monkeys then observed the ball touching the virtual arm without anything physically touching their own arms. Within minutes, the researchers saw neurons in the somatosensory and motor cortical areas begin to respond to the touching of the virtual arm alone.
The responses to virtual touch occurred 50 to 70 milliseconds later than the responses to physical touch, which is consistent with the timing of the pathways linking the areas of the brain responsible for processing visual input to the somatosensory and motor cortices. Demonstrating that somatosensory and motor cortical neurons can respond to visual stimuli suggests that cross-functional processing occurs throughout the primate cortex as a highly distributed and dynamic process.
"These findings support our notion that the brain works like a grid or network that is continuously interacting," Nicolelis said. "As we become proficient in using tools (a violin, tennis racquet, computer mouse, or prosthetic limb), our brain is likely changing its internal image of our bodies to incorporate the tools as extensions of ourselves."
Editor’s note: This story was adapted from materials provided by Duke Medicine.