
Forney, left, and Vafaei demonstrate the EEG collection process. Photograph by Kevin Olson, courtesy of CSU.
Researchers with the Brain-Computer Interfaces (BCI) Laboratory at Colorado State University (CSU) are developing computer software that learns to make accurate predictions from the patterns of neural activity people produce when they perform, or imagine performing, a specific task. The activity is recorded by electrodes placed on the scalp. Elliott Forney, MSc, and Fereydoon Vafaei, MSc, both doctoral students in computer science, say the software could one day allow individuals with loss of muscle control to operate assistive robots or robotic arms. The long-term goal of the research, the Rocky Mountain Collegian reported, is to create systems that make decisions and adapt in real time for people who are unable to perform basic activities of daily living.
The researchers collect and mine EEG data for patterns of neural activity that distinguish between tasks, such as moving a wheelchair or adjusting a thermostat. The software also learns from its past mistakes to make more accurate predictions. “If the system can identify what it is doing incorrectly, it can learn not to do that in the future,” Forney told the Collegian.
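The article does not describe the lab's actual algorithms, but the general idea of mining EEG recordings for task-distinguishing patterns can be sketched with a toy example. The sketch below is purely illustrative: it simulates two "mental tasks" as noisy signals with different dominant frequencies, extracts band-power features with an FFT, and classifies new trials with a simple nearest-centroid rule. The sampling rate, frequency bands, and classifier are all assumptions, not details from the CSU system.

```python
# Illustrative sketch only -- NOT the CSU BCI Lab's pipeline.
# Two synthetic "mental tasks" are simulated as sinusoids at different
# frequencies buried in noise; band power serves as the feature.
import numpy as np

rng = np.random.default_rng(0)
FS = 128   # assumed sampling rate, Hz
N = 256    # samples per trial (2 s)

def trial(freq_hz):
    """One synthetic 'EEG' trial: a sinusoid at freq_hz plus noise."""
    t = np.arange(N) / FS
    return np.sin(2 * np.pi * freq_hz * t) + rng.normal(0.0, 1.0, N)

def bandpower(x, lo, hi):
    """Mean spectral power of x between lo and hi Hz."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), 1 / FS)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

def features(x):
    # Two illustrative features: 8-12 Hz and 18-26 Hz band power.
    return np.array([bandpower(x, 8, 12), bandpower(x, 18, 26)])

# Training trials: "task A" peaks near 10 Hz, "task B" near 22 Hz.
A = np.array([features(trial(10)) for _ in range(40)])
B = np.array([features(trial(22)) for _ in range(40)])

# Nearest-centroid classifier: store the mean feature vector per task.
centroids = {"A": A.mean(axis=0), "B": B.mean(axis=0)}

def predict(x):
    f = features(x)
    return min(centroids, key=lambda k: np.linalg.norm(f - centroids[k]))

print(predict(trial(10)))  # a 10 Hz trial should classify as "A"
print(predict(trial(22)))  # a 22 Hz trial should classify as "B"
```

Real EEG classification is far harder than this toy (weak signals, artifacts, non-stationarity), but the pipeline shape, extract features, learn a decision rule, predict on new trials, is the same.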
Vafaei demonstrated the system for the Collegian by wearing an EEG signal-capturing cap and thinking about closing his left or right fist. Based on the pattern of his neural activity, the software predicted which fist he wanted to close. If the computer selected the wrong fist, he altered how he performed the mental task to help the computer make a more accurate prediction. “The simultaneous learning between the computer and the person is going to be necessary (for real-world application),” said Charles Anderson, PhD, professor in the Department of Computer Science and researcher in the BCI Lab.
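The machine side of that simultaneous learning, updating only when a prediction turns out to be wrong, can be illustrated with a classic mistake-driven learner. The perceptron sketch below is a stand-in, not the lab's method: the 2-D "features" and the left/right-fist labels are simulated, and the point is only that the error count stays finite because each mistake nudges the decision boundary away from repeating it.

```python
# Illustrative sketch of mistake-driven learning -- NOT the CSU system.
# A perceptron updates its weights only when its guess is wrong.
import numpy as np

rng = np.random.default_rng(1)
w = np.zeros(3)  # weights for [feature1, feature2, bias]

def predict(f):
    """+1 = 'left fist', -1 = 'right fist' (labels are illustrative)."""
    return 1 if w @ np.append(f, 1.0) >= 0 else -1

mistakes = []
for step in range(200):
    label = rng.choice([1, -1])
    # Simulated separable features: the class shifts the mean.
    f = rng.normal(loc=label * 1.5, scale=0.5, size=2)
    if predict(f) != label:
        # Mistake-driven update: move the boundary away from this error.
        w += label * np.append(f, 1.0)
        mistakes.append(step)

# On separable data the perceptron stops making mistakes after a
# handful of corrections -- it "learns not to do that in the future."
print("total mistakes:", len(mistakes))
```

In a real co-adaptive BCI the human is adapting at the same time, as Vafaei did by altering his mental strategy, which is what makes the problem, and Anderson's point, harder than this one-sided sketch suggests.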