New prosthetic interfaces will be tested through virtual reality and driving simulations.
Image credit: Dharmesh Patel/Texas A&M Engineering.
A research team led by Maryam Zahabi, PhD, assistant professor in the Department of Industrial and Systems Engineering at Texas A&M University, is studying machine-learning algorithms and computational models to provide insight into the cognitive load placed on users of pattern-recognition prosthetic devices and to improve the devices' interfaces.
The researchers are studying electromyography (EMG)-based human-machine prosthetic interfaces, which translate electrical signals from the user's muscles into commands that move the device. Testing interface prototypes through virtual reality and driving simulations will allow the researchers to provide guidance to the engineers designing these interfaces.
“Currently there is very little guidance on which features in EMG-based human-machine interfaces are helpful in reducing the cognitive load of patients while performing different tasks,” Zahabi said.
The research is a collaboration between Texas A&M, North Carolina State University and the University of Florida and is supported by the National Science Foundation.
Editor’s note: This story was adapted from materials provided by Texas A&M University.
[Source/Image: New algorithms improve prosthetics for upper limb amputees]