An assistant professor of computer science at Memorial University in Canada has been awarded CA$237,750 (about US$176,000) to create an artificial intelligence interface for controlling prosthetic hands. The Government of Canada's New Frontiers in Research Fund (NFRF) awarded the funding to Xianta Jiang, PhD, of the university's Faculty of Engineering and Applied Science, to develop a control system that would allow people to operate an artificial limb as easily as an intact hand, without requiring targeted muscle reinnervation surgery.
The control interface will work with both commercially available and customized prosthetic hands. The research team hopes the project will decrease the rejection rate of prosthetic hands.
“Current state-of-the-art, noninvasive prosthesis control systems use pattern recognition techniques driven by surface muscle signals,” said Jiang. “This requires the user to carefully exert distinct muscle signal patterns to perform different gestures. However, in real-life situations, people with intact hands rarely have to think about their hand gestures when grabbing an object; instead, the fingers and the hand are naturally configured to the proper posture when the hand reaches and touches a target object.”
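The pattern-recognition approach Jiang describes can be illustrated with a minimal sketch: raw surface electromyography (sEMG) is cut into short windows, a simple amplitude feature (root-mean-square per channel) is extracted, and a classifier maps the feature vector to a gesture. The nearest-centroid classifier and all names below are illustrative assumptions, not the project's actual method.

```python
import numpy as np

def rms_features(window):
    # window: (samples, channels) array of raw surface EMG.
    # RMS amplitude per channel is a classic, simple EMG feature.
    return np.sqrt(np.mean(np.square(window), axis=0))

class NearestCentroidEMG:
    """Toy gesture classifier: one mean RMS feature vector per gesture."""

    def fit(self, windows, labels):
        feats = np.array([rms_features(w) for w in windows])
        self.labels_ = sorted(set(labels))
        self.centroids_ = np.array(
            [feats[[l == lab for l in labels]].mean(axis=0)
             for lab in self.labels_]
        )
        return self

    def predict(self, window):
        # Assign the window to the gesture with the closest centroid.
        dists = np.linalg.norm(self.centroids_ - rms_features(window), axis=1)
        return self.labels_[int(np.argmin(dists))]
```

With synthetic two-channel signals where each gesture activates a different channel, the classifier separates a "fist" window from an "open" window purely from RMS amplitude, which is exactly the burden on the user that Jiang's quote highlights: each gesture must produce a distinct muscle-signal pattern.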
The bio-inspired interface adds miniature cameras and tactile sensors to the prosthetic hands.
“These additions will enable the robotic hand to ‘see’ the target and ‘feel’ the environment and the object during the reach-and-grasp process, and automatically drive the hand towards grasping with little control effort from the user,” said Jiang. “The amputees only need to decide whether to proceed or retrieve the robotic hand.”
The project is interdisciplinary, involving researchers in mechanical, electrical, and computer engineering; computer science; rehabilitation science; kinesiology; and psychology.
The goals are to enable prosthetic hands with vision and haptic functions using computer vision and tactile sensing techniques; explore the best prosthetic hand control strategies to achieve high-accuracy movement with minimum control effort from the user; and develop an easy and natural prosthetic hand control interface by fusing multiple inputs from computer vision, touch sensing, and muscle signals.
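The third goal, fusing vision, touch, and muscle signals into one control interface, can be sketched as a simple decision rule: vision drives the reach, touch confirms contact, and the muscle signal supplies only a scalar "proceed" intent. Every field and threshold below is a hypothetical placeholder, a sketch of the fusion idea rather than the project's design.

```python
from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    object_in_view: bool     # from the miniature camera (hypothetical field)
    fingertip_contact: bool  # from the tactile sensors (hypothetical field)
    emg_intent: float        # 0..1 "proceed" signal derived from muscle activity

def grasp_command(s: SensorSnapshot, intent_threshold: float = 0.5) -> str:
    """Toy fusion rule: close the hand only when vision sees a target,
    touch confirms contact, and the user signals intent to proceed."""
    if not s.object_in_view:
        return "reach"       # keep moving toward a target
    if not s.fingertip_contact:
        return "preshape"    # configure fingers before contact
    return "close" if s.emg_intent >= intent_threshold else "hold"
```

Note how the user's muscle signal only gates the final close-or-hold choice, mirroring the article's point that the amputee merely decides whether to proceed while vision and touch handle the posture automatically.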
Editor’s note: This story was adapted from materials provided by Memorial University.