Researchers at the Fraunhofer Institute for Biomedical Engineering IBMT, a German research institute for biomedical engineering and biotechnology, are working to improve control of prosthetic hands down to the level of individual fingers. The project is run by the SOMA consortium, which brings together seven partners from five countries.
Instead of conventional electrodes that detect nerve impulses in muscle tissue in the arm, the prosthesis relies on ultrasonic sensors, which means commands can be executed with far greater accuracy and sensitivity. In the next stage, researchers want to make the design bidirectional, with the brain also receiving sensory stimuli from the prosthesis.
The SOMA project (Ultrasound peripheral interface and in-vitro model of human somatosensory system and muscles for motor decoding and restoration of somatic sensations in amputees), still in the laboratory phase, is using ultrasonic sensors that continuously send sound pulses into the muscle tissue in the forearm. Unlike electrical impulses, sound waves are reflected by tissue. The time taken for the reflected signals to propagate provides information about the physical depth of the muscle strand that is reflecting the respective sound wave. This allows contractions in the muscle tissue triggered by nerve stimuli in the brain to be studied in great detail. This in turn means typical activation patterns in the muscle, ones that represent specific hand or finger movements, can be identified.
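The depth measurement described above follows the standard pulse-echo principle: the round-trip travel time of a reflected pulse, together with the speed of sound in tissue, yields the depth of the reflecting structure. The following sketch illustrates that relationship; the speed-of-sound value is a textbook figure for soft tissue and the echo time is an invented example, not SOMA project data.

```python
# Sketch of pulse-echo depth estimation, the principle behind the sensors.
# Assumption: average speed of sound in soft tissue ~1540 m/s (textbook value);
# the echo time below is an invented illustrative number, not SOMA data.

SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, typical average for soft tissue

def echo_depth(round_trip_time_s: float) -> float:
    """Depth of the reflecting muscle strand from a pulse-echo round-trip time.

    The pulse travels to the reflector and back, so the one-way depth is
    half the total path length covered in the measured time.
    """
    return SPEED_OF_SOUND_TISSUE * round_trip_time_s / 2.0

# An echo arriving 26 microseconds after the pulse corresponds to a
# structure roughly 2 cm below the transducer:
depth_m = echo_depth(26e-6)
print(f"{depth_m * 100:.1f} cm")  # → 2.0 cm
```

Because each muscle strand sits at a different depth, echoes from different strands arrive at different times, which is what lets the system tell their contractions apart.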
The aim of the project is for AI-based software to take over this identification task.
Ultrasonic transducers and their driving electronics generate the sound pulses and decode the waves that are reflected back. The data is then passed to a personal computer, where the AI analyzes it. The electronics then send the decoded signals as commands to the actuators in the prosthetic hand, triggering movement of the prosthetic fingers. Control commands are detected, analyzed, and transmitted in real time.
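The flow from echo data to finger movement can be sketched as below. All names here (`ActuatorCommand`, `decode_movement`, the toy classifier) are hypothetical illustrations of the described pipeline, not SOMA software interfaces.

```python
# Hypothetical sketch of the described pipeline:
# ultrasound echo features -> AI decoding -> actuator command.
# Names and values are invented for illustration.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ActuatorCommand:
    finger: str       # which prosthetic finger to drive
    movement: str     # e.g. "flex", "extend", "rest"
    intensity: float  # normalized drive level, 0.0 .. 1.0

def decode_movement(echo_features: List[float],
                    classifier: Callable[[List[float]], ActuatorCommand]) -> ActuatorCommand:
    """Pass one frame of echo-derived features to the trained classifier."""
    return classifier(echo_features)

# A trivial stand-in for the AI: strong activity in the first feature
# channel is interpreted as flexing the index finger.
def toy_classifier(features: List[float]) -> ActuatorCommand:
    movement = "flex" if features[0] > 0.5 else "rest"
    return ActuatorCommand("index", movement, min(1.0, features[0]))

cmd = decode_movement([0.8, 0.1, 0.05], toy_classifier)
print(cmd)  # ActuatorCommand(finger='index', movement='flex', intensity=0.8)
```

In the real system this loop runs continuously, so each incoming frame of echo data produces an updated command in real time.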
“The ultrasonic-based control acts with greater sensitivity and accuracy than would be possible with electrodes. The sensors are able to detect varying degrees of freedom such as flexing, extending, or rotating,” said Marc Fournelle, PhD, head of the Sensors & Actuators group at Fraunhofer IBMT, who is responsible for developing SOMA ultrasonic sensors within the project.
To link the muscle signals correctly with the right finger and desired movement, subjects complete a short training session in which they try to move various parts of the hand and fingers. The activation patterns generated in this way are stored in the system as a base reference, so that each pattern can be matched to the corresponding finger or part of the hand and the desired movement. Training takes just a few minutes.
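The idea of matching new activity against stored reference patterns can be illustrated with a minimal nearest-neighbor matcher: each movement recorded during training contributes one reference pattern, and a new pattern is assigned the label of the closest stored reference. The patterns, labels, and the distance-based matching are invented for illustration; the actual system uses AI-based decoding.

```python
# Minimal nearest-neighbor sketch of the calibration idea:
# store one reference activation pattern per intended movement, then
# label a new pattern by its closest reference (Euclidean distance).
# Patterns and labels are toy values, not SOMA data.
import math
from typing import Dict, List

def nearest_movement(pattern: List[float],
                     references: Dict[str, List[float]]) -> str:
    """Return the movement label whose stored reference pattern is closest."""
    def dist(a: List[float], b: List[float]) -> float:
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(references, key=lambda label: dist(pattern, references[label]))

# References recorded during a short training session (invented values):
references = {
    "index_flex":   [0.9, 0.1, 0.2],
    "thumb_rotate": [0.2, 0.8, 0.3],
    "hand_open":    [0.1, 0.2, 0.9],
}

print(nearest_movement([0.85, 0.15, 0.25], references))  # → index_flex
```

A matcher this simple also shows why the training session can be short: only one representative pattern per movement is strictly needed to establish the link.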
“Trials on test subjects have shown that the technology works. It is very easy to use and noninvasive. We are now working on making the system even more inconspicuous,” said Andreas Schneider-Ickert, project manager in the Active Implants unit and innovation manager at Fraunhofer IBMT.
In the next stage, researchers want to improve the temporal resolution of the sensors further and make the electronics smaller so that the prosthesis can be controlled even more accurately and comfortably. The sensor bracelet will be hidden away in the cuff of the prosthetic hand.
With everyday usability in mind, the researchers said, the AI and control software may one day be integrated into a smartphone: after being decoded by the electronics box, signals could be transmitted to the smartphone and back via Bluetooth.
For the planned bidirectional version, sensory feedback could be delivered via electrodes implanted in or onto nerves. These electrodes would transmit signals sent by the prosthesis to the brain as a sensory stimulus, in the form of specific nerve stimulation. The person’s brain would thus receive feedback from the artificial hand and could send back commands that, for example, tighten or loosen the fingers. The wearer cannot feel the electrode, which is made of biocompatible material and implanted in the nerve tissue.
“This means a closed loop is set up, where the brain and prosthetic hand are communicating with each other constantly and in real time,” said Fournelle.