<table class="clsTableCaption k-table" style="float: right;"> <tbody> <tr> <td><img src="https://opedge.com/Content/OldArticles/images/2010-04_02/04-02_01.jpg" alt="" /></td> </tr> <tr> <td>Robin af Ekenstam experiences a sense of touch in his phantom hand while using the sensor-packed SmartHand. <em>SmartHand photos courtesy of YourIs.com</em></td> </tr> </tbody> </table> <strong>November 2009:</strong> Cobbled onto Robin af Ekenstam's right arm is a crude-looking hand prototype bristling with exposed motors, loopy wiring, and aluminum fingers that appear to be capped with pencil erasers. As the young Swede takes the hand through a halting regimen of myoelectric-controlled movements, it emits a distinct mechatronic whine. The prototype's parts don't look particularly well tethered together, and the whole thing could well belong to a defunct Terminator whose skin had seen the wrong end of John Connor's flamethrower. However, this whirring assemblage is not only working in tuned harmony, it's coordinating with both an artificial-intelligence neural net and Ekenstam's own nervous system to give the Swedish amputee an experience he hasn't had in more than two years: feeling in his right hand. "When I grab something tightly, then I can feel it in the fingertips, which is strange because I don't have them anymore," Ekenstam told a BBC film crew in November 2009. "It's amazing." This ultramodern dance of seemingly disparate parts—electronic, biologic, and digital—is the result of a far larger act of cooperation, one that is helping to create a future in which prosthetics restore not only function and appearance, but also realistic sensation. <h2><span style="font-size: 14pt;">27 Countries, $93 Billion</span></h2> <strong>Today:</strong> The European Union (EU), whose funding agencies made Ekenstam's test experience with the prototype possible, is a research powerhouse. 
Though its scientists are spread out among 27 nations, speaking 23 different languages, they have joined forces in collaborative groups since 1984 to work under EU research initiatives called the Framework Programs (FPs). These multiyear funding pushes set research goals for the international scientific community and provide the funds to carry them out. The 6th Framework Program (FP6) began in 2002 and ended in 2006, with collaborating teams using more than $23 billion in FP6 funds. From FP7 onward, the programs last seven years each, so FP7 began in 2007 and will end in 2013. More than $70 billion in research money is budgeted for cooperative research during that period. This article explores just a few of the fruits of this cooperation—projects in various stages whose proponents came together from around the continent to develop vanguard capacities for prosthetic sensitivity and control. <h2><span style="font-size: 14pt;">Why Pursue a Prosthetic Sense of Touch?</span></h2> A team led by Maria Carrozza, PhD, stated in a 2009 IEEE annual conference presentation, "The design and development of a prosthetic artificial hand should aim as much as possible at replacing both functionality and cosmetic appearance of the natural hand lost by the amputee. Surveys on using commercial prosthetic hands reveal that 30 to 50 percent of upper-extremity amputees do not use their prosthetic hand regularly. The main factors for this are low functionality and controllability, poor cosmetic appearance, and an unnatural control system, which make the hand to be felt as an external device that is not part of the subject's body scheme." According to research by a cohort of EU scientists including Alan M. 
Wing, professor of human movement at the University of Birmingham, England, sense of touch influences not only our sense of ownership of our own body parts but also an as-yet-uncounted number of fundamental physical functions, including balance and the process of moving in harmony with our physical surroundings. Might wearers be more likely to benefit from an upper-limb prosthesis if their device replaced a fuller complement of functions, including not only functional movement and functional sensations, such as pressure and heat, but also sensuality—the pleasure of physical touch? FP6 and FP7 researchers across Europe have been working to find out. <h2><span style="font-size: 14pt;">CyberHand</span></h2> <table class="clsTableCaption k-table" style="float: right;"> <tbody> <tr> <td><img src="https://opedge.com/Content/OldArticles/images/2010-04_02/04-02_02.jpg" alt="" /></td> </tr> <tr> <td>The Prensilia open hand-platform is custom-built and available to researchers. <em>Photograph courtesy of Prensilia.</em></td> </tr> </tbody> </table> Many of the ventures toward this knowledge derive from the EU-funded CyberHand project. The CyberHand is a fully articulated, nonwearable, experimental hand prototype. Carrozza, a CyberHand project researcher, described it in her September 2009 IEEE presentation by saying, "The CyberHand, developed by ARTS Lab, Scuola Superiore Sant'Anna, Italy, can be defined as a stand-alone prosthetic-hand open platform. It is composed of five underactuated anthropomorphic fingers...which are actuated by six DC motors. Five of them are employed for the flexion/extension of the fingers, and a further one drives the thumb opposition and is housed inside the palm. Since each finger is composed of three phalanxes, the mechanical architecture presents 16 DOF [degrees of freedom] actuated by six degrees of motion (DOM).... When the hand closes, it is able to wrap over objects, thus obtaining grasps with a high number of contact points." 
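The underactuated "wrap" behavior Carrozza describes, in which a single motor drives several finger joints and each joint stops where it meets the object, can be illustrated with a toy control loop. This is a hypothetical sketch for intuition only, not the CyberHand's actual control code; the joint angles and step sizes are invented.

```python
# Toy model of an underactuated finger: one motor flexes three phalanx
# joints through a shared drive; a joint that contacts the object stops
# while the remaining joints keep curling, so the finger wraps the shape.
# Hypothetical illustration -- not the CyberHand's real controller.

def wrap_finger(contact_at, max_angle=90, step=5):
    """contact_at: per-joint angle (degrees) at which that joint touches
    the object (None = no obstacle). Returns the three final joint angles."""
    angles = [0, 0, 0]
    stopped = [False, False, False]
    while not all(stopped):
        for j in range(3):
            if stopped[j]:
                continue
            limit = contact_at[j] if contact_at[j] is not None else max_angle
            angles[j] = min(angles[j] + step, limit)
            if angles[j] >= limit:
                stopped[j] = True  # blocked by contact or fully flexed
    return angles

# A cylindrical object stops the proximal joint early; the distal joints
# keep flexing, yielding a grasp with multiple contact points.
print(wrap_finger([40, 65, None]))  # -> [40, 65, 90]
```

The point of the sketch is that one actuator can produce many contact points because the mechanics, not the controller, decide where each joint settles.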
Multiple hand projects based on CyberHand technology are in progress worldwide, and in 2010, prototypes based on the CyberHand became available for general sale to researchers via an Italian company called Prensilia. Founded by ARTS Lab researcher Christian Cipriani, PhD, Prensilia offers custom-built, self-contained, nonwearable prosthetic hands and open hand-platforms ranging in price from about $30,000 to $47,000. <h2><span style="font-size: 14pt;">SmartHand</span></h2> <table class="clsTableCaption k-table" style="float: right;"> <tbody> <tr> <td><img src="https://opedge.com/Content/OldArticles/images/2010-04_02/04-02_03.jpg" alt="" /></td> </tr> <tr> <td>The SmartHand.</td> </tr> </tbody> </table> One of the CyberHand's progeny is the SmartHand, Robin af Ekenstam's test hand. The SmartHand project is coordinated by Assistant Professor Fredrik Sebelius of Lund University, Sweden, and involves practitioners from a variety of fields and five nations. SmartHand has evolved away from the CyberHand platform both by becoming a wearable early prototype hand and by including a bidirectional interface. The SmartHand interface's sensing process is different from targeted muscle reinnervation (TMR) as developed by Todd Kuiken, MD, Sebelius notes. TMR requires the surgical relocation of nerves to the chest, while the SmartHand's interface is currently non-invasive and has only been tested on transradial amputees' existing nerve terminations along the residual forearm. In the SmartHand's current interface, 16 myoelectric sensors pick up the wearer's intentions from micromovements of the forearm muscles and drive the hand. Five servo motors, one for each finger, provide feedback to the wearer by pushing on the forearm's skin at the sites of nerve endings that report to the brain's "map" of the phantom hand. "In the brain, you have the sensory cortex," Sebelius explains, "and the sensory cortex area for the hand is neatly organized. 
The little finger is on top; the ring, middle, and index fingers are on the bottom; and then you have the thumb in the middle.... The concept is that by stimulating this map with the correct sensation through the arm, the subject will hopefully get sensation that feels intuitive. It should feel like you're actually touching a real finger. "The SmartHand feeds the sensation back onto the forearm by the sensory feedback system. You have force sensors and tension sensors in the hand, so if you are touching something, you get pressure indication in the robotic hand and that is then fed back to the servo motors on the forearm that are pushing down onto the skin proportionally. A small push on the SmartHand's index finger relates to a small push somewhere on the forearm." This transfers the information through the nerve and up to the brain as if the touch were happening on the amputated hand itself. SmartHand testers like Ekenstam have reported that they can literally feel with the SmartHand. However, sensation without control does little good. To train both the sensory and control processes, Sebelius' team used a data glove—the Cyberglove by Virtual Technologies, Palo Alto, California—to map electromyography (EMG) signals to hand movements. The Cyberglove is worn on the tester's healthy hand, where the glove's finger-joint angle sensors read its movement and help map it to EMG signals being emitted at the same time by the nerves in the residual limb. According to a November 2009 <em>IEEE Transactions on Biomedical Engineering</em> paper by Cipriani et al., the wearer performs synchronous movements with the healthy and phantom hand while software maps the nerve signals generated by the movement and learns to predict future hand movements based on incoming EMG signals. <h2><span style="font-size: 14pt;">LifeHand</span></h2> The LifeHand, a non-FP Italian project, is also based on the CyberHand platform. 
In 2009, an amputee volunteer agreed to have a series of hair-thin temporary electrodes implanted into the nerves of his residual forearm for four weeks, during which time he reportedly learned to control the LifeHand by thought alone. No one at the LifeHand project, based out of Campus Bio-Medico University (UNICAMPUS) of Rome, responded to requests to contribute to this article. <h2><span style="font-size: 14pt;">NanoBioTact and NanoBioTouch</span></h2> NanoBioTact is an FP6 project with two complementary purposes: to "analyze and understand the working principles of the natural sense of touch," and "to take inspiration from this knowledge to design novel and better tactile artificial sensory systems and robotics," according to Mike Adams, FREng, NanoBioTact project coordinator and professor at the University of Birmingham's School of Chemical Engineering, England. The NanoBioTact project brought together researchers in eight different fields—neurophysiology, psychology, information processing, numerical simulation, skin mechanics/tribology, nanotechnology, tissue engineering, and robotics—to pursue these two goals. For the first, team members used electroencephalography (EEG) and microneurography to measure afferent neural signals when volunteers rubbed a fingertip over textured surfaces. For the second, the team developed an artificial finger pad that incorporated microelectromechanical systems (MEMS) sensors. The finger pad was meant to replicate the functions of a natural fingertip during the touch process. <table class="clsTableCaption k-table" style="float: right;"> <tbody> <tr> <td><img src="https://opedge.com/Content/OldArticles/images/2010-04_02/04-02_05.jpg" alt="" /></td> </tr> <tr> <td>This tactile stimulator was built at ARTS Lab, Scuola Superiore Sant'Anna, Pisa, Italy, and used on human subjects at the University of Göteborg, Göteborg, Sweden, as part of NanoBioTact. 
Scientists record the neural responses of human fingertip receptors when a volunteer's fingertip is stimulated with different surfaces in order to identify the peripheral neural response to surface texture.</td> </tr> </tbody> </table> "In current technology in robotics and prosthetics, you might have force sensors so that you don't crush a coffee cup or whatever, but you still can't actually feel the topography of the surface—whether it's smooth or rough, wet or dry," Adams says. To develop that capacity, the NanoBioTact team needed to gather knowledge about how the body comprehends external texture, and the team assembled five of what it calls "work packages" through which to explore the question. The first work package involved understanding the tribology—the science and technology of interacting surfaces in relative motion—of the finger pad. "The finger pad has fingerprint ridges on the surface," Adams says, "and a large number of sweat pores. The skin's friction varies depending on its moisture content...so when you slide the finger pad on a smooth surface, it's in an occluded state—moisture is being transported from the skin that can't escape, and over a period of 10 or 20 seconds, the coefficient of friction is increasing. The behavior can be rather different for rough surfaces. That concept is part of understanding the interaction between the real world and the human body. It starts with that interaction between the finger pad and a surface." The second work package explored information processing—using EEG leads to connect a volunteer's nervous system to an artificial-intelligence neural network while the volunteer touched various surfaces with the fingertip. The EEG signals were then processed so that the neural net "learned" which surfaces produced which nerve signals. The neural net then learned to recognize similar data delivered from an electronic sensor. 
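The learning step described above, in which the system is first shown which nerve signals go with which surfaces and then asked to label new sensor data, can be sketched with a deliberately minimal classifier. The feature values and surface names below are invented for illustration; NanoBioTact's actual pipeline used far richer signals and a proper neural network, not this nearest-centroid toy.

```python
# Minimal sketch of "learned texture recognition": fit per-surface centroids
# from labeled feature vectors (stand-ins for processed nerve/EEG data),
# then label a new vector (stand-in for electronic-sensor data) by the
# nearest centroid. All numbers and labels here are fabricated examples.

def fit_centroids(samples):
    """samples: {surface_name: [feature_vector, ...]} -> per-class centroid."""
    centroids = {}
    for label, vecs in samples.items():
        n = len(vecs)
        centroids[label] = [sum(col) / n for col in zip(*vecs)]
    return centroids

def classify(centroids, vec):
    """Return the surface whose centroid is nearest (squared Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(centroids[label], vec))

# Fabricated 2-D features, e.g. (mean firing rate, high-frequency power):
training = {
    "smooth": [[0.2, 0.1], [0.3, 0.2]],
    "rough":  [[0.9, 0.8], [1.0, 0.7]],
}
model = fit_centroids(training)
print(classify(model, [0.95, 0.75]))  # -> rough
```

The design point the sketch preserves is that the same trained model can accept inputs from either source, biological recordings or an artificial sensor, as long as both are reduced to the same feature space.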
The third work package explored microneurography—using a needle sensor invasively to measure the electrical response of particular mechanoreceptors while the volunteer slid his or her finger pad over different surfaces, then measuring the way in which the EEG-detected action potentials changed. The fourth work package involved developing a computational model of the finger pad. "It's a scale model in the sense that the fingerprint really is modeled in great detail, as is its overall deformation," Adams says. "There's an array of four sensors packaged in a rubber matrix or tissue-engineered skin. Each sensor has four transducers, which are a bit like a cross with a post sticking up. Each of the posts has a piezoresistive device on it. <em>(Editor's note: Piezoresistance, according to the American Institute of Physics, is a phenomenon "in which a resistance change, prompted by a change in another physical parameter, can be used in making sensitive sensors.")</em> If something touches the post, you can get information.... The model finger pad is slid over different surfaces using a variety of measuring devices, and signals from those MEMS devices can be used to interpret the nature of the surface over which they're sliding." <table class="clsTableCaption k-table" style="float: right;"> <tbody> <tr> <td><img src="https://opedge.com/Content/OldArticles/images/2010-04_02/04-02_04.jpg" alt="" /></td> </tr> <tr> <td>The NanoBioTact artificial tactile fingertip built at ARTS Lab, Scuola Superiore Sant'Anna, Pisa, Italy, includes a tactile sensor array, and it can identify different surfaces by their varying roughness. <em>NanoBioTact photographs courtesy of ARTS Lab, Scuola Superiore Sant'Anna.</em></td> </tr> </tbody> </table> The next stage of the project is part of the FP7 initiative. It's called NanoBioTouch, and it also comprises two tasks. The first is to grow a living biological sensor onto a MEMS device, Adams says. 
Creating the nutrient-support systems for such a sensor is long in the future, but if it works, a biological sensor "should give us a more interesting range of responses and be closer to the natural system." "The second area of study is what's called affective touch, which is defined as the assessment of pleasantness or the preference for a particular tactile experience," Adams continues. "In NanoBioTact, we were telling whether something was rougher or smoother than something else. In NanoBioTouch, with affective touch, we'll be assessing if something is pleasant or unpleasant, or if you prefer one thing over another." Spending millions of dollars to develop a prosthesis that can restore the capacity to perceive subjective preferences might seem less valuable than some other research priorities. However, in <em>The Spell of the Sensuous,</em> David Abram, PhD, asserts the value of physical sensitivity for experiencing the universe: "A sensing body's actions are never wholly determinate, since they must ceaselessly adjust themselves to a world and a terrain that is itself continually shifting. [Without the senses] all of its experiences and all its responses would already have been...programmed, as if into a machine. But could we then even call them experiences? For is not experience, or more precisely, perception, the constant thwarting of such closure?" <em>Morgan Stanfield can be reached at </em> <div style="background: #EFEFEF; padding: 10px 10px 5px 10px;"> <h2><span style="font-size: 14pt;">Coming Soon to a Residual Limb Near You</span></h2> <table class="clsTableCaption k-table" style="float: right;"> <tbody> <tr> <td><img src="https://opedge.com/Content/OldArticles/images/2010-04_02/04-02_06.jpg" alt="" /></td> </tr> <tr> <td><em>Photograph courtesy of Touch Bionics.</em></td> </tr> </tbody> </table> Universities aren't the only groups in Europe working on innovative projects. 
In the commercial world, <strong>Touch Bionics</strong>, Livingston, Scotland, is preparing to release in June the i-LIMB Pulse, the latest version of its i-LIMB Hand. According to Phil Newman, Touch Bionics' marketing director, the Pulse was "completely redesigned from the original i-LIMB." Bluetooth connectivity can link the Pulse to a PC-based software package called BioSim; prosthetists can use the software to program default grips into the hand according to the user's preference and to activate the "pulse" option. This option allows users to increase the strength of a grip through successively timed signals until the desired intensity is achieved. In an effort to serve more active users in the military and industry, the company also replaced some high-strength plastic components in the Pulse with aluminum ones, allowing the hand to lift up to 220 pounds. <table class="clsTableCaption k-table" style="float: right;"> <tbody> <tr> <td><img src="https://opedge.com/Content/OldArticles/images/2010-04_02/04-02_07.jpg" alt="" /></td> </tr> <tr> <td><em>Photograph courtesy of RSLSteeper.</em></td> </tr> </tbody> </table> <strong>RSLSteeper</strong>, Rochester, Kent, England, is planning to release in May the bebionic range of upper-limb products. This includes a fully articulating hand with a variety of compliant grip patterns and, according to RSLSteeper Product Director Paul Steeper, "the world's first commercially available powered wrist joint that can flex, extend, and rotate." Prosthetists can customize the functions of the hand to suit individual user needs through the company's bespoke software, and according to pre-release statements, its highly realistic skin has a mesh backing that will help keep it functional even after it suffers minor tears. According to Steeper, bebionic will also fit into the existing U.S. Medicare reimbursement structure for myoelectric hands. "I think it's a really great time for amputees," said bebionic inventor Mark Hunter. 
"Things are changing finally from the hooks and claws. We're dragging prosthetics into the 21st century." </div>
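The i-LIMB Pulse's "pulse" option described above, in which successively timed signals ratchet grip strength up until the desired intensity is reached, can be sketched as a small state machine. The timing window, force step, and reset behavior here are assumptions made for illustration; Touch Bionics' actual control logic is proprietary and certainly differs.

```python
# Hedged sketch of a "pulse"-style grip booster: each trigger signal that
# arrives within a timing window adds a force increment, capped at a
# maximum. A late signal is treated as the start of a fresh grip. The
# window (0.8 s) and step (10% of max) are invented parameters.

class PulseGrip:
    def __init__(self, step=10, max_force=100, window=0.8):
        self.step = step            # force added per valid pulse (% of max)
        self.max_force = max_force
        self.window = window        # seconds within which a pulse counts
        self.force = 0
        self.last_pulse = None

    def pulse(self, t):
        """Register a trigger signal at time t (seconds); return grip force."""
        if self.last_pulse is not None and t - self.last_pulse > self.window:
            self.force = 0          # too late: start a fresh grip
        self.force = min(self.force + self.step, self.max_force)
        self.last_pulse = t
        return self.force

grip = PulseGrip()
print([grip.pulse(t) for t in (0.0, 0.5, 1.0)])  # -> [10, 20, 30]
```

The appeal of this scheme for users is that a single control channel (one myoelectric signal) carries both the "close" command and, through its timing, the desired grip intensity.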
<table class="clsTableCaption k-table" style="float: right;"> <tbody> <tr> <td><img src="https://opedge.com/Content/OldArticles/images/2010-04_02/04-02_01.jpg" alt="" /></td> </tr> <tr> <td>Robin af Ekenstam experiences a sense of touch in his phantom hand while using the sensor-packed SmartHand. <em>SmartHand photos courtesy of YourIs.com</em></td> </tr> </tbody> </table> <strong>November 2009:</strong> Cobbled onto Robin af Ekenstam's right arm is a crude-looking hand prototype bristling with exposed motors, loopy wiring, and aluminum fingers that appear to be capped with pencil erasers. As the young Swede takes the hand through a halting regimen of myoelectric-controlled movements, it emits a distinct mechatronic whine. The prototype's parts don't look particularly well tethered together, and the whole thing could well belong to a defunct Terminator whose skin had seen the wrong end of John Connor's flamethrower. However, this whirring assemblage is not only working in tuned harmony, it's coordinating with both an artificial-intelligence neural net and Ekenstam's own nervous system to give the Swedish amputee an experience he hasn't had in more than two years: feeling in his right hand. "When I grab something tightly, then I can feel it in the fingertips, which is strange because I don't have them anymore," Eckenstam told a BBC film crew in November 2009. "It's amazing." This ultramodern dance of seemingly disparate parts— electronic, biologic, and digital—is the result of a far larger act of cooperation, one that is helping to create a future in which prosthetics restore not only function and appearance, but also realistic sensation. <h2><span style="font-size: 14pt;">27 Countries, $93 Billion</span></h2> <strong>Today:</strong> The European Union (EU), whose funding agencies made Ekenstam's test experience with the prototype possible, is a research powerhouse. 
Though its scientists are spread out among 27 nations, speaking 23 different languages, they have joined forces in collaborative groups since 1984 to work under EU research initiatives called the Framework Programs (FPs). These five-year funding pushes proffer the international scientific community goals for research and provide the funds to carry it out. The 6th Framework Program (FP6) began in 2002 and ended in 2006, with collaborating teams using more than $23 billion in FP6 funds. From FP7 onward, the programs will last seven years each, so FP7 began in 2007 and will end in 2013. More than $70 billion in research money is budgeted for cooperative research during that period. This article explores just a few of the fruits of this cooperation—projects in various stages whose proponents came together from around the continent to develop vanguard capacities for prosthetic sensitivity and control. <h2><span style="font-size: 14pt;">Why Pursue a Prosthetic Sense of Touch?</span></h2> A team led by Maria Carrozza, PhD, stated in a 2009 IEEE annual conference presentation, "The design and development of a prosthetic artificial hand should aim as much as possible at replacing both functionality and cosmetic appearance of the natural hand lost by the amputee. Surveys on using commercial prosthetic hands reveal that 30 to 50 percent of upper-extremity amputees do not use their prosthetic hand regularly. The main factors for this are low functionality and controllability, poor cosmetic appearance, and an unnatural control system, which make the hand to be felt as an external device that is not part of the subject's body scheme." According to research by a cohort of EU scientists including Alan M. 
Wing, professor of human movement at the University of Birmingham, England, sense of touch influences not only our sense of ownership of our own body parts but also a yet uncounted number of fundamental physical functions, including balance and the process of moving in harmony with our physical surroundings. Might wearers be more likely to benefit from an upper-limb prosthesis if their device replaced a fuller complement of functions, including functional movement and functional sensations, such as pressure and heat, but also sensuality—the pleasure of physical touch? FP6 and FP7 researchers across Europe have been working to find out. <h2><span style="font-size: 14pt;">CyberHand</span></h2> <table class="clsTableCaption k-table" style="float: right;"> <tbody> <tr> <td><img src="https://opedge.com/Content/OldArticles/images/2010-04_02/04-02_02.jpg" alt="" /></td> </tr> <tr> <td>The Prensilia open hand-platform is available to researchers custom-built. <em>Photograph courtesy of Prensilia.</em></td> </tr> </tbody> </table> Many of the ventures toward this knowledge derive from the EU-funded CyberHand project. The CyberHand is a fully articulated, nonwearable, experimental hand prototype. Carrozza, a CyberHand project researcher, described it in her September 2009 IEEE presentation by saying, "The CyberHand, developed by ARTS Lab, Scuola Superiore Sant'Anna, Italy, can be defined as a stand-alone prosthetic-hand open platform. It is composed of five underactuated anthropomorphic fingers...which are actuated by six DC motors. Five of them are employed for the flexion/extension of the fingers, and a further one drives the thumb opposition and is housed inside the palm. Since each finger is composed of three phalanxes, the mechanical architecture presents 16 DOF [degrees of freedom] actuated by six degrees of motion (DOM).... When the hand closes, it is able to wrap over objects, thus obtaining grasps with a high number of contact points." 
Multiple hand projects based on CyberHand technology are in progress worldwide, and in 2010, prototypes based on the CyberHand became available for general sale to researchers via an Italian company called Prensilia. Founded by ARTS Lab researcher Christian Cipriani, PhD, Prensilia offers custom-built, self-contained, nonwearable prosthetic hands and open hand-platforms ranging in price from about $30,000 to $47,000. <h2><span style="font-size: 14pt;">SmartHand</span></h2> <table class="clsTableCaption k-table" style="float: right;"> <tbody> <tr> <td><img src="https://opedge.com/Content/OldArticles/images/2010-04_02/04-02_03.jpg" alt="" /></td> </tr> <tr> <td>The SmartHand.</td> </tr> </tbody> </table> One of the CyberHand's progeny is the SmartHand, Robin af Ekenstam's test hand. The SmartHand project is coordinated by Assistant Professor Fredrik Sebelius of Lund University, Sweden, and involves practitioners from a variety of fields and five nations. SmartHand has evolved away from the CyberHand platform both by becoming a wearable early prototype hand and by including a bidirectional interface. The SmartHand interface's sensing process is different from targeted muscle reinnervation (TMR) as developed by Todd Kuiken, MD, Sebelius notes. TMR requires the surgical relocation of nerves to the chest, while the SmartHand's interface is currently non-invasive and has only been tested on transradial amputees' existing nerve terminations along the residual forearm. In the SmartHand's current interface, 16 myoelectric sensors pick up the wearer's intentions from micromovements of the forearm muscles and drive the hand. Five servo motors, one for each finger, provide feedback to the wearer by pushing on the forearm's skin at the sites of nerve endings that report to the brain's "map" of the phantom hand. "In the brain, you have the sensory cortex," Sebelius explains, "and the sensory cortex area for the hand is neatly organized. 
The little finger is on top; the ring, middle, and index fingers are on the bottom; and then you have the thumb in the middle.... The concept is that by stimulating this map with the correct sensation through the arm, the subject will hopefully get sensation that feels intuitive. It should feel like you're actually touching a real finger. "The SmartHand feeds the sensation back onto the forearm by the sensory feedback system. You have force sensors and tension sensors in the hand, so if you are touching something, you get pressure indication in the robotic hand and that is then fed back to the servo motors on the forearm that are pushing down onto the skin proportionally. A small push on the Smart-Hand's index finger relates to a small push somewhere on the forearm." This transfers the information through the nerve and up to the brain as if the touch were happening on the amputated hand itself. SmartHand testers like Ekenstam have reported that they can literally feel with the SmartHand. However, sensation without control does little good. To train both the sensory and control processes, Sebelius' team used a data glove—the Cyberglove by Virtual Technologies, Palo Alto, California—to map electromyography (EMG) signals to hand movements. The Cyberglove is worn on the tester's healthy hand, where the glove's finger-joint angle sensors read its movement and help map it to EMG signals being emitted at the same time by the nerves in the residual limb. According to a November 2009 <em>IEEE Transactions On Biomedical Engineering</em> paper by Cipriani et al., the wearer performs synchronous movements with the healthy and phantom hand while software maps the nerve signals generated by the movement and learns to predict future hand movements based on incoming EMG signals. <h2><span style="font-size: 14pt;">LifeHand</span></h2> The LifeHand, a non-FP Italian project, is also based on the CyberHand platform. 
In 2009, an amputee volunteer agreed to have a series of hair-thin temporary electrodes implanted into the nerves of his residual forearm for four weeks, during which time he reportedly learned to control the LifeHand by thought alone. No one at the LifeHand project, based out of Campus Bio-Medico University (UNICAMPUS) of Rome, responded to requests to contribute to this article. <h2><span style="font-size: 14pt;">NanoBioTact and NanoBioTouch</span></h2> NanoBioTact is an FP6 project with two complementary purposes: to "analyze and understand the working principles of the natural sense of touch," and "to take inspiration from this knowledge to design novel and better tactile artificial sensory systems and robotics," according to Mike Adams, FREng, NanoBioTact project coordinator and professor at the University of Birmingham's School of Chemical Engineering, England. The NanoBioTact project brought together researchers in eight different fields—neurophysiology, psychology, information processing, numerical simulation, skin mechanics/tribology, nanotechnology, tissue engineering, and robotics—to complete these two goals. For the first, team members used electroencephalography (EEG) and microneurography (MRI) to measure afferent neural signals when volunteers rubbed a fingertip over textured surfaces. For the second, the team developed an artificial finger pad that incorporated microelectrical mechanical systems (MEMS) sensors. The finger pad was meant to replicate the functions of a natural fingertip during the touch process. <table class="clsTableCaption k-table" style="float: right;"> <tbody> <tr> <td><img src="https://opedge.com/Content/OldArticles/images/2010-04_02/04-02_05.jpg" alt="" /></td> </tr> <tr> <td>This tactile stimulator was built at ARTS Lab, Scuola Superiore Sant'Anna, Pisa, Italy, and used on human subjects at the University of Göteborg, Göteborg, Sweden, as part of NanoBioTact. 
Scientists record the neural responses of human fingertip receptors when a volunteer's fingertip is stimulated with different surfaces in order to identify the peripheral neural response to surface texture.</td> </tr> </tbody> </table> "In current technology in robotics and prosthetics, you might have force sensors so that you don't crush a coffee cup or whatever, but you still can't actually feel the topography of the surface—whether it's smooth or rough, wet or dry," Adams says. To develop that capacity, the NanoBioTact team needed to gather knowledge about how the body comprehends external texture, and the team assembled five of what it calls "work packages" through which to explore the question. The first work package involved understanding the tribology —the science and technology of interacting surfaces in relative motion—of the finger pad. "The finger pad has fingerprint ridges on the surface," Adams says, "and a large number of sweat pores. The skin's friction varies depending on its moisture content...so when you slide the finger pad on a smooth surface, it's in an occluded state—moisture is being transported from the skin that can't escape, and over a period of ten or 20 seconds, the coefficient of friction is increasing. The behavior can be rather different for rough surfaces. That concept is part of understanding the interaction between the real world and the human body. It starts with that interaction between the finger pad and a surface." The second work package explored information processing—using EEG leads to connect a volunteer's nervous system to an artificial-intelligence neural network while the volunteer touched various surfaces with the fingertip. The EEG signals were then processed so that the neural net "learned" which surfaces produced which nerve signals. The neural net then learned to recognize similar data delivered from an electronic sensor. 
The third work package explored microneurography—using a needle sensor invasively to measure the electrical response of particular mechanoreceptors while the volunteer slid his or her finger pad over different surfaces, then measuring the way in which the EEG-detected action potentials changed. The fourth work package involved developing a computational model of the finger pad. "It's a scale model in the sense that the fingerprint really is modeled in great detail, as is its overall deformation," Adams says. "There's an array of four sensors packaged in a rubber matrix or tissue-engineered skin. Each sensor has four transducers, which are a bit like a cross with a post sticking up. Each of the posts has a piezoresistive device on it. <em>(Editor's note: Piezoresistance, according to the American Institute of Physics, is a phenomenon "in which a resistance change, prompted by a change in another physical parameter, can be used in making sensitive sensors.")</em> If something touches the post, you can get information.... The model finger pad is slid over different surfaces using a variety of measuring devices, and signals from those MEMS devices can be used to interpret the nature of the surface over which they're sliding." <table class="clsTableCaption k-table" style="float: right;"> <tbody> <tr> <td><img src="https://opedge.com/Content/OldArticles/images/2010-04_02/04-02_04.jpg" alt="" /></td> </tr> <tr> <td>The NanoBioTact artificial tactile fingertip built at ARTS Lab, Scuola Superiore Sant'Anna, Pisa, Italy, includes a tactile sensor array, and it can identify different surfaces by their varying roughness. <em>NanoBioTact photographs courtesy of ARTS Lab, Scuola Superiore Sant'Anna.</em></td> </tr> </tbody> </table> The next stage of the project is part of the FP7 initiative. It's called NanoBioTouch, and it too comprises two tasks. The first is to grow a living biological sensor onto a MEMS device, Adams says.
Creating the nutrient-support systems for such a sensor is long in the future, but if it works, a biological sensor "should give us a more interesting range of responses and be closer to the natural system." "The second area of study is what's called affective touch, which is defined as the assessment of pleasantness or the preference for a particular tactile experience," Adams continues. "In NanoBioTact, we were telling whether something was rougher or smoother than something else. In NanoBioTouch, with affective touch, we'll be assessing if something is pleasant or unpleasant, or if you prefer one thing over another." Spending millions of dollars to develop a prosthesis that can restore the capacity to perceive subjective preferences might seem less valuable than some other research priorities. However, in <em>The Spell of the Sensuous,</em> David Abram, PhD, asserts the value of physical sensitivity for experiencing the universe: "A sensing body's actions are never wholly determinate, since they must ceaselessly adjust themselves to a world and a terrain that is itself continually shifting. [Without the senses] all of its experiences and all its responses would already have been...programmed, as if into a machine. But could we then even call them experiences? For is not experience, or more precisely, perception, the constant thwarting of such closure?" <em>Morgan Stanfield can be reached at </em> <div style="background: #EFEFEF; padding: 10px 10px 5px 10px;"> <h2><span style="font-size: 14pt;">Coming Soon to a Residual Limb Near You</span></h2> <table class="clsTableCaption k-table" style="float: right;"> <tbody> <tr> <td><img src="https://opedge.com/Content/OldArticles/images/2010-04_02/04-02_06.jpg" alt="" /></td> </tr> <tr> <td><em>Photograph courtesy of Touch Bionics.</em></td> </tr> </tbody> </table> Universities aren't the only groups in Europe working on innovative projects.
In the commercial world, <strong>Touch Bionics</strong>, Livingston, Scotland, is preparing to release in June the i-LIMB Pulse, the latest version of its i-LIMB Hand. According to Phil Newman, Touch Bionics' marketing director, the Pulse was "completely redesigned from the original i-LIMB." Bluetooth connectivity can link the Pulse to a PC-based software package called BioSim; prosthetists can use the software to program default grips into the hand according to the user's preference and to activate the "pulse" option. This option allows users to increase the strength of a grip through successively timed signals until the desired intensity is achieved. In an effort to serve more active users in the military and industry, the company also replaced some high-strength plastic components in the Pulse with aluminum ones, allowing the hand to lift up to 220 pounds. <table class="clsTableCaption k-table" style="float: right;"> <tbody> <tr> <td><img src="https://opedge.com/Content/OldArticles/images/2010-04_02/04-02_07.jpg" alt="" /></td> </tr> <tr> <td><em>Photograph courtesy of RSLSteeper.</em></td> </tr> </tbody> </table> <strong>RSLSteeper</strong>, Rochester, Kent, England, is planning to release in May the bebionic range of upper-limb products. This includes a fully articulating hand with a variety of compliant grip patterns and, according to RSLSteeper Product Director Paul Steeper, "the world's first commercially available powered wrist joint that can flex, extend, and rotate." Prosthetists can customize the functions of the hand to suit individual user needs through the company's bespoke software, and according to pre-release statements, its highly realistic skin has a mesh backing that will help keep it functional even after it suffers minor tears. According to Steeper, bebionic will also fit into the existing U.S. Medicare reimbursement structure for myoelectric hands. "I think it's a really great time for amputees," said bebionic inventor Mark Hunter. 
"Things are changing finally from the hooks and claws. We're dragging prosthetics into the 21st century." </div>