Engineers from the University of Washington (UW) and UCLA have developed a flexible sensor “skin” that can be stretched over a prosthesis or part of a robot’s body to accurately convey information about the shear forces and vibrations critical to successfully grasping and manipulating objects.
The bio-inspired robot sensor skin, described in a paper published in the September issue of Sensors and Actuators A: Physical, mimics the way a human finger experiences tension and compression as it slides along a surface or distinguishes among different textures. It measures tactile information with precision and sensitivity comparable to those of human skin.
“Robotic and prosthetic hands are based on visual cues right now such as, ‘Can I see my hand wrapped around this object?’ or ‘Is it touching this wire?’ That’s obviously incomplete information,” said Jonathan Posner, PhD, a UW professor of mechanical engineering and chemical engineering and the paper’s senior author. “If a robot is going to dismantle an improvised explosive device, it needs to know whether its hand is sliding along a wire or pulling on it. To hold on to a medical instrument, it needs to know if the object is slipping. This all requires the ability to sense shear force, which no other sensor skin has been able to do well.”
The new stretchable electronic skin, manufactured at the Washington Nanofabrication Facility at UW, is made from the same silicone rubber used in swimming goggles. The rubber is embedded with tiny serpentine channels, roughly half the width of a human hair, filled with an electrically conductive liquid metal that won’t crack or fatigue when the skin is stretched, as solid wires would. When the skin is placed around a robot finger or end effector, these microfluidic channels sit strategically on either side of where a human fingernail would be.
“Traditionally, tactile sensor designs have focused on sensing individual modalities: normal forces, shear forces, or vibration exclusively. However, dexterous manipulation is a dynamic process that requires a multimodal approach. The fact that our latest skin prototype incorporates all three modalities creates many new possibilities for machine learning-based approaches for advancing robot capabilities,” said Veronica Santos, PhD, a UCLA associate professor of mechanical and aerospace engineering and the paper’s co-author and robotics collaborator.
The research team from the UW College of Engineering and the UCLA Henry Samueli School of Engineering and Applied Science has demonstrated that the physically robust and chemically resistant sensor skin offers a high level of precision and sensitivity for light-touch applications such as opening a door, interacting with a phone, shaking hands, picking up packages, and handling objects. Recent experiments have shown that the skin can detect vibrations of 800 cycles per second, a sensitivity greater than that of human fingers.
“By mimicking human physiology in a flexible electronic skin, we have achieved a level of sensitivity and precision that’s consistent with human hands, which is an important breakthrough,” Posner said. “The sense of touch is critical for both prosthetic and robotic applications, and that’s what we’re ultimately creating.”
Editor’s note: This story was adapted from materials provided by the University of Washington.