Engineers at the University of Washington (UW) have developed 3D-printed gears that can track and store information without batteries or electronics. Instead, the system uses a method called backscatter, in which a device, such as a prosthesis, shares information by using an antenna to reflect radio signals transmitted to it. The UW team presented its findings October 15 at the ACM Symposium on User Interface Software and Technology in Berlin.
The team of engineers previously developed the first 3D-printed object that could connect to Wi-Fi without electronics. The plastic device measured the liquid in a detergent bottle and could order more online when it ran low. But the system tracked movement in only one direction. The latest development monitors bidirectional motion, like the movement of a prosthetic hand or the opening and closing of a pill bottle.
“Last time, we had a gear that turned in one direction. As liquid flowed through the gear, it would push a switch down to contact the antenna,” said lead author Vikram Iyer, a doctoral student in the university’s Department of Electrical & Computer Engineering. “This time we have two antennas, one on top and one on bottom, that can be contacted by a switch attached to a gear. So, opening a pill bottle cap moves the gear in one direction, which pushes the switch to contact one of the two antennas. And then closing the pill bottle cap turns the gear in the opposite direction, and the switch hits the other antenna.”
Movement is captured when the switch contacts one of the two antennas, and the teeth of each gear carry specific messages.
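The direction-sensing idea described above can be sketched in a few lines. This is an illustrative model only, not the team's actual firmware (the system has no electronics); the antenna labels and the mapping to "opening" and "closing" are assumptions for the pill-bottle example.

```python
# Illustrative sketch: infer rotation direction from which of the two
# antennas the gear-mounted switch contacts. Labels are hypothetical.

def infer_direction(antenna_contacted: str) -> str:
    """Map a switch-antenna contact event to a rotation direction."""
    if antenna_contacted == "top":
        return "opening"   # e.g. unscrewing a pill-bottle cap
    if antenna_contacted == "bottom":
        return "closing"   # screwing the cap back on
    raise ValueError(f"unknown antenna: {antenna_contacted!r}")

# A stream of contact events translates into a motion history:
events = ["top", "bottom", "top"]
history = [infer_direction(e) for e in events]
print(history)  # → ['opening', 'closing', 'opening']
```

Because each contact event is reported by backscatter rather than by a powered sensor, a receiver could reconstruct this history without any battery on the printed object itself.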
“The gear’s teeth have a specific sequencing that encodes a message. It’s like Morse code,” said co-author Justin Chan, a doctoral student in the Allen School. “So, when you turn the cap in one direction, you see the message going forward. But when you turn the cap in the other direction, you get a reverse message.”
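The Morse-code analogy can be made concrete with a toy model. The bit pattern below is hypothetical, as is the two-tooth-width encoding; the point is only the property Chan describes: reading the same physical tooth sequence in the opposite rotation direction yields the reversed message.

```python
# Illustrative sketch (hypothetical encoding, not the team's actual
# scheme): gear teeth of two widths encode a binary message, and the
# rotation direction determines the read order.

TEETH = [1, 0, 0, 1, 1, 0, 1]  # hypothetical tooth pattern

def read_teeth(teeth, direction):
    """Return the bit sequence observed as the gear turns."""
    if direction == "forward":
        return list(teeth)
    return list(reversed(teeth))  # opposite turn reads teeth in reverse

forward = read_teeth(TEETH, "forward")
backward = read_teeth(TEETH, "backward")
assert backward == forward[::-1]  # reversed direction, reversed message
```

Since the two readings are mirror images, a receiver that knows the encoded pattern can tell from a single pass which way the cap was turned.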
The team also printed an e-NABLE prosthetic hand with a prototype of their bidirectional sensor that monitors the hand opening and closing by determining the angle of the wrist.
“This system will give us a higher-fidelity picture of what is going on,” said co-author Jennifer Mankoff, PhD, a professor in the School of Computer Science & Engineering. “For example, right now we don’t have a way of tracking if and how people are using e-NABLE hands. Ultimately what I’d like to do with these data is predict whether or not people are going to abandon a device based on how they’re using it.”
The next step in the research is to reduce the size for more real-world testing, Mankoff said.
Editor’s note: This story was adapted from materials provided by the University of Washington.