<img class="aligncenter size-full wp-image-223789" src="https://opedge.com/wp-content/uploads/2021/10/2c.jpg" alt="" width="597" height="695" /> It was just a question, a brief comment among many others at a two-day meeting, but it stuck with me. I had helped organize the 2012 State-of-the-Science meeting on microprocessor-controlled prosthetic knees for the American Academy of Orthotists and Prosthetists. We convened a panel of subject matter experts to examine current understanding of the effectiveness of the relatively new technology and to tackle questions like whether any benefits to the user outweigh the higher costs. Halfway through the first day, one of our panelists, an orthopedic surgeon, posed the question I could not shake: “How do we know that users don’t walk differently with a microprocessor knee just because they know it’s expensive or complex?” As soon as I heard it, I had an internal moment of reckoning. “We don’t know,” I told myself and realized that much of my own research had ignored perhaps the most important part of the human equation: the brain. I’m a gait researcher, a biomedical engineer. I have a hyperfocus on variables like energy transfer and joint kinematics. For too long, I realized, that focus might have come at the expense of a full understanding of what is creating the movement I study. I consider the neuromechanics of gait as well, and concepts like the central pattern generator, but the surgeon’s comment made me realize I had been ignoring more esoteric factors such as mood, temperament, and potential biases. The question nagged at me until I decided it warranted some full research studies. Since then, my research team and I have studied the role of the brain in two settings. First, we addressed the question about the potential for bias when someone uses an advanced or expensive orthopedic device. Then, we studied a challenging bias associated with using gait analysis in research. In both cases, we took a fascinating dive into the way the brain guides and influences walking, and how it can be affected by cognitive biases. Cognitive biases are systematic errors that occur as people take in and interpret information. One example is the hot-hand fallacy, the belief that a person who has experienced a random success has a better chance of succeeding with subsequent attempts. This happened to me after I won a contest on the radio, and then a friend decided I was on a lucky streak and wanted me to buy a lottery ticket for him. The trouble is, the laws of probability say that radio contests and lotteries have completely independent odds, and there’s no such thing as a lucky streak. When the bias is in play, one part of our brain manages to override whatwe know and change our behavior.<img class="size-full wp-image-223787 alignright" src="https://opedge.com/wp-content/uploads/2021/10/2a.jpg" alt="" width="204" height="441" /> The issue with microprocessor-controlled knees—or any other advanced orthopedic devices—is whether users form predetermined expectations about their performance, and then behave differently as a result. The former is called expectation bias, and the latter is called confirmation bias. We tested both cognitive biases using an off-the-shelf hinged knee brace. We purchased two identical braces, but we modified one of them to make it look computerized. We painted the straps a shiny silver color and added a small switch, LED light circuit, and USB port. Then, we recruited 18 healthy young adult subjects. 
We told the subjects that a manufacturer had asked us to help study a prototype of a new knee brace that used a microprocessor and accelerometer to learn their gait pattern and then “dynamically alter the stiffness in real time.” We even provided a fake promotional flyer that showed results of a recent pilot study on knee angles. The deception was approved by our ethics board, and we debriefed each subject afterward.

Once we introduced the subjects to the concept of the “computerized” knee brace, we gave them a survey asking for their thoughts on the new brace versus the standard one. They provided ratings on appearance, stability, performance in sports, cost, and comfort, and then we asked them to predict their overall preference. The results of this initial survey confirmed expectation bias. Subjects favored the brace they thought was computerized in every category except cost, with the strongest preference given for stabilization. Clearly, they expected the advanced technology to perform better.

Then, it was finally time for the subjects to use the braces. We conducted a 3D-instrumented gait analysis with both braces in random order and measured standard outcomes like walking speed, step length, ground reaction forces, and knee joint angles. Despite the expressed preference for the “computerized” brace, the gait of our subjects in the two braces was virtually identical. There was not a single significant difference in any of our outcome measures. Average walking speed differed by only 1 mm/s, and peak contralateral knee flexion by only one-tenth of one degree. Moreover, the braces had little effect on the knee itself, as we also found no significant bilateral differences. So, clearly, the expectation bias had no effect on gait.

After the subjects had used both braces, we repeated our survey. Remarkably, even though they had just walked the same in both braces, the subjects now liked the “computerized” brace even more. Overall preference on a Likert scale increased from 61 percent for the “computerized” brace to 83 percent. Cost became less of a concern, as subjects apparently now thought the benefits they had just experienced outweighed the additional cost—even though objectively they had not experienced any measurable benefits. So, while confirmation bias did not affect user performance, it did affect user preference.

An important limitation of this study is the subject population. These were healthy young adults who did not need a knee brace, so it would have been unlikely for them to change what was already an optimized gait pattern. It is unknown whether gait might change in a population that requires a knee orthosis, though we are now working on studying that population. Despite that limitation, it was clear that we must be careful when considering user feedback and self-reported outcome measures, particularly when we deal with a technologically advanced, computerized, or simply more expensive component or device. Users might bring their own expectations, or they might feed off of language used by practitioners, and this might shape their perceived understanding of their own walking patterns.

About the same time, I was continuing a line of research on children with idiopathic toe walking (ITW), and I realized I was dealing with another cognitive bias. ITW is a fascinating condition, with kids choosing to walk on tiptoe almost all the time for no apparent reason.
In the early stages of the condition, children with ITW still have adequate range of motion to display a standard heel-to-toe gait, but they prefer initial contact with the toes. Orthotic interventions can prevent the gait pattern, but in our research, we had been trying to figure out the underlying causes. Time after time, I would meet a family in the hall and walk them to the lab, observing their child’s characteristic tiptoe walking pattern. Then, once we attached our reflective markers and took our measurements in the lab, the child would adopt his or her non-preferred heel-to-toe gait. It became challenging to measure the children’s natural gait patterns because they were trying to “behave” and show the gait that parents and therapists had been encouraging.

This is an instance of the Hawthorne effect, which happens when people alter their behavior because they know they are being observed. Another example is a 2006 study by Eckmanns et al. that saw handwashing compliance among medical staff increase by 55 percent when the staff members were told a study was being done on handwashing. Psychologists believe the Hawthorne effect occurs when people form beliefs about observers’ expectations and then alter their behavior to match those perceived expectations.

The phenomenon is not uncommon in gait analysis, and it’s a big problem. When making clinical decisions or testing research hypotheses, it is essential not only to have accurate measurements, but also to measure natural, unmodified motion. In the past, researchers have tried cognitive distractors and dual-task paradigms so subjects will not focus on the lab and their own gait. For example, subjects might be asked to count backward by threes, or to carry an object. But these tasks play their own role in altering gait.

We sought to confirm that the laboratory setting might cause the Hawthorne effect in pediatric gait analysis and to test an intervention that might produce more natural gait. We recruited 31 typically developing children, ages five to 13 years, for the study. Following the model of the first study, we first asked the children how they felt about the upcoming gait analysis using a set of five domains of child feelings according to Walden (2003): happy, excited, scared, sad, and angry. Then, we conducted our usual gait analysis using all our standard practices to serve as a baseline.

Next, it was time for our intervention. The Idoneus Corporation makes a software program called Argonaut that interfaces with our Vicon motion capture system. It takes the real-time 3D marker coordinate data and adds a graphical overlay so that the children can see themselves moving as an avatar in a computer-generated environment. We let each child choose one of three settings—a castle court, a city park, or an alien planet—and then choose an avatar character. We displayed the character on monitors around the lab and gave each child free rein to move about the space, encountering virtual objects and other characters. We were still able to record 3D gait data while the participants moved around the lab, watching their avatars match their movements on the monitors. The children were very engaged with the program, showing frequent stops, starts, and changes in direction as they wandered around the space. After about ten minutes of the game intervention, we turned off the wall monitors, repeated the standard gait analysis, and then repeated the feelings inventory.
At the end of the study, the children were significantly happier, more excited, less scared, and less sad (none of the children expressed anger pre- or post-intervention). This suggests that steps can be taken to make children feel more comfortable about laboratory-based gait analysis. But did their gait change?

<img class="size-full wp-image-223788 alignright" src="https://opedge.com/wp-content/uploads/2021/10/2b.jpg" alt="" width="402" height="388" />

We found significant differences in one key aspect of gait, and it was a bit surprising. Following the intervention, arm swing increased significantly. Subjects walked at similar speeds, but their arm swing—measured by the location of each hand in front of or behind the torso—was greater following the Argonaut play. Upon further investigation, we found previous studies showing that arm movement during walking is correlated with emotional state, and is even diminished in individuals with clinical depression, so perhaps the result was not so surprising.

An important aspect of this result is that the gamification intervention had a carryover effect; the difference in gait was noted after the intervention was removed. This means the intervention has the potential to produce more natural gait without imposing its own alterations. By extension, it implies that children with movement disorders, such as the children with ITW, might display a more natural gait pattern in the lab if they are first given a movement-based game-playing opportunity.

These studies, and the follow-ups we have planned, serve as a reminder to me that it’s easy to focus on movement kinematics and forget that they can be altered by the brain. And the results should make us all cautious about automatically attributing changes in gait entirely to changes in devices or components. O&P practitioners conduct their own versions of gait analysis all the time, and even though it might be in a simple room with parallel bars rather than an instrumented research lab, the same cognitive biases can still be present.

Therefore, it is wise to consider active steps to reduce them. It might be a good idea to ask patients whether they expect a particular device to perform in a particular way. It is also wise to be cautious about using language that could engender some of those expectations. And when analyzing gait, whether in the clinic or in the lab, we should remember that the environment can be intimidating and seek ways to foster natural gait.

Even as my team and I continue our research on cognitive biases, these initial results are informing our other studies as well. In each case, we have come to understand that we cannot just focus on kinematic and kinetic quantification of gait; instead, we must recognize the entire individual who is producing that gait pattern, brain and all.

<em>Mark Geil, PhD, is a professor and interim associate dean for research and operations in the Wellstar College of Health and Human Services at Kennesaw State University (KSU), Georgia. He teaches clinical gait analysis for KSU’s Master of Science in Prosthetics and Orthotics program and is an honorary member, Thranhardt awardee, and research award winner in the American Academy of Orthotists and Prosthetists.</em>

<em><strong>Referenced Studies</strong></em>

<em>Balsamo, B., M. D. Geil, R. Ellis, and J. Wu. 2018. Confirmation bias affects user perception of knee braces. Journal of Biomechanics 75:164-70.</em>

<em>Geil, M. D., L. Rahnama, E. Sergeant, K. Soulis, J. Jarrells, and M. Poisal. 2021. Influence of non-immersive avatar-based gamification on the Hawthorne Effect in pediatric gait. Gait and Posture 88:122-5.</em>

<em><strong>Other Articles Mentioned</strong></em>

<em>Walden, T. A., et al. 2003. How I feel: a self-report measure of emotional arousal and regulation for children. Psychological Assessment 15(3):399-412.</em>

<em>Eckmanns, T., et al. 2006. Compliance with antiseptic hand rub use in intensive care units: the Hawthorne effect. Infection Control & Hospital Epidemiology 27(9):931-4.</em>
<img class="aligncenter size-full wp-image-223789" src="https://opedge.com/wp-content/uploads/2021/10/2c.jpg" alt="" width="597" height="695" /> It was just a question, a brief comment among many others at a two-day meeting, but it stuck with me. I had helped organize the 2012 State-of-the-Science meeting on microprocessor-controlled prosthetic knees for the American Academy of Orthotists and Prosthetists. We convened a panel of subject matter experts to examine current understanding of the effectiveness of the relatively new technology and to tackle questions like whether any benefits to the user outweigh the higher costs. Halfway through the first day, one of our panelists, an orthopedic surgeon, posed the question I could not shake: “How do we know that users don’t walk differently with a microprocessor knee just because they know it’s expensive or complex?” As soon as I heard it, I had an internal moment of reckoning. “We don’t know,” I told myself and realized that much of my own research had ignored perhaps the most important part of the human equation: the brain. I’m a gait researcher, a biomedical engineer. I have a hyperfocus on variables like energy transfer and joint kinematics. For too long, I realized, that focus might have come at the expense of a full understanding of what is creating the movement I study. I consider the neuromechanics of gait as well, and concepts like the central pattern generator, but the surgeon’s comment made me realize I had been ignoring more esoteric factors such as mood, temperament, and potential biases. The question nagged at me until I decided it warranted some full research studies. Since then, my research team and I have studied the role of the brain in two settings. First, we addressed the question about the potential for bias when someone uses an advanced or expensive orthopedic device. Then, we studied a challenging bias associated with using gait analysis in research. In both cases, we took a fascinating dive into the way the brain guides and influences walking, and how it can be affected by cognitive biases. Cognitive biases are systematic errors that occur as people take in and interpret information. One example is the hot-hand fallacy, the belief that a person who has experienced a random success has a better chance of succeeding with subsequent attempts. This happened to me after I won a contest on the radio, and then a friend decided I was on a lucky streak and wanted me to buy a lottery ticket for him. The trouble is, the laws of probability say that radio contests and lotteries have completely independent odds, and there’s no such thing as a lucky streak. When the bias is in play, one part of our brain manages to override whatwe know and change our behavior.<img class="size-full wp-image-223787 alignright" src="https://opedge.com/wp-content/uploads/2021/10/2a.jpg" alt="" width="204" height="441" /> The issue with microprocessor-controlled knees—or any other advanced orthopedic devices—is whether users form predetermined expectations about their performance, and then behave differently as a result. The former is called expectation bias, and the latter is called confirmation bias. We tested both cognitive biases using an off-the-shelf hinged knee brace. We purchased two identical braces, but we modified one of them to make it look computerized. We painted the straps a shiny silver color and added a small switch, LED light circuit, and USB port. Then, we recruited 18 healthy young adult subjects. 
We told the subjects that a manufacturer had asked us to help it study a prototype of a new knee brace that used a microprocessor and accelerometer to learn their gait pattern and then “dynamically alter the stiffness in real time.” We even provided a fake promotional flyer that showed results of a recent pilot study on knee angles. The deception was approved by our ethics board, and we debriefed each subject afterward. Once we introduced the subjects to the concept of the “computerized” knee brace, we gave them a survey that asked their thoughts about the new brace versus the standard one. They provided ratings on appearance, stability, performance in sports, cost, and comfort, and then we asked them to predict their overall preference. The results of this initial survey confirmed expectation bias. Subjects favored the brace they thought was computerized in every category except for cost, with the strongest preference given for stabilization. Clearly, they had an expectation that the advanced technology would perform better. Then, it was finally time for the subjects to use the braces. We conducted a 3D-instrumented gait analysis with both braces in random order and measured standard outcomes like walking speed, step length, ground reaction forces, and knee joint angles. Despite the expressed preference for the “computerized” brace, the gait of our subjects in the two braces was virtually identical. There was not a single significant difference in any of our outcome measures. Average walking speed differed by only 1 mm/s, and peak contralateral knee flexion by only one tenth of one degree. Moreover, the braces had little effect on the knee, as we also had no bilateral significant differences. So, clearly, the expectation bias had no effect on gait. After the subjects had used both braces, we repeated our survey. Remarkably, even though they had just walked the same in both braces, the subjects now liked the “computerized” brace even more. Overall preference increased from 61 percent on a Likert scale for the “computerized” brace to 83 percent. Cost became less of a concern, as apparently subjects now thought the benefits they had just experienced outweighed the additional cost—even though objectively they had not experienced any measurable benefits. So, while confirmation bias did not affect user performance, it did affect user preference. An important limitation in this study is the subject population. These were healthy young adults who did not need a knee brace, so it would have been unlikely for them to change what was already an optimized gait pattern. It is unknown whether gait might change in a population that requires a knee orthosis, though we are working on studying that population now. Despite the limitation of that study, it was clear that we must be careful when considering user feedback and self-reported outcome measures, particularly when we deal with a technologically advanced, computerized, or even a more expensive component or device. Users might bring their own expectations, or they might feed off of language used by practitioners, and this might shape their perceived understanding of their own walking patterns. About the same time, I was continuinga line of research on children with idiopathic toe walking (ITW), and I realized I was dealing with another cognitive bias. ITW is a fascinating condition, with kids choosing to walk on tiptoe almost all the time for no apparent reason. 
In its early stages, children with ITW still have adequate range of motion to display standard heel-to-toe walking gait, but they prefer initial contact with the toes. Orthotic interventions can prevent the gait pattern, but in our research, we had been trying to figure out the underlying causes. Time after time, I would meet a family in the hall and walk them to the lab, observing their child’s characteristic tiptoe walking pattern. Then, once we attached our reflective markers and took our measurements in the lab, the child would adopt his or her non-preferred heel-to-toe gait. It became challenging to measure the children’s natural gait patterns because they were trying to “behave” and show the gait that parents and therapists had been encouraging. This is an instance of the Hawthorne effect, which happens when people alter their behavior when they know they are being observed. Another example is a 2006 study by Eckmanns et al. that saw handwashing compliance among medical staff increase by 55 percent when the staff members were told a study was being done on handwashing. Psychologists believe the Hawthorne effect occurs when people form beliefs about observers’ expectations and then alter their behavior to match those perceived expectations. The phenomenon is not uncommon in gait analysis, and it’s a big problem. When making clinical decisions or testing research hypotheses, it is essential not only to have accurate measurements, but also to measure natural, unmodified motion. In the past, researchers have tried cognitive distractors and dual-task paradigms so subjects will not focus on the lab and their own gait. For example, subjects might be asked to count backward by threes, or to carry an object. But these tasks play their own role in altering gait. We sought to confirm that the laboratory might cause the Hawthorne effect in pediatric gait analysis and test an intervention that might produce more natural gait. We recruited 31 typically developing children for the study, ages five to 13 years. Following the model of the first study, we first asked the children what their feelings were about the upcoming gait analysis using a set of five domains of child feelings according to Walden (2003): happy, excited, scared, sad, and angry. Then, we conducted our usual gait analysis using all our standard practices to serve as a baseline. Next, it was time for our intervention. The Idoneus Corporation makes a software program called Argonaut that interfaces with our Vicon motion capture system. It takes the real-time 3D marker coordinate data and adds a graphical overlay so that the children can see themselves moving as an avatar in a computer-generated environment. We let each child choose one of three settings—a castle court, a city park, or an alien planet—and then choose an avatar character. We displayed the character on monitors around the lab and gave each child free reign to move about the space, encountering virtual objects and other characters. We were still able to record 3D gait data while the participants moved around the lab, watching their avatars match their movements on the monitors. The children were very engaged with the program, showing frequent stops, starts, and changes in direction as they wandered around the space. After about ten minutes of the game intervention, we turned off the wall monitors and repeated the standard gait analysis, and then repeated the feelings inventory. 
At the end of the study, children were significantly happier, more excited, less scared, and less sad (none of the children expressed anger pre- or post-intervention). This suggests that steps can be taken to make children feel more comfortable about laboratory-based gait analysis. But did their gait change?<img class="size-full wp-image-223788 alignright" src="https://opedge.com/wp-content/uploads/2021/10/2b.jpg" alt="" width="402" height="388" /> We found significant differences in one key aspect of gait, and it was a bit surprising. Following the intervention, arm swing increased significantly. Subjects walked at similar speeds, but their arm swing—measured by the location of each hand in front of or behind the torso—was greater following the Argonaut play. Upon further investigation, we found previous studies showing that arm movement in walking has been correlated with feelings, and is even diminished in individuals with clinical depression, so perhaps the result was not so surprising. An important aspect of this result is that the gamification intervention had a carryover effect; the difference was noted during gait after the intervention was removed. This means the intervention has the potential to produce more natural gait without imposing its own alterations. By extension, this implies that children with movement disorders, such as the children with ITW, might display a more natural gait pattern in the lab if they are first given a movement-based game-playing opportunity. These studies, and the follow-ups we have planned, serve as a reminder to me that it’s easy to focus on movement kinematics and forget that they can be altered by the brain. And the results should make us all cautious about automatically attributing changes in gait entirely to changes in devices or components. O&P practitioners conduct their own versions of gait analysis all the time, and even though it might be in a simple room with parallel bars and not in an instrumented research lab, the same cognitive biases can still be present. Therefore, it is wise to consider active steps to reduce them. It might be a good idea to ask patients if they have an expectation that a particular device might perform in a particular way. It is also wise to be cautious about using language that could engender some of those expectations. And when analyzing gait, whether in the clinic or in the lab, we should remember that the environment can be intimidating and seek ways to foster natural gait. Even as my team and I continue our research on cognitive biases, these initial results are informing our other studies as well. In each case, we have come to understand that we cannot just focus on kinematic and kinetic quantification of gait; instead, we must recognize the entire individual who is producing that gait pattern, brain and all. <em> </em> <em>Mark Geil, PhD, is a professor and interim associate dean for research and operations in the Wellstar College of Health and Human Services at Kennesaw State University (KSU), Georgia. He teaches clinical gait analysis for KSU’s Master of Science in Prosthetics and Orthotics program and is an honorary member, Thranhardt awardee, and research award winner in the American Academy of Orthotists and Prosthetists.</em> <em><strong>Referenced Studies</strong></em> <em>Balsamo, B., M. D. Geil, R. Ellis, and J. Wu. 2018. Confirmation bias affects user perception of knee braces. Journal of Biomechanics 75:164-70.</em> <em>Geil, M. D., L. Rahnama, E. Sergeant, K. Soulis, J. Jarrells, and M. Poisal. 
2021. Influence of non-immersive avatar-based gamification on the Hawthorne Effect in pediatric gait. Gait and Posture 88:122-5.</em> <em><strong>Other Articles Mentioned</strong></em> <em>Walden, T. A., et al. 2003. How I feel: a self-report measure of emotional arousal and regulation for children. Psychological Assessment 15(3):399-412.</em> <em>Eckmanns, T., et al. 2006. Compliance with antiseptic hand rub use in intensive care units: the Hawthorne effect. Infection Control &Hospital Epidemiology 27(9):931-4.</em>