The Role of Expertise in Clinical Decision-making: Is Experience Evidence?

By John T. Brinkmann, MA, CPO/L, FAAOP(D)

It is common for healthcare students to expect clear answers to clinical questions and to become frustrated when experienced clinicians are unable to clearly articulate the rationale behind treatment decisions. Too often the answer clinicians provide is a form of "We've always done it this way." The use of experience and informal consensus, sometimes referred to as GOBSAT (good old boys sat around a table), is unsatisfying to novice clinicians who have not yet developed the knowledge and skill on which to base an intuitive decision-making process. It is also susceptible to outside influence and bias and is inadequate for justifying treatment decisions to payers, referral sources, and skeptical patients or caretakers.

Sarah Wieten, PhD, a researcher at the Center for Biomedical Ethics at Stanford University, reports "the goal of EBM [evidence-based medicine] was to put true knowledge in the place of unchallenged received wisdom based on power. Indeed, when asked for their motivation for creating EBM, many of the movement's early proponents pointed to incidents in their medical education in which they felt that their teachers were exercising expertise in this fashion (i.e., as an arbitrary abuse of power, rather than an expression of skilled familiarity…)."1

In a written interview completed just days before his death in 2015, David Sackett, OC, FRSC, widely acknowledged as one of the fathers of both modern clinical epidemiology and evidence-based practice (EBP), described experiencing disappointment during his medical education "from the unsatisfying justifications…from my seniors for their therapeutic decisions." (See sidebar below.) Sackett credits this disappointment with setting the course for his career, which included a key role in introducing the evidence-based paradigm for making medical decisions.2

In the early 1990s, EBP was offered as a way to mitigate the problems with the GOBSAT method for clinical decision-making. The various forms of evidence available to medical practitioners were ranked based on quality: Meta-analyses and systematic reviews were considered the highest quality evidence, and the opinions of clinicians were considered the lowest. It is important to note that, despite considering it low quality, EBP in its earliest form included clinical experience as a form of evidence.3 In a 1996 article, Sackett et al. clarified their view of the meaning and role of clinical expertise and distinguished it from "external evidence":

The practice of evidence-based medicine means integrating individual clinical expertise with the best available external clinical evidence from systematic research. By individual clinical expertise we mean the proficiency and judgment that individual clinicians acquire through clinical experience and clinical practice. Increased expertise is reflected in many ways, but especially in more effective and efficient diagnosis and in the more thoughtful identification and compassionate use of individual patients' predicaments, rights, and preferences in making clinical decisions about their care…. Good doctors use both individual clinical expertise and the best available external evidence, and neither alone is enough. Without clinical expertise, practice risks becoming tyrannised by evidence, for even excellent external evidence may be inapplicable to or inappropriate for an individual patient.4

Critics of the way EBP has been implemented have expressed concerns that despite this initial inclusion of the role of expertise in the evidence hierarchy, too much weight is now given to research evidence at the expense of clinical judgement. In separate articles in a 2017 issue of the Journal of Clinical Epidemiology, two authors describe their concerns about how research evidence is developed. One author reports, "as EBM became more influential, it was also hijacked to serve agendas different from what it originally aimed for. Influential randomized trials are largely done by and for the benefit of the industry. Meta-analyses and guidelines have become a factory, mostly also serving vested interests. National and federal research funds are funneled almost exclusively to research with little relevance to health outcomes…. Under market pressure, clinical medicine has been transformed to finance-based medicine."5 Another author expresses concerns regarding the "considerable limitations" of EBP, "including overall reductionism and insufficient consideration of problems related to financial conflicts of interest. EBM does not represent the scientific approach to medicine: It is only a restrictive interpretation of the scientific approach to clinical practice. EBM drives the prescribing clinician to an overestimated consideration of potential benefits, paying little attention to the likelihood of responsiveness and to potential vulnerabilities in relation to the adverse effects of treatment. It is time to substitute the fashionable popularity of a strategy developed outside of clinical medicine with models and research based on the insights of clinical judgment and patient-doctor interaction…."6

Ranking Evidence

In a 2012 conversation published on The Medical Roundtable website, Gordon Guyatt, MD, one of the graduate students who worked with Sackett in developing EBP, discussed his concerns regarding EBP and the role of clinical experience with Mark Tonelli, MD.7 While Guyatt sees some value in retaining a hierarchical scheme for evaluating evidence, Tonelli is less convinced. According to Tonelli, clinical research, pathophysiologic reasoning, and personal experience "are three different types of medical knowledge, not variations on the same thing…. Since each of the three types of medical knowledge is different in kind, they don't belong on a hierarchy. What clinicians are left with in caring for individual patients is trying to consider each of those types of knowledge and weighing them."7 The two agree, however, on the importance of clinical expertise. Guyatt: "…evidence by itself never tells you what to do. It's always evidence in the context of values and preferences."7 Tonelli: "…clinical research itself is never sufficient for clinical decision-making…. The individual circumstances of a case determine whether clinical research is applicable."7 Tonelli suggests a nonhierarchical arrangement of areas to consider when making clinical decisions.

Implementing EBP responsibly is even more challenging in a field that begins with a relatively small foundation of research evidence and in which building on that foundation is complicated by the inherent difficulties of conducting randomized controlled trials in rehabilitation. O&P often lags behind other health professions in adopting contemporary practices, and we are likely to lag behind in recognizing the limitations of those practices. It is worthwhile to consider both the cautionary messages of those more experienced with EBP and the role that expert opinion plays in decision-making.

What Is Expertise and Is It Evidence?

Wieten offers this description of expertise for consideration: "a 'flow of skilled coping,' an unconscious state of movement from one competent activity to the next…" characterized by "un-reflective certainty about what needs doing next and a lack of dependence on explicit rules for functioning."1 Expert knowledge is developed "in the course of clinical interactions, in contrast with knowledge gained from sources such as journal articles…."1 Wieten points out that while expertise is identified in Sackett's original paper on EBM as a type of evidence (the lowest quality), Sackett's 1996 article identified expertise as something external to the evidence (Figure 1). This perspective is similar to other models of evidence that define expertise as "the skills needed to both interpret the evidence and apply it appropriately to the circumstances...."1 Guyatt, on the other hand, believes that "each clinician's clinical experience is a form of evidence, and the better documented that clinical experience, the more it would be appropriate to have confidence in it, and as we bring in safeguards against bias, then our confidence increases further…. [C]linical experience is a form of evidence and all clinical research is the systematic application of clinical observation with safeguards against bias."7

Kirsti Malterud, MD, PhD, cautions that important information about a patient is neglected if only one type of knowledge is considered evidence. "If the clinical encounter is seen as constitutive of medicine, the specific knowledge generated from this encounter—the tacit rationality of clinical interaction—deserves epistemic status…. If we persist in viewing clinical practice and scientific knowledge as distinct and separable, we will obstruct the proper application of appropriate scientific inquiry to practice issues."8 Practitioners often perceive research evidence as unrelated to the challenges they encounter on the front lines of clinical care, and a quality ranking system that minimizes the value of individual expertise may contribute to both the perception and reality of irrelevancy. In a later article, Malterud points out that "clinical decisions and methods of patient care are based on much more than just the results of controlled experiments. Clinical knowledge consists of interpretive action and interaction—factors that involve communication, opinions, and experiences. The traditional quantitative research methods represent a confined access to clinical knowing, since they incorporate only questions and phenomena that can be controlled, measured, and counted. The tacit knowing of an experienced practitioner should also be investigated, shared, and contested."9 Malterud's point is not that research evidence is unimportant, but that expertise should be valued and rigorously investigated because it represents a crucial aspect of clinical decision-making. This investigation requires a different set of research methods and skills. "Numbers alone can never provide the whole range of evidence needed for clinical work—evidence here meaning more than the outcomes of randomized controlled trials. Hence, to systematize clinical knowledge for description and analysis should not be thought of as a blasphemous or unreasonable threat to the art of medicine. Qualitative research methods can help bridge the gaps between theory and practice in medicine, provided that standards of scientific rigour and quality are maintained."9


The debate surrounding the strengths and weaknesses of EBP, and the deeper issues of medical epistemology, is far more robust than can be described in this brief article. John Paley, a strong advocate of retaining a distinction between evidence and expertise, says that "identifying and eliminating (or reducing) error is the core, arguably, of scientific method…and research evidence is evidence that has passed this particular test." He clarifies that while this conception of evidence does not "place an arbitrary emphasis on quantification at the expense of qualitative data…it is not surprising, and it should not be controversial, that evidence that is the product of so-called quantitative research takes precedence."10 In a passage that seems to accurately describe many areas of O&P, he acknowledges that "…evidence-based decision making is clearly not feasible when there is no robust evidence. In such contexts, experienced clinical judgement is obviously a necessary resource, and can usually be relied upon to perform almost as well…."10

The experience and expertise of O&P clinicians are a credible source of knowledge. Out of necessity, they are often the defining factor in how treatment decisions are made. Experienced clinicians must be aware that "…opening the door to intuitive judgment also opens it to other human characteristics, such as fatiguability, habit proneness, self-serving motivations, ingroup loyalty, and authoritarian practice."11 We must develop the skills of reflection and communication necessary to provide clear descriptions of the rationale behind our techniques and treatment decisions. We owe this to our patients and to the clinicians who are under our influence during their training. Novice clinicians require guidance in forming a foundation of knowledge and practical experience on which to develop their own intuitive expertise. "The expert sees intuitively what to do, without applying rules, and the artistic patterns of behavior are difficult to explicate. Yet the performance of the expert could not have been achieved unless the inarticulate rules had first been learnt."8 Constantly refining their practice and learning new techniques can give experienced clinicians added sensitivity to the needs of less-experienced clinicians. Explaining experience-based treatment rationales clearly subjects them to the scrutiny required for constant improvement and encourages novice clinicians to develop the skills necessary for good clinical judgement.

John T. Brinkmann, MA, CPO/L, FAAOP(D), is an assistant professor at Northwestern University Prosthetics-Orthotics Center. He has more than 20 years of experience treating a wide variety of patients. 


1.        Wieten, S. 2018. Expertise in evidence-based medicine: a tale of three models. Philosophy, Ethics, and Humanities in Medicine 13(1):2.


3.        Evidence-Based Medicine Working Group. 1992. Evidence-based medicine. A new approach to teaching the practice of medicine. JAMA 268(17):2420.

4.        Sackett, D. L., W. M. Rosenberg, J. M. Gray, R. B. Haynes, and W. S. Richardson. 1996. Evidence based medicine: what it is and what it isn't. BMJ 312(7023):71-2.

5.        Ioannidis, J. P. 2016. Evidence-based medicine has been hijacked: A report to David Sackett. Journal of Clinical Epidemiology 73:82-6.

6.        Fava, G. A. 2017. Evidence-based medicine was bound to fail: A report to Alvan Feinstein. Journal of Clinical Epidemiology 84:3-7.

7.        Guyatt, G., and M. R. Tonelli. 2012. The Medical Roundtable General Medicine Edition 1(1):75-84.

8.        Malterud, K. 1995. The legitimacy of clinical knowledge: towards a medical epistemology embracing the art of medicine. Theoretical Medicine 16(2):183-98.

9.        Malterud, K. 2001. The art and science of clinical knowledge: evidence beyond measures and numbers. The Lancet 358(9279):397-400.

10.     Paley, J. 2006. Evidence and expertise. Nursing Inquiry (2):82-93.

11.     Hamm, R. M., and Z. J. Nagykaldi. 2018. Physician judgment and clinical practice guidelines. Journal of Cognitive Engineering and Decision Making (3):209-14.