"If there is a science to reducing diagnostic errors, it is still in its infancy, and we still have a long way to go." This is the firm conclusion of an extensive Canadian study on the causes of errors in clinical reasoning, which reviewed cognitive biases, knowledge deficits, and dual process thinking. After surveying a substantial body of literature, the researchers closed by recognizing the role that reflection, added to knowledge, plays in refining clinical expertise.
Madrid, March 19, 2019. Geoffrey Norman, professor emeritus in the Department of Clinical Epidemiology and Biostatistics at McMaster University in Hamilton, Canada, published a paper in Academic Medicine, together with researchers from the University of Washington (Seattle) and Erasmus University (Rotterdam), that moves us away from the idea that most diagnostic errors are due to cognitive biases and can be minimized if the physician learns to recognize them.
This was an exhaustive review of studies on the dual process model of clinical reasoning and the role of cognitive biases and heuristics, versus knowledge deficits, in diagnostic errors, as well as a comparison of a wide range of educational strategies in the scientific literature aimed at reducing these errors. In all, 38 such biases have been described in medicine.
The authors of the article, “The causes of errors in clinical reasoning: cognitive biases, knowledge deficits, and dual process thinking,” emphasize that despite the general consensus that the dual process model is a valid representation of clinical reasoning, the causes of diagnostic errors remain unclear.
In reviewing the literature in psychology and medicine related to dual process models, the researchers attempted to identify whether errors originate in the Type 1 component (rapid, intuitive thinking) or the Type 2 component (slower, reflective, analytical thinking). The former could be related to associative memory, while the latter could be due to the limited capacity of working memory. Clinical errors are usually blamed on the Type 1 component, and their correction is attributed to action on the Type 2 component.
Norman has described this idea as a "simplistic view," since errors can arise from both types of thought process. The study, which started from the evidence that "more education and knowledge lead to a lower error rate," was unable to confirm that most errors result from cognitive biases or that they can be minimized by training physicians to detect them.
Warnings with little effectiveness
In Norman's view, knowledge deficits contribute substantially to diagnostic errors, while educational strategies aimed at reorganizing knowledge appear to offer modest but real benefits in diagnostic accuracy. The Canadian researcher questioned the effectiveness of generic warnings to clinicians to "act slower, reflect, or be careful and systematic," since they have little effect beyond slowing down the diagnostic process.
The study analyzes the three types of interventions available to reduce clinical reasoning errors – general, heuristic, and knowledge-based – and finds that those that seek to reduce errors related to heuristics and cognitive biases lack solid evidence of their usefulness. However, some benefit has been observed from interventions that encourage clinicians to mobilize and reorganize their knowledge or to reflect on a case.
Following the indications of Nobel laureate Daniel Kahneman, of Princeton University, one of the simplest general strategies to reduce errors is to warn clinicians to be "careful and systematic, and to explore all alternatives" – in other words, to slow down and engage the Type 2 (analytical) mode of thinking.
After three studies applying this principle produced unremarkable results, the group of Silvia Mamede, from Erasmus University, had some success in increasing diagnostic accuracy on presented cases with a small trick: simply telling participants that the professors considered them "very difficult cases." Time pressure, on the other hand, proved harmful in another study, in which novice doctors were warned that "they were very late," which induced anxiety.
As for heuristics-based strategies to reduce the effect of biases, three educational interventions teaching the recognition of specific cognitive biases in diagnostic reasoning were reviewed. They showed that medical residents can learn to define these biases, but there are few data on the relationship between that ability and a decrease in errors.
Adding reflection to knowledge
Regarding interventions to reduce knowledge-based errors, Norman's article highlights the deliberate reflection technique developed by Mamede and colleagues, which asks the clinician to identify the findings that are consistent or inconsistent with the working diagnosis. This involves returning to the original case, writing down all its features, and checking them against the diagnosis already under consideration. The final step is to rethink the hypothesis and decide whether "a change of mind is also justified."
Others have contrasted the effects of reflection and direct reasoning on diagnostic accuracy in simple versus complex cases and noted the positive effect of reflection. Research cited in the article showed that an intervention promoting analytical thinking improved the resolution of simple cases among beginners and was effective for complex cases among more experienced clinicians; that is, the additional time for reflection increased diagnostic accuracy among both residents and experienced physicians.
From these studies, the conclusion is that reflection is quite beneficial, although the learners' level and the complexity of the cases modulate its impact. This modulation appears to depend on whether adequate knowledge to solve the problem is available: to the extent that reflection is effective, it achieves its results by encouraging participants to retrieve and reconfigure their knowledge. Thus, the strategy is ineffective for novice clinicians facing complex cases, because they lack the knowledge needed to solve the case in the first place. Conversely, experts do not benefit from reflection on simple cases, because they can already solve such problems easily from the start.
Norman GR, Monteiro SD, Sherbino J, et al. The causes of error in clinical reasoning: Cognitive biases, knowledge deficits, and dual process thinking. Acad Med. 2017;92(1):23–28.
Norman G. Dual processing and diagnostic errors. Adv Health Sci Educ Theory Pract. 2009;14(suppl 1):37–49.
Kahneman D. Thinking, Fast and Slow. New York, NY: Macmillan; 2011.
Mamede S, Schmidt HG, Rikers RM, Penaforte JC, Coelho-Filho JM. Influence of perceived difficulty of cases on physicians’ diagnostic reasoning. Acad Med. 2008;83:1210–1216.
Friedman CP, Gatti GG, Franz TM, et al. Do physicians know when their diagnoses are correct? Implications for decision support and error reduction. J Gen Intern Med. 2005;20:334–339.
Dhaliwal G. Premature closure? Not so fast [published online March 15, 2016]. BMJ Qual Saf. doi:10.1136/bmjqs-2016-005267.