MEDICAL EDUCATION
Believing in Overcoming Cognitive Biases
Tiffany S. Doherty, PhD, and Aaron E. Carroll, MD, MS
Abstract
Like all humans, health professionals are subject to cognitive biases that
can render diagnoses and treatment decisions vulnerable to error.
Learning effective debiasing strategies and cultivating awareness of
confirmation bias, anchoring bias, outcomes bias, and the affect heuristic,
among others, and of their effects on clinical decision making should be
prioritized at all stages of education.
Introduction
Cognitive biases contribute significantly to diagnostic and treatment errors.1,2 A 2016
review of their roles in decision making lists 4 domains of concern for physicians:
gathering evidence, interpreting evidence, taking action, and evaluating decisions.3 Although
experts have identified many different types of cognitive biases, specific examples from
these domains include confirmation bias, anchoring bias, the affect heuristic, and
outcomes bias. In this article, we first discuss these biases, how they affect medical
decision making, and how cognitive psychology helps to inform effective debiasing
strategies. We then discuss specific debiasing strategies and how to integrate them into
education.
Anchoring bias is closely related to confirmation bias and comes into play when
interpreting evidence. It refers to physicians’ practices of prioritizing information and
data that support their initial impressions, even when first impressions are wrong. It
often manifests when the first piece of information given to a physician is relied upon
too heavily when making decisions.3 For example, a patient’s back pain might be
attributed to known osteoporosis without ruling out other potential causes.
Further down the treatment pathway, outcomes bias can come into play. This bias refers
to the practice of believing that good or bad results are always attributable to prior
decisions, even when there is no valid reason to think so.3 Because feedback on clinical
decisions is critical for identifying weaknesses and potential mistakes, this bias can
prevent clinicians from incorporating appropriate feedback to improve future
performance. Although the relation between decisions and outcomes might seem
intuitive, the outcome of a decision cannot be the sole determinant of its quality;
sometimes a good outcome can happen despite a poor clinical decision, and vice versa.
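To make this point concrete, the short simulation below (our illustration; the success rates are hypothetical, not taken from this article or its sources) shows why a single outcome is a noisy signal of decision quality.

```python
import random

# Hypothetical illustration: two treatment decisions with assumed
# probabilities of a good outcome. Decision A is the better choice in
# expectation, yet any single case can still go badly for A or well for B.
P_GOOD_OUTCOME = {"A (sound decision)": 0.90, "B (poor decision)": 0.30}

random.seed(7)  # reproducible toy run
TRIALS = 10_000
for decision, p in P_GOOD_OUTCOME.items():
    good = sum(random.random() < p for _ in range(TRIALS))
    print(f"{decision}: {good / TRIALS:.1%} good outcomes")

# Roughly 10% of A's cases end badly and 30% of B's end well, so grading
# either decision from one observed outcome (outcomes bias) will often
# misjudge its quality.
```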
Cognitive psychology distinguishes 2 modes of thinking: fast, intuitive strategies and
slower, analytical ones. We more commonly use intuitive thinking strategies because
they are fast and reasonably effective. For example, intuitive thinking would likely lead
to a flu diagnosis for a patient presenting with fever, fatigue, and joint pain during
winter months. Compared with analytical thinking strategies, however, intuitive
strategies are much more prone to error: jumping to a diagnosis of influenza simply
because it is flu season might lead one to neglect other diagnoses for that patient
(eg, meningococcal meningitis). Intuitive strategies benefit from experience and are necessary in
situations in which time and information are lacking (eg, in emergency rooms). These
strategies rely on heuristics, or mental shortcuts that are generally sufficient, but not
guaranteed, to lead to the right answer. In contrast, analytical strategies require more
time and resources but allow the use of deductive logic to reach a diagnostic or
treatment decision that is less subject to external factors (eg, previous experience, test
availability).9 Effective debiasing strategies mainly involve a deliberate switch between
these 2 types of thinking.
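As a rough illustration of why that deliberate switch matters, the sketch below (the priors and harm weights are hypothetical values of our choosing, not figures from the article) contrasts the intuitive "most likely cause" shortcut with an analytical expected-harm check that keeps a rare but dangerous alternative in view.

```python
# Hypothetical numbers for illustration only. The intuitive shortcut
# picks the most common cause of the presentation; the analytical step
# also weighs how costly each missed diagnosis would be before closing
# the search.
CANDIDATES = {
    # diagnosis: (assumed prior probability, assumed harm if missed,
    #             on an arbitrary 0-1000 scale)
    "influenza": (0.95, 1),
    "meningococcal meningitis": (0.005, 1000),
}

intuitive_pick = max(CANDIDATES, key=lambda d: CANDIDATES[d][0])
print("Intuitive pick (most probable cause):", intuitive_pick)

# Analytical check: expected harm of closing the search without ruling
# out each alternative.
for diagnosis, (prior, harm) in CANDIDATES.items():
    print(f"Expected harm of missing {diagnosis}: {prior * harm:.2f}")

# Under these assumed numbers, missing meningitis carries roughly 5x the
# expected harm of missing influenza despite its rarity, so the rare
# diagnosis still warrants a deliberate rule-out.
```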
Potential Debiasing Strategies
Pat Croskerry, an expert in clinical decision making, suggests that 3 things must occur
for bias-related diagnostic and treatment errors to decline: (1) physicians must fully
appreciate the contribution of cognitive biases to errors in medical decision making,
(2) they must recognize that such errors are not inevitable, and (3) they must be
optimistic that debiasing strategies work.1
Simply increasing physicians’ familiarity with the many types of cognitive biases—and
how to avoid them—may be one of the best strategies to decrease bias-related errors.1
Thus, education for medical students, residents, and fellows could fruitfully invest in
training on cognitive biases, the role they play in diagnostic and treatment errors, and
effective debiasing strategies. Two such strategies will be discussed below.
The practice of reflection reinforces behaviors that reduce bias in complex situations. A
2016 systematic review of cognitive intervention studies found that guided reflection
interventions were associated with the most consistent success in improving diagnostic
reasoning.13 A guided reflection intervention involves searching for and being open to
alternative diagnoses and willingness to engage in thoughtful and effortful reasoning
and reflection on one’s own conclusions, all with supportive feedback or challenge from
a mentor.14
The same review suggests that cognitive forcing strategies may also have some success
in improving diagnostic outcomes.13,15 These strategies involve conscious consideration
of alternative diagnoses other than those that come intuitively. One example involves
reading radiographs in the emergency department. Studies suggest that a common
pitfall in this situation, particularly among inexperienced clinicians, is calling off the
search once a positive finding has been noticed, which often leads to other
abnormalities (eg, second fractures) being overlooked. The forcing strategy here is
therefore to continue the search even after an initial fracture has been detected.15
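As a loose analogy (the regions and findings below are hypothetical, invented for illustration), the pitfall resembles an early-return search, and the forcing strategy resembles deliberately completing the survey:

```python
# Toy sketch of the forcing strategy described above. A "satisficing"
# read stops at the first abnormality, while the forced full survey
# keeps searching and so catches the second fracture.
FINDINGS = {  # hypothetical radiograph read, by region
    "distal radius": "fracture",
    "ulna": "normal",
    "scaphoid": "fracture",  # easy to miss after the first positive
}

def satisficing_read(findings):
    """Stop at the first positive finding (the common pitfall)."""
    for region, result in findings.items():
        if result != "normal":
            return [(region, result)]
    return []

def forced_full_read(findings):
    """Cognitive forcing: continue the search after a positive finding."""
    return [(region, result) for region, result in findings.items()
            if result != "normal"]

print("Satisficing read:", satisficing_read(FINDINGS))
print("Forced full survey:", forced_full_read(FINDINGS))
```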
While some data suggest that cognitive forcing strategies are not successful in reducing
students’ diagnostic errors,16,17 a systematic review reveals that they can be efficacious
in specific circumstances (eg, telling participants to consider alternative diagnoses
rather than to be aware of misleading details).13 Overall, more research is needed to
understand how other factors (eg, study setting, participant experience or knowledge
level, bias or strategy introduction) influence cognitive forcing strategies’ effectiveness.
Using guided reflection and cognitive forcing strategies, medical trainees at all stages
can be taught to acknowledge the risk of potential biases during decision making and
then to deliberately counteract those potential biases. It is thought that, given time and
sustained practice, certain metacognitive strategies can become second nature to
physicians.15
Workshops and seminars might also be effective training formats. A 60-minute
workshop conducted at the 2017 meeting of the Society for Academic Emergency
Medicine provided brief instruction on cognitive biases and debiasing strategies and
significantly improved participants' recognition of bias and application of debiasing
strategies.20 Although this intervention seems promising, future
studies should examine the effects of such workshops using measures less subjective
than self-assessment.
A seminar conducted at Wright State University with medical students and internal
medicine resident physicians focused on cognitive bias in medical decision making
using an objective method of assessment.21 There is evidence that participation in the
seminar improved scores on the Inventory of Cognitive Biases in Medicine (ICBM), an
instrument used to detect the impact of such biases on analytical thinking.22 Notably,
however, the validity of the ICBM has since been questioned.23 Reliable measurement
tools will be critical to implementing and evaluating effective educational interventions.
Conclusion
Cognitive biases in clinical practice have a significant impact on care, often in negative
ways. They sometimes manifest as physicians seeing what they want to see rather than
what is actually there. Or they come into play when physicians make snap decisions and
then prioritize evidence that supports their conclusions, as opposed to drawing
conclusions from evidence. Sometimes physicians’ previous experiences can lead them
astray. And, if outcomes are falsely attributed to decisions or actions, critical feedback
opportunities are lost and bad habits can become ingrained.
Fortunately, cognitive psychology provides insight into how to prevent biases. Guided
reflection and cognitive forcing strategies deflect bias through close examination of our
own thinking processes. Although more research is required, data suggest that these
strategies can be successful in the right circumstances. If they are to work, we must
include them consistently in medical curricula. During medical education and
throughout practice thereafter, we must provide physicians with a full appreciation of
the cost of biases and the potential benefits of combatting them.
References
1. Croskerry P. The importance of cognitive errors in diagnosis and strategies to
minimize them. Acad Med. 2003;78(8):775-780.
2. Stiegler MP, Neelankavil JP, Canales C, Dhillon A. Cognitive errors detected in
anaesthesiology: a literature review and pilot study. Br J Anaesth.
2012;108(2):229-235.
3. Molony DA. Cognitive bias and the creation and translation of evidence into
clinical practice. Adv Chronic Kidney Dis. 2016;23(6):346-350.
4. Nickerson RS. Confirmation bias: a ubiquitous phenomenon in many guises. Rev
Gen Psychol. 1998;2(2):175-220.
5. Satya-Murti S, Lockhart J. Recognizing and reducing cognitive bias in clinical and
forensic neurology. Neurol Clin Pract. 2015;5(5):389-396.
6. Croskerry P, Abbass AA, Wu AW. How doctors feel: affective issues in patients’
safety. Lancet. 2008;372(9645):1205-1206.
7. Pelaccia T, Tardif J, Triby E, Charlin B. An analysis of clinical reasoning through a
recent and comprehensive approach: the dual-process theory. Med Educ Online.
2011;16(1):5890.
8. Croskerry P, Schenkel SM, Cosby KS, Wears RL. Critical Thinking and Reasoning in
Emergency Medicine. Philadelphia, PA: Wolters Kluwer/Lippincott Williams &
Wilkins; 2008.
9. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and
theory of debiasing. BMJ Qual Saf. 2013;22(suppl 2):ii58-ii64.
10. Croskerry P. A universal model of diagnostic reasoning. Acad Med.
2009;84(8):1022-1028.
11. Lieberman MD, Jarcho JM, Satpute AB. Evidence-based and intuition-based self-
knowledge: an fMRI study. J Pers Soc Psychol. 2004;87(4):421-435.
12. Goel N, Rao H, Durmer JS, Dinges DF. Neurocognitive consequences of sleep
deprivation. Semin Neurol. 2009;29(4):320-339.
13. Lambe KA, O’Reilly G, Kelly BD, Curristan S. Dual-process cognitive interventions
to enhance diagnostic reasoning: a systematic review. BMJ Qual Saf.
2016;25(10):808-820.
14. Mamede S, Schmidt HG. The structure of reflective practice in medicine. Med
Educ. 2004;38(12):1302-1308.
15. Croskerry P. Cognitive forcing strategies in clinical decisionmaking. Ann Emerg
Med. 2003;41(1):110-120.
16. Sherbino J, Yip S, Dore KL, Siu E, Norman GR. The effectiveness of cognitive
forcing strategies to decrease diagnostic error: an exploratory study. Teach
Learn Med. 2011;23(1):78-84.
17. Sherbino J, Kulasegaram K, Howey E, Norman G. Ineffectiveness of cognitive
forcing strategies to reduce biases in diagnostic reasoning: a controlled trial.
CJEM. 2014;16(1):34-40.
18. Crowley RS, Legowski E, Medvedeva O, et al. Automated detection of heuristics
and biases among pathologists in a computer-based system. Adv Health Sci
Educ Theory Pract. 2013;18(3):343-363.
19. Bond WF, Deitrick LM, Arnold DC, et al. Using simulation to instruct emergency
medicine residents in cognitive forcing strategies. Acad Med. 2004;79(5):438-
446.
20. Daniel M, Carney M, Khandelwal S, et al. Cognitive debiasing strategies: a faculty
development workshop for clinical teachers in emergency medicine.
MedEdPORTAL. 2017;13:10646.
Aaron E. Carroll, MD, MS is a professor of pediatrics and associate dean for research
mentoring at Indiana University School of Medicine in Indianapolis. He is also the vice
president for faculty development at the Regenstrief Institute. His primary interests are
health services research and pediatric outcomes.
Citation
AMA J Ethics. 2020;22(9):E773-778.
DOI
10.1001/amajethics.2020.773.
The viewpoints expressed in this article are those of the author(s) and do not
necessarily reflect the views and policies of the AMA.