
Cognitive bias in clinical practice – nurturing healthy skepticism among medical students


Received 20 August 2017

Accepted for publication 21 December 2017

Published 10 April 2018 Volume 2018:9 Pages 235–237

DOI https://doi.org/10.2147/AMEP.S149558

Checked for plagiarism Yes

Review by Single anonymous peer review

Peer reviewer comments 2

Editor who approved publication: Dr Md Anwarul Azim Majumder



Alysha Bhatti

Faculty of Medicine, Imperial College London, London, UK


Abstract: Errors in clinical reasoning, known as cognitive biases, are implicated in a significant proportion of diagnostic errors. Despite this knowledge, little emphasis is currently placed on teaching cognitive psychology in the undergraduate medical curriculum. Understanding the origin of these biases and their impact on clinical decision making helps stimulate reflective practice. This article outlines some of the common types of cognitive biases encountered in the clinical setting as well as cognitive debiasing strategies. Medical educators should nurture healthy skepticism among medical students by raising awareness of cognitive biases and equipping them with robust tools to circumvent such biases. This will enable tomorrow’s doctors to improve the quality of care delivered, thus optimizing patient outcomes.

Keywords: cognitive bias, diagnostic error, clinical decision making

Introduction

Diagnostic errors have been implicated in 15% of clinical decisions and are the largest driver of medicolegal claims.1 Importantly, misdiagnosis has the potential to cause serious harm to patients, as it sets up a cascade of subsequent mistreatment that contravenes the ethical principles underpinning healthcare delivery.

While the causes of misdiagnosis are multifactorial, inherent errors in clinical reasoning, known as cognitive biases, play a pivotal role. A greater awareness and understanding of the impact of cognitive biases on clinical decision making is, therefore, likely to stimulate reflective practice and improve patient outcomes. The General Medical Council advocates that medical students “continually and systematically reflect on practice and, wherever necessary, translate that reflection into action.”2 Medical students must thus be equipped with critical thinking skills that enable them to explore their own cognitive biases as well as evaluate those of the doctors they observe during clinical placements.

Cognitive bias in clinical practice

Kahneman and Tversky described a dual-process model of decision making: system 1 produces rapid, intuitive judgments, while system 2 produces slower but more reasoned judgments through deliberation and rationalization. While the appropriate use of system 1 or system 2 thinking is situation dependent, at a fundamental level system 1 lacks executive oversight, making it more susceptible to cognitive bias.3 This is problematic, as one study found complete or partial reliance on system 1 thinking among a cohort of early clinical learners.4

A myriad of cognitive biases have been described in the literature, some of which can be explored through common clinical scenarios. For example, a patient with a history of productive cough, fever, and pleuritic chest pain, seen after five patients with a similar history who were subsequently diagnosed with pneumonia, is also likely to be diagnosed with pneumonia because of the relative ease with which the diagnosis comes to mind. This is known as “availability bias”. By contrast, the “gambler’s fallacy” describes the opposite reasoning, whereby the doctor concludes that the current patient cannot also have pneumonia because of the recent spate of pneumonia diagnoses. The gambler’s fallacy highlights how inherently poor humans are at estimating statistical probabilities (see the illustrative example below).

“Confirmation bias” is evident when clinicians, having formed an opinion about a situation, favor evidence that supports this opinion and overlook or discount contradicting evidence. A common scenario concerns an obese patient with intermittent retrosternal pain who is dismissed as having dyspepsia, despite electrocardiographic evidence of an evolving myocardial infarction. This patient may subsequently re-present with more convincing symptoms of an acute coronary syndrome and a characteristic electrocardiographic trace; however, the delay in diagnosis may already have caused undue harm. Tackling such a situation may be further hampered by “anchoring”, whereby it becomes difficult to look at the situation with fresh eyes once a diagnosis has been made. Linked to this is “premature closure,” epitomized by the adage, “when the diagnosis is made, the thinking stops.”5
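To make the pneumonia scenario concrete, consider a minimal worked illustration of why the gambler’s fallacy is a fallacy. The 30% base rate used here is an assumed, purely illustrative figure rather than one drawn from the literature: if each patient’s presentation is treated as independent of the others, the run of five prior pneumonia diagnoses carries no information about the sixth patient.

% Illustrative assumptions: presentations are independent, and the prior
% probability of pneumonia for this presentation is taken to be 0.30 (hypothetical).
\[
P(\text{pneumonia in patient 6} \mid \text{pneumonia in patients 1--5})
  = P(\text{pneumonia in patient 6}) = 0.30
\]

The gambler’s fallacy amounts to revising this figure downward because a non-pneumonia case feels “due”, whereas availability bias amounts to revising it upward simply because recent cases make the diagnosis easy to recall; under the independence assumption, neither adjustment is justified by the streak itself. In practice, presentations may cluster (for example, during an outbreak), so any upward revision should come from that epidemiological information rather than from the ease with which recent diagnoses spring to mind.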

While a plethora of further cognitive biases exists, the discussion here has focused on those that commonly recur in the clinical setting and their impact on decision making. This focus is likely to be of most value to early clinical learners in enhancing their appreciation for, and understanding of, the subject.

Cognitive debiasing

Medical students, who will soon be encountering these very scenarios, must be adequately trained to recognize cognitive biases. There is no doubt that greater awareness marks the first step toward this aim, and thus metacognition, a systematic approach to reflection, is often viewed as the lynchpin of cognitive debiasing.5

How can this be achieved? From a personal perspective, prompting students during bedside teaching to consider how inherent biases in their thinking might affect their choice of investigations and management plans helped to enhance metacognitive skills. Subsequent completion of reflective portfolio entries helped to embed these skills and served as a point of reference for the future.

Other methods of reinforcing awareness of cognitive biases include encouraging medical students to discuss diagnostic workups at clinical meetings and engaging them in quality improvement projects designed to tackle clinical reasoning errors. Some even advocate formalizing these methods so that they constitute mandatory assessments in clinical learning. One must be mindful, however, that the aforementioned strategies are better suited to small-group settings, and methods for teaching cognitive debiasing on a larger scale are therefore required. Medical students who received a seminar on cognitive biases have been shown to be less likely to make bias-prone decisions, supporting the idea of formal cognitive psychology teaching.6 In this vein, diagnostic error curricula are currently being developed and piloted in the United States. One key aspect involves familiarizing medical students with cognitive forcing strategies and checklists that can be employed in daily clinical practice to limit the overuse of system 1 thinking.

Common components of these checklists include clinicians taking their own history from the patient and pausing for a “diagnostic time-out”. Taking one’s own history is a vital first step in avoiding “framing bias” arising from previously formulated diagnoses. Pausing for a “diagnostic time-out” is arguably the most important element of the checklist: it ensures clinicians reevaluate diagnoses throughout the patient journey and allows them to ask, “could this be something else?”

Given that one study found that failure to formulate a differential diagnosis was implicated in 80% of diagnostic error cases, the use of differential diagnosis checklists as an adjunct at this stage may be valuable.7 In addition to consulting such checklists, a diagnostic time-out may involve challenging the diagnosis that originally came to mind (circumventing availability bias), interpreting investigation results with a fresh perspective (limiting confirmation bias), or keeping the diagnostic process open despite preliminary investigation results suggesting a particular diagnosis (tackling premature closure).

Challenges of cognitive debiasing

Circumventing cognitive biases through the aforementioned strategies is challenging; inherent psychological defense mechanisms shield our cognitive processes from self-analysis and critique. The “blind spot bias” means we readily assess the impact of cognitive biases on the reasoning of those around us, yet struggle to recognize their impact on our own thinking. Similarly, evaluating outcomes through the gracious lens of hindsight, the so-called “hindsight bias”, impedes our ability to accurately appraise our handling of a given scenario.5

The art of cognitive debiasing takes time to develop, and thus medical educators must sow the seeds in tomorrow’s doctors before they assume a frontline position. Medical educators must equip medical students with the tools for understanding and circumventing cognitive biases, in the hope of nurturing conscientious clinicians who can optimize patient outcomes through improved diagnostic accuracy.

Disclosure

The author reports no conflicts of interest in this work.

References

1. Higgs J, Elstein A. Clinical reasoning in medicine. In: Higgs J, editor. Clinical Reasoning in the Health Professions. Oxford, England: Butterworth-Heinemann Ltd; 1995:49–59.
2. General Medical Council. Tomorrow’s Doctors: Outcomes and Standards for Undergraduate Medical Education. 2009. Available from: http://www.gmc-uk.org/Tomorrow_s_Doctors_1214.pdf_48905759.pdf. Accessed August 12, 2017.
3. Ely JW, Graber ML, Croskerry P. Checklists to reduce diagnostic errors. Acad Med. 2011;86(3):307–313.
4. Tay SW, Ryan P, Ryan CA. Systems 1 and 2 thinking processes and cognitive reflection testing in medical students. Can Med Educ J. 2016;7(2):e97–e103.
5. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003;78(8):775–780.
6. Hershberger PJ, Markert RJ, Part HM, Cohen SM, Finger WW. Understanding and addressing cognitive bias in medical education. Adv Health Sci Educ Theory Pract. 1996;1(3):221–226.
7. Singh H, Giardina TD, Meyer AND, Forjuoh SN, Reis MD, Thomas EJ. Types and origins of diagnostic errors in primary care settings. JAMA Intern Med. 2013;173(6):418–425.
