Adv in Health Sci Educ DOI 10.1007/s10459-014-9518-4 REFLECTIONS

Reexamining our bias against heuristics Kevin McLaughlin • Kevin W. Eva • Geoff R. Norman

Received: 23 May 2014 / Accepted: 23 May 2014
© Springer Science+Business Media Dordrecht 2014

Abstract Using heuristics offers several cognitive advantages, such as increased speed and reduced effort when making decisions, in addition to allowing us to make decisions in situations where missing data do not allow for formal reasoning. But the traditional view of heuristics is that they trade accuracy for efficiency. Here the authors discuss sources of bias in the literature implicating the use of heuristics in diagnostic error and highlight the fact that there are also data suggesting that, under certain circumstances, using heuristics may lead to better decisions than formal analysis. They suggest that diagnostic error is frequently misattributed to the use of heuristics and propose an alternative view whereby content knowledge is the root cause of diagnostic performance and heuristics lie on the causal pathway between knowledge and diagnostic error or success.

A case of "faulty heuristics"

The patient, a 52-year-old Asian male who smoked and had a 15-year history of diabetes and poorly controlled hypertension, arrived in the Emergency Room less than thirty minutes ago. His use of English was limited, but through his daughter he reported that his chest pain came on suddenly, was very severe, and radiated through to his back and down his left arm. His electrocardiogram showed inferior ST segment elevation. The resident, for whom this was the fourth patient with chest pain assessed in the past three hours, diagnosed an acute ST segment elevation

K. McLaughlin (corresponding author), Office of Undergraduate Medical Education, Health Sciences Centre, University of Calgary, 3330 Hospital Drive NW, Calgary, AB T2N 4N1, Canada. e-mail: [email protected]
K. W. Eva, University of British Columbia, Vancouver, BC, Canada
G. R. Norman, McMaster University, Hamilton, ON, Canada


myocardial infarction (STEMI) and promptly arranged for him to be sent to the cath lab. When the patient, accompanied by his daughter, arrived at the cath lab the on-call cardiologist quickly assessed him and arranged for her colleague to perform a confirmatory investigation, after which she consulted the on-call cardiac surgeon for urgent repair of the patient's aortic dissection. On reviewing the case with the resident, the cardiologist felt that the resident had placed too much weight on the patient's vascular risk factors and had not considered alternative conditions that may be associated with chest pain and ST segment elevation. She concluded that the incorrect diagnosis was due to the resident's use of one or more faulty heuristics.

A heuristic is a strategy, applied implicitly or deliberately, that leads us to decisions using only part of the information that might otherwise be available. The result is that decision making is quicker, requires less effort, and can still proceed where missing data do not allow for formal reasoning, such as applying logic or Bayesian probabilities (Gigerenzer and Gaissmaier 2011). We use heuristics whenever we make decisions based on recognition, or if we intentionally limit the amount of data that we process, for example when we take-the-best single piece of data or employ a fast-and-frugal approach (Gigerenzer and Gaissmaier 2011; Martignon and Hoffrage 2002). And we teach our children heuristics by telling them to be wary of strangers, to focus on the pitcher's motion and spin of the ball to predict where the baseball will cross the plate, and to sell high and buy low. Other common heuristics include trade-off (e.g., when we consider quality, cost, and time before embarking on a project (Babu and Suresh 1996; Dawes 1979)), and social heuristics whereby we might imitate-the-successful or accept the wisdom of the crowd (Hertwig and Herzog 2009; Yi et al. 2012).
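As a concrete illustration of one such strategy, take-the-best can be sketched in a few lines of code: cues are consulted in descending order of validity, and the first cue that discriminates between the options decides. The patient attributes, cue functions, and validity values below are hypothetical placeholders, not data from the article.

```python
def take_the_best(option_a, option_b, cues):
    """Choose between two options with the take-the-best heuristic:
    consult cues in descending order of validity and decide on the
    first cue that discriminates; default to option_a if none does.

    `cues` is a list of (cue_function, validity) pairs; each cue
    function maps an option to a comparable value."""
    for cue, _validity in sorted(cues, key=lambda c: c[1], reverse=True):
        a_val, b_val = cue(option_a), cue(option_b)
        if a_val != b_val:  # first discriminating cue decides; rest ignored
            return option_a if a_val > b_val else option_b
    return option_a

# Hypothetical example: which of two chest-pain presentations looks more
# like an acute coronary syndrome, judged on binary cues only.
patient_a = {"st_elevation": 1, "smoker": 1, "age_over_50": 1}
patient_b = {"st_elevation": 0, "smoker": 1, "age_over_50": 1}
cues = [
    (lambda p: p["st_elevation"], 0.9),  # most valid cue, consulted first
    (lambda p: p["age_over_50"], 0.6),
    (lambda p: p["smoker"], 0.55),
]
print(take_the_best(patient_a, patient_b, cues) is patient_a)  # True
```

The point of the sketch is that only one cue is ever processed once it discriminates; the remaining data are deliberately ignored, which is precisely what makes the strategy fast and frugal.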
The traditional view of heuristics is that they trade accuracy for efficiency (Croskerry 2009; Sloman 1996). Consistent with this, when diagnostic errors are picked through there is typically a strong odour of heuristics (Graber et al. 2005), and both the medical and psychology literature tend to emphasize the perils of heuristic use (Croskerry 2013; Graber 2005; Tversky and Kahneman 1974). Indeed, parallel terminology is available that leads us, when armed with the "wisdom" of hindsight, to define heuristics based on their suboptimal outcome. When this happens, heuristics can be construed as biases, rather than biases being treated as one potential outcome of heuristic use. Thus, recognition becomes representativeness restraint, take-the-best becomes anchoring with inadequate adjustment, trade-off becomes base-rate neglect, and the wisdom of the crowd is referred to as group conformity bias or diagnostic momentum (Asch 1955; Croskerry 2003a). But do heuristics really cause diagnostic errors? In this essay we will examine the case for heuristics as a cause of diagnostic error and suggest that while heuristics frequently lie on the causal pathway to diagnostic error they are not the underlying cause. Instead, we propose an alternative view of the relationship between heuristics and diagnostic error in which the root causes of diagnostic error frequently masquerade as "faulty" heuristics and are overlooked due to our tendency to anchor on heuristics as the cause of diagnostic error.

The evidence implicating heuristics in diagnostic error

The primary source of data on the hazards of heuristic use in medicine is the literature on diagnostic error, which largely comprises retrospective series of autopsy or medicolegal cases (Gandhi et al. 2006; Graber 2005; Graber et al. 2005; Kachalia et al. 2007; Kirch and Schafii 1996). This type of study design does not allow us to calculate the odds or risk of


diagnostic error associated with heuristic use, and is prone to both sampling and measurement biases. The most obvious sampling issue is that there is a known outcome, which is always adverse. Knowledge of an outcome influences the evaluation of decisions made (outcome bias or hindsight bias), so any diagnostic decision is more likely to be rated poorly when an adverse event is known to have occurred (Baron and Hershey 1988; Yopchick and Kim 2012). Issues related to measurement include exposure suspicion bias, whereby knowledge of the outcome increases the intensity of the search for evidence of exposure, and recall bias, both of which tend to exaggerate the association between exposure and outcome (Sackett 1979). In a recent prospective study involving over 2,000 diagnostic decisions by pathologists and pathology trainees, Crowley and colleagues concluded that approximately half of the cases in which a diagnostic error occurred were associated with one or more of the biases that they screened for (Crowley et al. 2013). What is still unclear, however, is whether these same heuristics are used more or less frequently in cases where a diagnostic error did not occur.

Psychology experiments provide additional data on heuristics and have clearly demonstrated that heuristic use is error prone when compared to statistical prediction models (Tversky and Kahneman 1974). However, in many of these studies the specific goal was to create situations that would increase the likelihood of errors associated with heuristic use, precisely for the purpose of illustrating their existence (Lopez 1992). Participants were assumed to be using heuristics whenever they made incorrect choices or probability estimates under carefully and creatively designed conditions. While these psychology experiments, and similar experiments in medical education (Mamede et al. 2010, 2014; Schmidt et al. 2014), were very effective in enabling the detection of heuristics and biases, they do not inform us directly on the outcomes of using heuristics during naturalistic decision making, and most courts would exclude such data on the grounds of entrapment (Lord 1998; Todd and Gigerenzer 2001).
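The statistical point about retrospective error series can be made with a toy calculation using entirely invented counts (not data from any study cited here): if heuristics are used just as often in correct diagnoses as in erroneous ones, the odds ratio for error given heuristic use is 1, yet a series restricted to error cases would still report that most errors involved a heuristic.

```python
# Toy 2x2 table with invented counts: heuristic use among diagnoses
# that turned out to be erroneous vs. correct.
errors_with_heuristic, errors_without_heuristic = 90, 10
correct_with_heuristic, correct_without_heuristic = 900, 100

# A retrospective error series sees only the first row, and would note
# that 90% of errors involved a heuristic. The odds ratio, however,
# requires the row of correct diagnoses as well:
odds_in_errors = errors_with_heuristic / errors_without_heuristic      # 9.0
odds_in_correct = correct_with_heuristic / correct_without_heuristic   # 9.0
odds_ratio = odds_in_errors / odds_in_correct

print(odds_ratio)  # 1.0: heuristic use is unassociated with error
```

With these numbers the heuristic looks incriminating in the error series (present in 90% of errors) even though it is equally present among successes, which is exactly why the missing "no error" denominator matters.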

The case for the defence

Countering the data implicating heuristic use in erroneous decision making are studies suggesting that in some situations heuristic use is associated with better decision making. Shanteau observed the decision making of experts and non-experts in a variety of domains (including radiology) and concluded that experts do not process more data than non-experts (Shanteau 1992). Instead, experts typically base their decisions on fewer pieces of data that are more discriminating. The findings from a series of studies performed by Dijksterhuis and colleagues suggest that while conscious deliberation is associated with good choices when presented with simple decisions, deliberation-without-attention may lead to better choices for complex decisions (Dijksterhuis et al. 2006). And Dieckmann and Rieskamp found that when a large amount of data were presented to university students who were then asked to choose between potential oil drilling sites, those instructed to take-the-best piece of data performed better than those who integrated different sources of data (Dieckmann and Rieskamp 2007). Several groups have also demonstrated a less-is-more effect when making medical diagnoses (Coderre et al. 2003; Green and Mehr 1997; Kulatunga-Moruzi et al. 2004; Reyna and Lloyd 2006; Sherbino et al. 2012). The cognitive advantage of heuristics in these studies may be explained by the fact that working memory has a restricted capacity, and information processing strategies that respect this limitation are less likely to lead to cognitive overload (Cherubini and Mazzocco 2004).
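The kind of less-is-more strategy examined by Green and Mehr can be sketched as a short fast-and-frugal decision tree: each node consults a single cue and either exits with a decision or hands off to the next cue, with no weighting or integration of evidence. The cue names below are simplified placeholders in the spirit of their coronary care unit rule, not the published decision rule itself.

```python
def admit_to_ccu(st_segment_change: bool,
                 chest_pain_chief_complaint: bool,
                 any_additional_risk_cue: bool) -> str:
    """Simplified sketch of a fast-and-frugal tree in the spirit of
    Green and Mehr (1997). One cue is consulted at a time; the first
    cue that can decide, decides. Cues are illustrative placeholders."""
    if st_segment_change:                 # first cue alone can rule in
        return "coronary care unit"
    if not chest_pain_chief_complaint:    # second cue alone can rule out
        return "regular nursing bed"
    # final cue: any one of a small set of additional risk factors
    if any_additional_risk_cue:
        return "coronary care unit"
    return "regular nursing bed"

print(admit_to_ccu(True, False, False))   # coronary care unit
print(admit_to_ccu(False, False, True))   # regular nursing bed
```

Each branch terminates after at most three yes/no questions, which is what keeps the strategy within the limits of working memory that the paragraph above describes.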


The merits of heuristics are also touted by those who study naturalistic decision making in experts. In many fields of expertise the ability of some individuals to consistently make superior decisions appears to be explained by their skill in rapidly processing data and recognizing patterns that are not identified by others (Kahneman and Klein 2009; Klein 1998). While the naturalistic decision making literature is prone to the same biases as the diagnostic error literature (which in this case may lead to successful decision making being falsely attributed to heuristic use), we can at least infer from these studies that the use of heuristics does not necessarily result in poorer decisions, and that the relationship between heuristics and diagnostic error cannot, therefore, be considered causal (Bradford-Hill 1965). In fact, the safest conclusion to draw is that despite years of study and the generation of many important insights, we do not know the extent to which heuristics impede or facilitate accurate decision making.

Anchoring on heuristics: a case of fundamental attribution error?

If the association between heuristic use and diagnostic error is non-causal, why are heuristics implicated so frequently in diagnostic error? The likely explanation is that, given their ubiquitous use, when we look backwards along the causal pathway of diagnostic error heuristics loom large as a proximate cause. Indeed, if we looked backwards along the causal pathway of diagnostic success or expertise we would expect to find the same thing. Our case of chest pain illustrates this, as it is likely that both resident and cardiologist began with the same recognition-based heuristic but ended with different diagnoses. It is clearly incongruous that the same heuristic can cause both diagnostic error and success, which suggests that the effect of heuristics on diagnostic performance is conditional upon other variables. These variables are the root cause of diagnostic error (or success), but we can only identify them if we look beyond heuristics.

Knowledge: the root cause of diagnostic performance

The study of medical knowledge is complicated by the fact that it exists in several forms, such as the rules and hierarchical networks of semantic memory, the instance scripts of episodic memory, and the illness scripts of implicit memory, each of which can be applied to a clinical problem in isolation or simultaneously using a variety of conscious and unconscious processes (Croskerry 2009; Kahneman 2011; Reyna and Lloyd 2006; Sherry and Schacter 1987; Sloman 1996; Smith and DeCoster 2000). Heuristics, for example, can be employed intentionally (as in the case of clinical decision rules) or subconsciously, and may contain logical rules, implicit associations, or both. Early studies in medical education aimed at identifying the source of the superior diagnostic ability of experts concluded that variance in diagnostic performance was explained by differences in content knowledge rather than process (Barrows et al. 1982; Elstein et al. 1978). Knowledge disparities also explain the low correlation of performance across different problems for any individual, and why physicians tend to be more successful when diagnosing in content areas where their knowledge base is replete (Perkins and Salomon 1989). In short, the application of knowledge cannot be separated from knowledge itself, and when a heuristic or any other cognitive process is applied with extensive and integrated knowledge the outcome is likely to be better than when deficient knowledge is applied.


Confounders and effect modifiers

Although underlying knowledge is the major determinant of the impact of heuristics on diagnostic performance, the effect on performance may be confounded or modified by other variables, including task difficulty, meta-level reasoning, and luck. Figure 1 illustrates our proposed relationship between these variables. Task difficulty, as alluded to in Fig. 1, is primarily determined by the knowledge that one brings to bear on the task, but is also influenced by the underlying condition (e.g., common conditions are usually easier to diagnose than rare ones), the typicality of the presentation, and how well data are communicated. Furthermore, contextual factors can contribute to task difficulty, such as challenges created by the healthcare system (e.g., lack of resources, equipment failure, and poor coordination of care), the performance of other team members, and conditions that impair cognition, such as fatigue, time constraints, age, distraction, and illness (Graber et al. 2005). In the act of diagnosing, other information processing strategies frequently work alongside heuristics and can modify their effect on diagnostic performance (Croskerry 2009; Mamede et al. 2010). Klein described the process of recognition-primed decision-making (RPD), whereby experts can quickly recognize a potential solution to a problem, but before embarking upon this course of action they imagine the consequences of their heuristic-based decision and then revise it if they consider the consequences suboptimal (Klein 1998). Others refer to this type of meta-level reasoning as mindfulness or metacognition (Epstein 1999; Croskerry 2003b). By contrast, overconfidence is characterized by a failure to process additional information, seeking out only confirmatory information, and/or ignoring contradictory information (Berner and Graber 2008; Croskerry and Norman 2008).
Again, however, meta-level reasoning is not invariably helpful, nor overconfidence consistently harmful, as this type of reasoning is simply another way to apply stored knowledge to a problem, and its effect on diagnostic performance depends upon the quality of the knowledge applied (Fig. 1). In clinical practice all cases may be affected by an element of "luck", so the right outcome can sometimes be generated by the wrong actions and the wrong outcome can be derived from the right actions. For example, we frequently perform "routine" history taking, physical examination, and investigations, and the results of these may point us towards or away from the true diagnosis. A patient's history and physical examination findings can change over time, and we may have the misfortune of examining a patient when abnormal physical findings are no longer present. Or, alternatively, we may astound our colleagues with our clinical acumen simply by having the good fortune of having seen a similar presentation of a rare condition recently.

Readjusting our view of heuristics

If we misunderstand the relationship between heuristics and diagnostic performance we run the risk of embarking upon misguided strategies to reduce diagnostic error. Heuristics are neither faulty nor faultless; they are simply one way of applying stored knowledge to a novel problem and, therefore, lie on the causal pathway between knowledge and diagnostic performance. Rather than blaming or avoiding heuristics we should simply acknowledge their role in diagnosing. Heuristics allow us to begin the diagnostic process despite missing data and enable us to diagnose complex cases without inducing cognitive overload. The ability to use heuristics to good effect is associated with expertise (Shanteau 1992; Norman


[Fig. 1 The place of heuristics on the causal pathway from knowledge to diagnostic performance (elements shown: knowledge, heuristics, meta-level reasoning, task difficulty, luck, diagnostic success, diagnostic error)]

et al. 2007), so suppressing their use may ultimately be harmful if this retards the development of expertise. In every area of study there are strategies and techniques that are difficult to master, and performance may initially decline when applying these. Yet we encourage learners to cycle without training wheels and to type without looking at the keyboard because their performance will ultimately be enhanced if they can incorporate these strategies. Our approach to heuristics should be the same. This, of course, does not imply that we should encourage our learners to stick lazily with the first diagnosis that comes to mind, or to make diagnoses on the basis of isolated clinical findings. Instead, we should continue to try to improve the knowledge of our learners by teaching them how to use heuristics and training them to be mindful of their limitations and to calibrate heuristic decision-making (Norman et al. 2007; Norman and Eva 2010).

Conclusion

The utility of heuristics in diagnosing is a divisive issue because heuristic use has been associated with both diagnostic error and expertise. Finding an influence of heuristics when diagnostic performance fails does not imply causation. Heuristics may be on the causal pathway to diagnostic error, but they are not the root cause. Instead, the result of using heuristics is largely dependent upon knowledge, with confounding from task difficulty, meta-level reasoning, and luck. As heuristics offer cognitive advantages, and the development of expertise might require us to master them, instead of shunning heuristics we should focus on creating the high-validity training environments and learning opportunities that allow our learners to become skilled in their application (Ericsson 2006; Hogarth 2001; Kahneman and Klein 2009).

References

Asch, S. E. (1955). Opinions and social pressure. Scientific American, 193, 31–35.
Babu, A. J. G., & Suresh, N. (1996). Project management with time, cost, and quality considerations. European Journal of Operational Research, 88, 320–327.
Baron, J., & Hershey, J. C. (1988). Outcome bias in decision evaluation. Journal of Personality and Social Psychology, 54, 569–579.
Barrows, H. S., Norman, G. R., Neufeld, V. R., & Feightner, J. W. (1982). The clinical reasoning of randomly selected physicians in general medical practice. Clinical and Investigative Medicine, 5, 49–55.
Berner, E. S., & Graber, M. L. (2008). Overconfidence as a cause of diagnostic error in medicine. The American Journal of Medicine, 121(Suppl 5), S2–S23.
Bradford-Hill, A. (1965). The environment and disease: Association or causation? Proceedings of the Royal Society of Medicine, 58, 295–300.
Cherubini, P., & Mazzocco, A. (2004). From models to rules: Mechanization of reasoning as a way to cope with cognitive overloading in combinatorial problems. Acta Psychologica, 116, 223–243.
Coderre, S., Mandin, H., Harasym, P. H., & Fick, G. H. (2003). Diagnostic reasoning strategies and diagnostic success. Medical Education, 37, 695–703.
Croskerry, P. (2003a). The importance of cognitive errors in diagnosis and strategies to minimize them. Academic Medicine, 78, 775–780.
Croskerry, P. (2003b). Cognitive forcing strategies in clinical decisionmaking. Annals of Emergency Medicine, 41, 110–120.
Croskerry, P. (2009). A universal model of diagnostic reasoning. Academic Medicine, 84, 1022–1028.
Croskerry, P. (2013). From mindless to mindful practice—cognitive bias and clinical decision making. The New England Journal of Medicine, 368, 2445–2448.
Croskerry, P., & Norman, G. (2008). Overconfidence in clinical decision making. The American Journal of Medicine, 121(Suppl 5), S24–S29.
Crowley, R. S., Legowski, E., Medvedeva, O., Reitmeyer, K., Tseytlin, E., Castine, M., et al. (2013). Automated detection of heuristics and biases among pathologists in a computer-based system. Advances in Health Science Education, 18, 343–363.
Dawes, R. M. (1979). The robust beauty of improper linear models in decision making. American Psychologist, 34, 571–582.
Dieckmann, A., & Rieskamp, J. (2007). The influence of information redundancy on probabilistic inferences. Memory and Cognition, 35, 1801–1813.
Dijksterhuis, A., Bos, M. W., Nordgren, L. F., & van Baaren, R. B. (2006). On making the right choice: The deliberation-without-attention effect. Science, 311, 1005–1007.
Elstein, A. S., Shulman, L. S., & Sprafka, S. A. (1978). Medical problem solving: An analysis of clinical reasoning. Cambridge, MA: Harvard University Press.
Epstein, R. M. (1999). Mindful practice. The Journal of the American Medical Association, 282, 833–839.
Ericsson, K. A. (2006). The influence of experience and deliberate practice on the development of superior expert performance. In K. A. Ericsson, N. Charness, R. R. Hoffman, & P. J. Feltovich (Eds.), The Cambridge handbook of expertise and expert performance. New York: Cambridge University Press.
Gandhi, T. K., Kachalia, A., Thomas, E. J., Puopolo, A. L., Yoon, C., Brennan, T. A., et al. (2006). Missed and delayed diagnoses in the ambulatory setting: A study of closed malpractice claims. Annals of Internal Medicine, 145, 488–496.
Gigerenzer, G., & Gaissmaier, W. (2011). Heuristic decision making. Annual Review of Psychology, 62, 451–482.
Graber, M. (2005). Diagnostic errors in medicine: A case of neglect. Joint Commission Journal on Quality and Patient Safety, 31, 106–113.
Graber, M. L., Franklin, N., & Gordon, R. (2005). Diagnostic error in internal medicine. Archives of Internal Medicine, 165, 1493–1499.
Green, L., & Mehr, D. R. (1997). What alters physicians' decisions to admit to the coronary care unit? The Journal of Family Practice, 45, 219–226.
Hertwig, R., & Herzog, S. M. (2009). Fast and frugal heuristics: Tools of social rationality. Social Cognition, 27, 661–698.
Hogarth, R. M. (2001). Educating intuition. Chicago, IL: University of Chicago Press.
Kachalia, A., Gandhi, T. K., Puopolo, A. L., Yoon, C., Thomas, E. J., Griffey, R., et al. (2007). Missed and delayed diagnoses in the emergency department: A study of closed malpractice claims from 4 liability insurers. Annals of Emergency Medicine, 49, 196–205.
Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus & Giroux.
Kahneman, D., & Klein, G. (2009). Conditions for intuitive expertise: A failure to disagree. American Psychologist, 64, 515–526.
Kirch, W., & Schafii, C. (1996). Misdiagnosis at a university hospital in 4 medical eras. Medicine, 75, 29–40.
Klein, G. (1998). Sources of power: How people make decisions. Cambridge, MA: The MIT Press.
Kulatunga-Moruzi, C., Brooks, L. R., & Norman, G. R. (2004). Using comprehensive feature lists to bias medical diagnosis. Journal of Experimental Psychology: Learning, Memory, and Cognition, 30, 563–572.
Lopez, L. L. (1992). Three misleading assumptions in the customary rhetoric of the bias literature. Theory and Psychology, 2, 231–236.
Lord, K. M. (1998). Entrapment and due process: Moving toward a dual system of defenses. Florida State University Law Review, 25, 463–518.
Mamede, S., van Gog, T., van den Berge, K., Rikers, R. M., van Saase, J. L., van Guldener, C., et al. (2010). Effect of availability bias and reflective reasoning on diagnostic accuracy among internal medicine residents. The Journal of the American Medical Association, 304, 1198–1203.
Mamede, S., van Gog, T., van den Berge, K., van Saase, J. L., & Schmidt, H. G. (2014). Why do doctors make mistakes? A study of the role of salient distracting clinical features. Academic Medicine, 89, 114–120.
Martignon, L., & Hoffrage, U. (2002). Fast, frugal, and fit: Simple heuristics for paired comparisons. Theory and Decision, 52, 29–71.
Norman, G. R., & Eva, K. W. (2010). Diagnostic error and clinical reasoning. Medical Education, 44, 94–100.
Norman, G., Young, M., & Brooks, L. (2007). Non-analytical models of clinical reasoning: The role of experience. Medical Education, 41, 1140–1145.
Perkins, D. L., & Salomon, G. (1989). Are cognitive skills context-bound? Educational Researcher, 18, 6–25.
Reyna, V. F., & Lloyd, F. J. (2006). Physician decision making and cardiac risk: Effects of knowledge, risk perception, risk tolerance, and fuzzy processing. Journal of Experimental Psychology: Applied, 12, 179–195.
Sackett, D. L. (1979). Bias in analytic research. Journal of Chronic Diseases, 32, 51–63.
Schmidt, H. G., Mamede, S., van den Berge, K., van Gog, T., van Saase, J. L., & Rikers, R. M. (2014). Exposure to media information about a disease can cause doctors to misdiagnose similar-looking clinical cases. Academic Medicine, 89, 285–291.
Shanteau, J. (1992). How much information does an expert use? Is it relevant? Acta Psychologica, 81, 75–86.
Sherbino, J., Dore, K. L., Wood, T. J., Young, M. E., Gaissmaier, W., Kreuger, S., et al. (2012). The relationship between response time and diagnostic accuracy. Academic Medicine, 87, 785–791.
Sherry, D. F., & Schacter, D. L. (1987). The evolution of multiple memory systems. Psychological Review, 94, 439–454.
Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3–22.
Smith, E. R., & DeCoster, J. (2000). Dual-process models in social and cognitive psychology: Conceptual integration and links to underlying memory systems. Personality and Social Psychology Review, 4, 108–131.
Todd, P., & Gigerenzer, G. (2001). Putting naturalistic decision making into the adaptive toolbox. Journal of Behavioral Decision Making, 14, 381–383.
Tversky, A., & Kahneman, D. (1974). Judgement under uncertainty: Heuristics and biases. Science, 185, 1124–1131.
Yi, S. K., Steyvers, M., Lee, M. D., & Dry, M. J. (2012). The wisdom of the crowd in combinatorial problems. Cognitive Science, 36, 452–470.
Yopchick, J. E., & Kim, N. S. (2012). Hindsight bias and causal reasoning: A minimalist approach. Cognitive Processing, 13, 63–72.
