Special Section: Open Forum

The Ethical Imperative to Think about Thinking: Diagnostics, Metacognition, and Medical Professionalism

MEREDITH STARK and JOSEPH J. FINS

Cambridge Quarterly of Healthcare Ethics (2014), 23, 386–396. doi:10.1017/S0963180114000061. © Cambridge University Press 2014.

Abstract: While the medical ethics literature has well explored the harm to patients, families, and the integrity of the profession in failing to disclose medical errors once they occur, less often addressed are the moral and professional obligations to take all available steps to prevent errors and harm in the first instance. As an expanding body of scholarship further elucidates the causes of medical error, including the considerable extent to which medical errors, particularly in diagnostics, may be attributable to cognitive sources, insufficient progress in systematically evaluating and implementing suggested strategies for improving critical thinking skills and medical judgment is of mounting concern. Continued failure to address pervasive thinking errors in medical decisionmaking imperils patient safety and professionalism, as well as beneficence and nonmaleficence, fairness and justice. We maintain that self-reflective and metacognitive refinement of critical thinking should not be construed as optional but rather should be considered an integral part of medical education, a codified tenet of professionalism, and by extension, a moral and professional duty.

Keywords: medical decision making; medical ethics; professionalism; medical education; medical error; diagnostic error; patient safety; cognition; judgment; metacognition

Everyone complains of his memory, and no one complains of his judgment. François de la Rochefoucauld1

Introduction: Inattention to Thinking Errors in Medicine

While the medical ethics literature has well explored the harm to patients, families, and the integrity of the profession in failing to disclose medical errors once they occur,2,3 less often addressed are the moral and professional obligations to take all available steps to prevent errors and harm in the first instance. We think about how to fairly and responsibly handle mistakes after the fact, but do we think as extensively about the origins of these errors, particularly when they may result from potentially correctable flaws in physician thinking and judgment? As an expanding body of scholarship further elucidates the causes of medical error, including the considerable extent to which medical errors, particularly in diagnostics, may be attributable to cognitive sources,4 insufficient progress in systematically evaluating and implementing suggested strategies for improving critical reasoning is of mounting concern.5 We maintain that self-reflective refinement of critical thinking should not be construed as optional but rather should be considered an integral part of medical education, a codified tenet of professionalism, and by extension, a moral and professional duty.6

Given what’s at stake, it seems implausible that there could be ground yet to cover on the vital road of preventing grave medical errors and resultant harm to patients. Regrettably, such may be the case. While much of the scholarship arguing for
attention to cognitive errors in diagnostics has underscored the gravity of the charge by invoking the 1999 Institute of Medicine (IOM) report on medical errors,7 there remains sparse evidence of change in the more than a decade since.8 Notwithstanding that several years have elapsed since Jerome Groopman drew widespread attention to cognitive errors in medicine in his 2007 bestseller, How Doctors Think, his appeals to systematically incorporate concerted thinking about thinking in medicine remain largely unheeded.9 Sadly, this lack of progress seems not to be a case of no stone unturned. Instead, plausible avenues for change have been substantially overlooked, many emanating from a pivotal line of scholarship applying tools and insights from cognitive psychology toward understanding and improving critical thinking in medicine.10,11 For some years now, these scholarly examinations have concluded with pleas for specific changes to education and practice to improve clinical reasoning and reduce thinking errors, particularly in diagnostics.12,13 As the years pass, and an impressive body of literature accumulates, the continued lack of progress in systematically assessing and applying recommended strategies for addressing cognitively sourced diagnostic errors both threatens patient health and challenges ethical ideals.

Curiously, although a robust patient safety movement has successfully addressed other forms of medical error in recent years, attention to diagnostic errors has lagged. In a 2010 article in Health Affairs, physician Robert Wachter noted that although medication error was mentioned 70 times in the aforementioned 1999 IOM report, To Err Is Human: Building a Safer Health System,14 diagnostic errors were mentioned only twice, particularly puzzling given the higher incidence and greater harm attributable to diagnostic error.15 As Wachter notes, the issue of “diagnostic errors has been strangely absent from the flurry of patient safety activity over the last decade. This absence is particularly noteworthy given the frequency of these errors.”16 A 2009 JAMA commentary by David Newman-Toker and Peter Pronovost similarly bemoaned the patient safety movement’s narrow focus “on translating evidence into practice, mitigating hazards from therapies, and improving culture and communication. Diagnostic errors have received relatively little attention . . . [though they remain] an important source of preventable harm.”17

Diagnostic errors may be preventable, in large part, because they so often derive from systems and cognitive sources. A seminal 2005 analysis of diagnostic errors by Mark Graber and colleagues, published in the Archives of Internal Medicine, found that “system-related factors contributed to the diagnostic error in 65% of the cases and cognitive factors in 74% . . . [uncovering] dominant problems that should be targeted for additional research” toward early error reduction.18 The lag or lack of attention is even more puzzling given the toll—financial and human—attributable to diagnostic error.
A 2013 study retrospectively analyzing 350,706 paid claims from the National Practitioner Data Bank over a 25-year period found that diagnostic errors were the leading claim type (accounting for 28.6% of claims), led the various claim types in terms of costs (accounting for 35.2% of total payments), and were the foremost cause of claims-related death and disability.19 The study authors conclude, “Among malpractice claims, diagnostic errors appear to be the most common, most costly and most dangerous of medical mistakes. . . . Healthcare stakeholders should consider diagnostic safety a critical health policy issue.”20

Beyond arguments predicated on patient safety (which should be sufficiently compelling) and aggregate monetary costs (which should be sufficiently motivating),21 delaying or failing to engage in efforts that carry the potential to reduce a leading
cause of error-related morbidity and mortality seems ethically problematic as well. This latter objection provides the impetus for the present examination. As a foundation for this analysis, we begin with a very brief overview of the theoretical underpinnings of this work and the application of tools and insights from the cognitive sciences toward understanding medical decisionmaking and mitigating error. We then outline the ethical ramifications of sustained inattention to this issue and argue for an ethical mandate for applying the latest of what is known about critical thinking, judgment, and reasoning toward addressing errors and improving medical decisions and care. Our examination concludes with brief attention to three lasting hurdles, offered with an eye toward facilitating action and overdue progress in addressing preventable thinking errors in medicine.

Cognitive Science and Thinking Errors in Medicine

In his 2011 bestseller, Thinking, Fast and Slow, psychologist and 2002 Nobel Prize winner in economics Daniel Kahneman delves deeply into the prevailing dual-process approach to thinking and reasoning.22 Building on groundbreaking work in cognitive heuristics and biases conducted with his longtime colleague, Amos Tversky, four decades earlier,23 Kahneman explains, for a lay audience, the appeal and science behind this dual-process explanation of fast and slow thinking, also conceptualized as system 1 and system 2, or intuitive and analytic, thinking.24,25 Although the model is considerably more nuanced than its binary schema might suggest, in simplest terms, slow or system 2 thinking (solving a complex math problem or trying to navigate a turn across heavy traffic) is deliberative, systematic, and resource-heavy, whereas fast or system 1 thinking (identifying colors or facial expressions or performing simple math) is frugal and automatic, marked by rapid application of heuristics to complete patterns and achieve associative coherence.26
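The dual-process account is a psychological model, not an algorithm, but its control structure, and the metacognitive override that scholars in this literature advocate, can be caricatured in a few lines of code. The sketch below is purely our illustration; the pattern table, clinical findings, and confidence threshold are invented, and no cited author proposes this implementation.

```python
# Illustrative toy only: a caricature of the dual-process control structure,
# with a metacognitive confidence check deciding when the analytic mode
# overrides the intuitive one. All patterns, cases, and numbers are invented.

FAST_PATTERNS = {
    # system 1: a cached illness script -> (immediate diagnosis, confidence)
    ("chest pain", "diaphoresis"): ("acute coronary syndrome", 0.90),
    ("fever", "headache"): ("viral syndrome", 0.55),
}

def system1(findings):
    """Fast, frugal pattern match: return (diagnosis, confidence) or None."""
    return FAST_PATTERNS.get(tuple(sorted(findings)))

def system2(findings):
    """Slow, deliberate stand-in: a full analytic workup of the differential."""
    return f"analytic workup of the differential for {sorted(findings)}"

def diagnose(findings, override_threshold=0.75):
    fast = system1(findings)
    if fast is None:                      # no pattern fires: must deliberate
        return system2(findings)
    diagnosis, confidence = fast
    if confidence < override_threshold:   # metacognitive check: doubt
        return system2(findings)          # triggers the analytic override
    return diagnosis                      # intuition stands unexamined

print(diagnose(["chest pain", "diaphoresis"]))  # high confidence: system 1 answer
print(diagnose(["headache", "fever"]))          # low confidence: system 2 override
```

The point of the caricature is the shape of the control flow: the analytic mode is expensive and engaged only selectively, which is exactly why a poorly calibrated confidence estimate lets intuitive errors pass through unexamined.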
Over the last decade, scholars have applied these theories and tools, particularly the dual-process model and insights on prevalent biases and heuristics in human problem solving, to the important work of understanding medical decisionmaking and, by extension, medical errors.27,28,29 Following up on an earlier paper examining an array of heuristics and biases in conjunction with diagnostic errors in emergency department physicians,30 in 2008 physician Pat Croskerry, a leading scholar in the field, wrote with his colleague Geoff Norman, in the American Journal of Medicine, of the hazards to diagnostic thinking arising from a common bias toward overconfidence.31 The authors conclude that this widespread bias substantially contributes to thinking errors in medicine, and they advise that it “should be possible to improve clinical reasoning through specific training and thus reduce the prevalence of biases such as overconfidence.”32

In 2009 Croskerry mapped out a comprehensive “universal model of diagnostic reasoning” that charted system 1 and 2 processing, in reference to diagnostic thinking, further integrating into the model cognitive and metacognitive constructs, including biases and heuristics, pattern recognition, monitoring, and calibration of accuracy.33 While acknowledging the need for further substantiating research, Croskerry reinforces that “clinicians in training and those already in practice need a comprehensive approach to clinical decision making that facilitates their understanding of this complex process and allows them to gain insight and understanding into their own decision making.”34 More recent work in modeling medical decisions further supports the application of the dual-process theory,
and cognitive and metacognitive constructs, toward understanding and improving decisionmaking.35,36

With attention still limited, in spite of a growing publication record in leading medical journals, in June 2013, Croskerry authored a perspective piece in the New England Journal of Medicine urging a shift in the medical community, namely, as expressed by the article’s title, “From Mindless to Mindful Practice—Cognitive Bias and Clinical Decision Making.”37 Predicating his argument on the dual modes of intuitive and analytic thinking, Croskerry describes two case examples of diagnostic failures and prescriptively encourages training in critical thinking, including concerted efforts to think about how we think (in psychological parlance, metacognition38), toward fostering “judicious interventions by the analytic mode when needed—specifically, in its capacity to override the intuitive mode” and further toward the development of “effective cognitive debiasing strategies in medicine.”39

With respect to plausible avenues for corrective action, in addition to debiasing strategies,40,41 recommendations for addressing thinking errors in medicine include the following: training in cognitive aspects of decisionmaking and critical thinking skills throughout the stages of medical education;42,43 training in, and vital fiats to include, metacognitive constructs—among them, reflective thinking, monitoring, and control functions—for assessing and continually refining one’s own thinking;44,45,46,47,48 and, at the systems level, recommendations for structural changes to augment feedback so as to facilitate error analysis,49,50,51 alongside procedural changes at key decision points, ranging from complex computerized decision supports to simple checklists.52,53,54,55

Yet, even with scholars effectively pounding the table with alarming statistics about the harm attributable to thinking errors in diagnostics, attention to this line of inquiry has lagged. Without a commitment to researching, evaluating, and thereafter implementing promising strategies on the part of those who set agendas, curricula, budgets, and research priorities, the path to meaningful change remains considerably obstructed. Frustratingly, for individual physicians and educators motivated by these cognitive science findings to improve diagnostic thinking, progress is impeded by the lack of rigorous research as to how diagnostic decisionmaking transpires, and as to which of the suggested strategies might best counteract the relevant heuristics or flaws. As Robert Wachter concludes, we continue to lack solid evidence as to whether “proposed solutions to diagnostic error work, partly because they have been so little studied.”56

Inaction as Ethically Problematic

In a 2009 address to the American Academy of Arts and Sciences, reprinted in 2010, Jerome Groopman said, “I believe the time has come to incorporate cognitive science into the education of medical students and physicians. We need to know how we think and why we often, too often, think incorrectly.”57 In addition to patent risks to patient safety, we believe that lackluster attention to critical thinking skills and thinking errors in medicine increasingly carries ethical implications as well.

With respect to understanding the ethical ramifications of continued inaction, at the outset, we must acknowledge not only a shared responsibility to disclose
errors after the fact but also a further obligation to reflect on these errors and take any and all available steps to prevent such errors in the future. From a principles point of view,58 a situation in which plausible avenues for remedying mistakes remain untested, as grave and preventable harm to patients knowingly continues, evokes arrant concerns of nonmaleficence. Whether this frame of inaction in the face of viable remedies is an accurate depiction of the present (or an imminent) state of affairs may be up for debate. At the very least, it will be increasingly difficult for defenders of the status quo to claim that every step is being taken to avoid thinking errors in medicine, particularly in diagnostics, should the recommendations of a decade-long line of scholarship continue to be largely ignored. Naturally, as obligations to patients extend considerably beyond mere avoidance of harm, also entangled here are concerns arising from ethical mandates of beneficence, and how this inattention likewise compromises good patient care.

Further intertwined are ethical arguments arising from medical professionalism and duties to colleagues, the community, and the honor of the profession. Dating back to the classical version of the Hippocratic Oath, the medical profession has upheld commitments beyond good patient care and “keeping patients safe from harm,” to include the avoidance of injustice, the promotion of trust in the doctor-patient relationship, and the espousal of duties to the community and the profession.59 In this case, patent moral obligations to patients are joined by duties to the art and craft of medicine, to current and future generations of physicians, to the profession, and to society. Neglecting to fully explore viable avenues for remedying a leading source of medical error seemingly runs contrary to the central tenets of long-established professional codes.

In enumerating the ethical pitfalls of continuing on the current path, it is important to consider issues of justice and fairness as well. Along with the patent unfairness of missing opportunities to prevent errors and grave associated harm, there may further be justice issues in the distribution of error and associated harms. Although supporting data on this front are scant, it seems quite possible that the burdens of medical error (and likely limitations on avenues of recourse for addressing errors after the fact) may disproportionately fall to vulnerable groups. The distribution of cognitive-based errors, particularly those rooted in the employment of rapid heuristics, biases, and assumptions, may thus serve to create new or to exacerbate existing disparities in care. Given that physicians have espoused duties to combating injustice, this duty surely must extend to remedying even inadvertent iatrogenic injustices.

The late physician and ethicist Ed Pellegrino, in discussing the limits of the Hippocratic Oath, emphasized the social responsibility of the physician, advancing that “medicine, which touches on the most human problems of both the individual and society, cannot serve man without attending to both his personal and communal needs.”60 Lamenting the prevailing narrow focus, in a speech delivered a decade earlier, Pellegrino began, “Historically and conceptually we have been accustomed to think of medical ethics in terms of the obligations of individual physicians to individual patients. We have largely neglected the obligations of the profession as a moral community . . .
some of the issues that vex us most are resolvable only through the use of our collective moral power.”61 Underscoring the force of this expanded view, Pellegrino concludes, “Every one of the things that disturb the conscientious physician—in his colleagues or in today’s social milieu—can be effectively opposed if we are faithful to the central aim of medicine:
the care and cure of the sick. . . . There is enormous moral power in this position but we have not used it.”62 When appeals go out in 201363 for attention to systematic errors that look an awful lot like those of decades past,64,65 this continued privation may be seen to compromise the “social ethics”66 of medicine as well.

Although the possible ethical pitfalls stemming from inaction may be rather plain, considerably less clear are the whys behind this apparent lapse. It remains unknown whether there exists a pervasive underestimation of the extent to which diagnostic errors are attributable to thinking errors, or whether the problem is appreciated but plausible avenues for amelioration (cognitive and metacognitive training) have been dismissed as soft or optional or outside the scope of professional responsibility. Similarly, it is unclear whether there is simply a lack of awareness among stakeholders of this rich body of scholarship and the strategies for improving thinking (unlikely, given widespread publication in medical journals over the last decade); whether the methodological complexities of assessing decisionmaking and proposed interventions have stymied progress; or whether the work of introspection, of acknowledging the possibility of error in one’s own thinking, is thought too bland or cumbersome, especially given the comparative ease of employing the latest technology.67 Whether the entangling of malpractice claims68 with thinking errors in diagnostics may pose complex conflicts of interest that may also serve to impede attention is likewise thoroughly unknown. Far more transparent, however, is the ethical upshot of continued avoidance: the extent to which sustained inertia runs contrary to principles of beneficence and nonmaleficence, and to themes of fairness and justice.

Whatever the reason for the lag, the present situation seems untenable, and inconsistent with prevailing notions of ethics and medical professionalism. The idea that individuals may continue to be harmed, and lives lost, because avenues for fixing fundamental thinking errors are, for whatever reason, not prioritized, to the extent that they are even considered at all, seems fundamentally unjust. Taking liberties with the transitive property, if a leading cause of error-related morbidity and mortality is diagnostic error,69 and if diagnostic errors are, in large part, attributable to cognitive errors,70 then some significant fraction of error-induced death and disability is likely attributable to thinking errors in medicine. If these constituent facts were not widely disseminated, the ethical calculus might be different. But as they are, as the relevant research has appeared in leading medical journals for years, continued inattention is increasingly problematic, with the consequences of inaction compounded, in large part, because we now have plausible remedies.
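To see the shape of that transitive argument in numbers, consider a rough back-of-envelope illustration of our own (not a calculation the cited authors perform, and one that loosely carries a rate from one data set over to another): diagnostic errors account for roughly 28.6% of paid malpractice claims,69 and cognitive factors contribute to roughly 74% of diagnostic errors,70 so, if the cognitive-involvement rate applied to the claims population,

\[
0.286 \times 0.74 \approx 0.21,
\]

on the order of a fifth of claims-related harm would trace, at least in part, to thinking errors.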
Remaining Hurdles

In moving this issue from weary lamentation to action, it seems important to briefly address three enduring tensions that may be impeding forward progress. The first involves a tension between deductive and inductive scholarly approaches to translating insights from cognitive psychology and behavioral economics into established interventions for improving thinking in medicine. Given the aforementioned shortage of empirical studies,71 much of the relevant literature has necessarily approached the issue from a deductive standpoint, reasoning from theory about how medical decisionmaking optimally should function.72,73,74 Although such theoretical work is invaluable, a lasting impediment to action may be the
paucity of inductive approaches, efforts to understand how medical thinking operates from experience up rather than from theory down. As medical diagnostics is a distinctly inductive process, the need for empirical research is even more acute. Furthermore, without this solid grounding in real-world reasoning, the risk of narrow interpretations grows, as deductions from theory remain unopposed by lessons from practice. Meanwhile, the need for further research, toward understanding how diagnostic decisions unfold, and how to address enduring sources of error and improve critical reasoning, steadily grows. Newman-Toker and Pronovost attributed the lag in this research to the “complexity of diagnostic problems and relative infancy of methods to study misdiagnosis, combined with limited funding for research in diagnostic safety.”75 Although these lines of inquiry are fraught with challenges, including the inherent complexities of assessing thinking, lasting change may require an elusive meet-in-the-middle approach, akin to Rawls’s reflective equilibrium,76 a bridging of inductive and deductive reasoning, scientific theory and human art.

A second tension suffusing this work involves concerns of process and outcome in conceptualization, measurement, and evaluation. Critical thinking, active listening, diagnostic reasoning, and medical assessment are complex processes. Yet we most often draw conclusions about the quality of these dynamic processes by the static accuracy of the answers. Reminiscent of the childhood mathematics-class mandate to “show your work,” sound processes may, at times, lead to incorrect answers, and correct answers may betray faulty reasoning. In examining this endemic privileging of outcomes over process, short answer over story, in a discussion of certifying competencies in medicine, Fins and colleagues warn of the “parcelation of history from histology and narrative from nosology” and express concern for the consequent impact of this reductionist approach on professionalism and the practice of medicine.77 Although this argument is more complex than can be conveyed in this space, the disconnect of understanding process solely by way of outcome, particularly when addressing phenomena as qualitative as thinking and reasoning, presents an enduring challenge to this work, one in need of ongoing consideration.

Finally, a lasting and complex tension in this scholarship involves consideration of the locus of change as residing with the individual as opposed to the larger system. Dan Ariely makes the point, in his bestseller, that we are not just irrational but “predictably irrational”—we misjudge, we employ biases and heuristics, we fill in patterns, and we are tripped up by cognitive illusions, not just sporadically but in similar and highly predictable ways.78 Thus, to treat the locus of this issue as if it is one person’s failings largely misses the point and, more so, the opportunity to understand how thinking errors are predisposed at the systems level. In discussing the correlates of good intuitive judgments, Kahneman advised, “Don’t look inside, don’t look at the person, look at the environment.”79 In contemplating the environment of diagnostic decisionmaking, we might consider, for example, opportunities for calibration.
A central metacognitive construct, calibration involves assessing the degree to which confidence judgments of perceived accuracy correlate with objective information on accuracy;80 this operation requires, as one might imagine, actual information on accuracy. Without opportunities for feedback, confidence remains uncalibrated, and the same mistaken assessment might be made time and again, as a pattern is recognized as similar to a prior instance, without an appreciation that the prior judgment, now repeated anew, was wrong. Autopsies allow for feedback and calibration, as do any systems that enable subsequent data on proven diagnoses.81,82 In a 2009 Archives of Internal Medicine study analyzing 583 physician-reported diagnostic errors, Gordon Schiff et al. lament the decline in autopsies and the lost opportunity for feedback, noting, “we are more often literally and figuratively burying our mistakes. Highlighting diagnostic error cases can help remind leaders of health care institutions of their responsibility to foster conditions that will better address and minimize the occurrence and consequences of errors that might otherwise have remained hidden.”83
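Because calibration is a quantitative construct, it helps to see it computed. The following is a minimal sketch of our own, with invented data rather than figures from any cited study: each case pairs a stated diagnostic confidence with whether the diagnosis was eventually proven correct, and the table compares stated confidence against observed accuracy within confidence bands.

```python
# Minimal sketch of confidence calibration; all data invented for illustration.
# Each record: (physician's stated confidence in a diagnosis, whether the
# diagnosis was later proven correct, e.g., by autopsy or follow-up).
cases = [
    (0.95, True), (0.90, False), (0.92, True), (0.85, False),
    (0.60, True), (0.55, False), (0.70, True), (0.65, False),
]

def calibration_table(cases, bins=((0.5, 0.8), (0.8, 1.01))):
    """Compare average stated confidence with observed accuracy per bin.
    A well-calibrated judge shows confidence ~= accuracy in every bin;
    confidence exceeding accuracy is the signature of overconfidence."""
    rows = []
    for lo, hi in bins:
        bucket = [(c, ok) for c, ok in cases if lo <= c < hi]
        if not bucket:
            continue
        mean_conf = sum(c for c, _ in bucket) / len(bucket)
        accuracy = sum(ok for _, ok in bucket) / len(bucket)
        rows.append((lo, hi, mean_conf, accuracy, mean_conf - accuracy))
    return rows

for lo, hi, conf, acc, gap in calibration_table(cases):
    print(f"confidence {lo:.2f}-{hi:.2f}: stated {conf:.2f}, "
          f"observed {acc:.2f}, overconfidence gap {gap:+.2f}")
```

The gap column is the overconfidence signature described by Croskerry and Norman; without a source of proven outcomes, the second element of each pair is simply unavailable, and no such table can be built, which is precisely the feedback that autopsies and follow-up systems supply.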
Yet even though structural and individual components of decisionmaking may be abstracted as residing in tension, naturally they remain irrevocably intertwined. As we contemplate augmenting system-wide opportunities for feedback, we must also prioritize individual training in critical thinking, so that the available information may be used to calibrate confidence, gauge certainty and uncertainty, and assess what is known and, as importantly, what is not. As we extol the science of medicine, we must remain mindful of the art, uncertainty, and imperfection of medical thinking, and the consequent need to systematically revisit and refine our judgments. As the great American philosopher and pragmatist John Dewey might advise, we must allow ourselves to be comfortable with uncertainty, ambiguity, and the ubiquity of contingencies; we must avoid premature moral closure and, by extension, premature diagnostic closure.84,85 Thus, in the same breath that we emphasize sound structures and systems, we must be careful not to downplay the centrality of personal responsibility, and the vital importance of acknowledging and addressing the possibility of flaws in our judgments. “Whatever the system of care,” writes Fins, “the quality of its delivery hinges upon individuals, what he or she intuits and knows. . . . Each of us plays a part of a larger process, of course, but each of us needs to make our own singular contributions.”86 This inherent tension between individual and system will not be solved by a privileging of one over the other; rather, we need systemic reforms in tandem with personal responsibility.

Finally, as we look to translate personal and professional responsibility into action, we must acknowledge the extent to which clear paths to progress, for those inclined to take corrective action, remain frustratingly hampered. Although we have viable avenues that might counter these cognitive constraints on medical decisions, the continued scarcity of rigorous research elucidating how decisions are made, and evaluating which of the proposed strategies might best address persistent thinking errors, has left unacceptably vague how best to proceed.87 For individual physicians eager to take curative action, there may yet be interim options. Developing a deeper appreciation and understanding of the cognitive facets of medical judgment and decisionmaking, and exploring metacognitive approaches to improving critical thinking in medicine outlined by Croskerry88 and Groopman,89 among others, may be a reasonable place to start. With that baseline familiarity, other steps might include participation in quality-assurance programs, earning continuing medical education (CME) credits in areas related to medical error, and participating in Joint Commission (JCAHO)–mandated hospital programs related to medical errors.
Furthermore, to the extent that concerned readers may be in a position to influence the medical education curriculum, shape the research agenda, or persuade leadership of the import of this issue, these system-directed efforts may hold the key to breaking the impasse and dispelling
the inertia. Harnessing the untapped, collective power of the medical profession as a moral community, as Ed Pellegrino suggested decades ago,90 may be just the remedy for our shared ills, and just what is needed to compel change.

Concluding with Optimism (Itself a Cognitive Bias)

To conclude with a glimmer of hope, though the picture has remained fairly bleak, early indications suggest that attention to diagnostic error may be on the rise.91,92 A 2013 systematic review, published in the Annals of Internal Medicine, assembled the studies to date of patient safety initiatives targeting diagnostic error, identifying 109 studies over 46 years (1966–2012).93 Although studies of interventions aimed at reducing cognitive sources of error were included in the review, overall most of the studies identified were small, and only 14 were randomized trials, leading the authors to advocate for more rigorous research going forward.94 We echo these calls and urge attention to thinking errors in diagnostic reasoning—and throughout medical decisionmaking—predicated on critical arguments of patient safety as well as on a moral and professional rationale. The time has indeed come for the systematic incorporation of tools, constructs, and insights from the cognitive sciences, in particular metacognition, into medical decisionmaking and, more broadly, into bioethics.95

Croskerry titled one of his early articles in Academic Emergency Medicine “The Cognitive Imperative: Thinking about How We Think.”96 More than a decade later, with scant evidence of change, we believe that steadfast attention to cognitive and metacognitive solutions for thinking errors in medicine has evolved to become an ethical imperative as well. Whether such assertions are effective in spurring renewed attention and compelling overdue progress remains to be seen.

Notes

1. de la Rochefoucauld F. Maxim 89; as quoted in Croskerry P, Nimmo GR. Better clinical decision making and reducing diagnostic error. The Journal of the Royal College of Physicians of Edinburgh 2011;41(2):155–62, at 155.
2. Gallagher TH, Waterman AD, Ebers AG, Fraser VJ, Levinson W. Patients’ and physicians’ attitudes regarding the disclosure of medical errors. JAMA 2003;289(8):1001–7.
3. Bosk CL. Forgive and Remember: Managing Medical Failure. Chicago: University of Chicago Press; 1979.
4. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Archives of Internal Medicine 2005;165(13):1493–9.
5. Graber ML, Carlson B. Diagnostic error: The hidden epidemic. Physician Executive 2011;37(6):12–19.
6. ABIM Foundation, ACP-ASIM Foundation, European Federation of Internal Medicine. Medical professionalism in the new millennium: A physician charter. Annals of Internal Medicine 2002;136(3):243–6.
7. Institute of Medicine. To Err Is Human: Building a Safer Health System—Institute of Medicine; available at http://www.iom.edu/Reports/1999/to-err-is-human-building-a-safer-health-system.aspx (last accessed 16 July 2013).
8. Wachter RM. Why diagnostic errors don’t get any respect—and what can be done about them. Health Affairs 2010;29(9):1605–10.
9. Groopman J. How Doctors Think. New York: Houghton Mifflin; 2007.
10. Croskerry P. A universal model of diagnostic reasoning. Academic Medicine: Journal of the Association of American Medical Colleges 2009;84(8):1022–8.
11. Croskerry P. From mindless to mindful practice—Cognitive bias and clinical decision making. New England Journal of Medicine 2013;368(26):2445–8.
12. Pelaccia T, Tardif J, Triby E, Charlin B. An analysis of clinical reasoning through a recent and comprehensive approach: The dual-process theory. Medical Education Online 2011;16:5890.
13. Berner ES, Graber ML. Overconfidence as a cause of diagnostic error in medicine. The American Journal of Medicine 2008;121(5 Suppl):S2–S23.
14. See note 7, Institute of Medicine.
15. See note 8, Wachter 2010.
16. See note 8, Wachter 2010, at 1605.
17. Newman-Toker DE, Pronovost PJ. Diagnostic errors—The next frontier for patient safety. JAMA 2009;301(10):1060–2, at 1060.
18. See note 4, Graber et al. 2005, at 1493.
19. Saber Tehrani AS, Lee HW, Mathews SC, Shore A, Makary MA, Pronovost PJ, et al. 25-year summary of US malpractice claims for diagnostic errors 1986–2010: An analysis from the National Practitioner Data Bank. BMJ Quality & Safety 2013; available at http://qualitysafety.bmj.com/content/early/2013/03/27/bmjqs-2012-001550 (last accessed 21 July 2013).
20. See note 19, Saber Tehrani et al. 2013, at 1.
21. See note 5, Graber, Carlson 2011.
22. Kahneman D. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux; 2011.
23. Tversky A, Kahneman D. Judgment under uncertainty: Heuristics and biases. Science 1974;185:1124–31.
24. See note 22, Kahneman 2011.
25. Stanovich KE, West RF. Individual differences in reasoning: Implications for the rationality debate? Behavioral and Brain Sciences 2000;23(5):645–65.
26. See note 22, Kahneman 2011.
27. See note 10, Croskerry 2009.
28. See note 12, Pelaccia et al. 2011.
29. See note 13, Berner, Graber 2008.
30. Croskerry P. The cognitive imperative: Thinking about how we think. Academic Emergency Medicine 2000;7(11):1223–31.
31. Croskerry P, Norman G. Overconfidence in clinical decision making. The American Journal of Medicine 2008;121(5 Suppl):S24–S29.
32. See note 31, Croskerry, Norman 2008, at S28.
33. See note 10, Croskerry 2009.
34. See note 10, Croskerry 2009, at 1026.
35. Djulbegovic B, Hozo I, Beckstead J, Tsalatsanis A, Pauker SG. Dual processing model of medical decision-making. BMC Medical Informatics and Decision Making 2012;12(1):94.
36. Marcum JA. An integrated model of clinical reasoning: Dual-process theory of cognition and metacognition. Journal of Evaluation in Clinical Practice 2012;18(5):954–61.
37. See note 11, Croskerry 2013.
38. Dunlosky J, Metcalfe J. Metacognition. Beverly Hills, CA: SAGE Publications; 2009.
39. See note 11, Croskerry 2013, at 2447–8.
40. See note 31, Croskerry, Norman 2008.
41. See note 1, Croskerry, Nimmo 2011.
42. See note 10, Croskerry 2009.
43. See note 31, Croskerry, Norman 2008.
44. Norman G. Dual processing and diagnostic errors. Advances in Health Sciences Education: Theory and Practice 2009;14(1 Suppl):37–49.
45. Mamede S, van Gog T, van den Berge K, Rikers RMJP, van Saase JLCM, van Guldener C, Schmidt HG. Effect of availability bias and reflective reasoning on diagnostic accuracy among internal medicine residents. JAMA 2010;304(11):1198–203.
46. Bing-You RG, Trowbridge RL. Why medical educators may be failing at feedback. JAMA 2009;302(12):1330–1.
47. See note 12, Pelaccia et al. 2011.
48. See note 4, Graber et al. 2005.
49. Rudolph JW, Morrison JB. Sidestepping superstitious learning, ambiguity, and other roadblocks: A feedback model of diagnostic problem solving. The American Journal of Medicine 2008;121(5 Suppl):S34–S37.
50. See note 46, Bing-You, Trowbridge 2009.
51. Smith KA. To keep an incessant watch. Academic Emergency Medicine 2011;18(5):545–8.
52. See note 8, Wachter 2010.
53. See note 17, Newman-Toker, Pronovost 2009.
54. See note 5, Graber, Carlson 2011.
55. Schiff GD, Hasan O, Kim S, Abrams R, Cosby K, Lambert BL, et al. Diagnostic error in medicine: Analysis of 583 physician-reported errors. Archives of Internal Medicine 2009;169(20):1881–7.
56. See note 8, Wachter 2010, at 1607.
57. Groopman J. What’s missing in medical thinking. Bulletin of the American Academy 2010;53–8, at 55.
58. Beauchamp TL, Childress JF. Principles of Biomedical Ethics. New York: Oxford University Press; 2001.
59. Tyson P. The Hippocratic Oath today. pbs.org NOVA; available at http://www.pbs.org/wgbh/nova/body/hippocratic-oath-today.html (last accessed 26 July 2013).
60. Pellegrino ED. Toward an expanded medical ethics: The Hippocratic ethic revisited. In: Veatch RM, ed. Cross Cultural Perspectives in Medical Ethics. Boston: Jones and Bartlett; 2000:41–53, at 48.
61. Pellegrino ED. The medical profession as a moral community. Bulletin of the New York Academy of Medicine 1990;66(3):221–32, at 221.
62. See note 61, Pellegrino 1990, at 230.
63. See note 11, Croskerry 2013.
64. See note 9, Groopman 2007.
65. See note 30, Croskerry 2000.
66. Jonsen A. The Birth of Bioethics. New York: Oxford University Press; 1998, at 7.
67. Kaplan B. Evaluating informatics applications—Clinical Decision Support Systems literature review. International Journal of Medical Informatics 2001;64(1):15–37.
68. See note 19, Saber Tehrani et al. 2013.
69. See note 19, Saber Tehrani et al. 2013.
70. See note 4, Graber et al. 2005.
71. See note 8, Wachter 2010.
72. See note 35, Djulbegovic et al. 2012.
73. See note 36, Marcum 2012.
74. See note 10, Croskerry 2009.
75. See note 17, Newman-Toker, Pronovost 2009, at 1061.
76. Rawls J. A Theory of Justice. Cambridge, MA: Belknap; 1971.
77. Fins JJ, Pohl B, Doukas DJ. In praise of the humanities in academic medicine: Values, metrics, and ethics in uncertain times. Cambridge Quarterly of Healthcare Ethics 2013;22(4):355–64, at 360.
78. Ariely D. Predictably Irrational. New York: Harper; 2009.
79. Kahneman D. The marvels and flaws of intuitive thinking: Edge master class 2011; available at http://edge.org/conversation/the-marvels-and-flaws-of-intuitive-thinking (last accessed 18 July 2013).
80. See note 38, Dunlosky, Metcalfe 2009.
81. See note 46, Bing-You, Trowbridge 2009.
82. See note 51, Smith 2011.
83. See note 55, Schiff et al. 2009, at 1886.
84. Fins JJ. From desk to bedside: Profiles in bioethics. American Society for Bioethics and Humanities; available at http://www.asbh.org/uploads/files/ASBHPresidentialTalk2011final.pdf (last accessed 14 Aug 2013).
85. Dewey J. How We Think. Boston: DC Heath; 1910 (original publication). Dover edition, 1997; available at http://books.google.com/books/about/How_We_Think.html?id=zcvgXWIpaiMC (last accessed 22 Aug 2013).
86. Fins JJ. A Palliative Ethic of Care: Clinical Wisdom at Life’s End. Sudbury, MA: Jones and Bartlett; 2006, at xxi.
87. See note 8, Wachter 2010.
88. See note 11, Croskerry 2013.
89. See note 9, Groopman 2007.
90. See note 61, Pellegrino 1990.
91. See note 5, Graber, Carlson 2011.
92. See note 17, Newman-Toker, Pronovost 2009.
93. McDonald KM, Matesic B, Contopoulos-Ioannidis DG, Lonhart J, Schmidt E, Pineda N, Ioannidis JP. Patient safety strategies targeted at diagnostic errors: A systematic review. Annals of Internal Medicine 2013;158(5 Part 2):381–9.
94. See note 93, McDonald et al. 2013.
95. Stark M. Reconciling bioethics with health care strategies born of behavioral economics and psychology. The American Journal of Bioethics 2012;12(2):28–30.
96. See note 30, Croskerry 2000.
