Letters to the Editor

A Closer Look at Attrition in MD–PhD Programs

To the Editor: Jeffe and colleagues’1 article on MD–PhD program outcomes made excellent points about institutional investment in physician–scientist training and factors that may affect students’ program completion. However, we believe that Jeffe and colleagues have greatly overestimated the MD–PhD program attrition rate because of known coding errors in the Association of American Medical Colleges (AAMC) Student Record System (SRS) prior to 2005.

We became aware of this coding problem when the Association of MD–PhD Programs joined the AAMC in 2005. Prior to 2005, the Association of MD–PhD Programs annually collected from each program a list of matriculants. Each year this number was significantly greater than the number from SRS due to inaccurate reporting by medical school registrars.

Faced with this discrepancy, the Association of MD–PhD Programs’ Data Committee worked for several years with AAMC staff and medical school registrars to improve status coding until the numbers agreed. This also improved identification of MD–PhD graduates in SRS after 2005. The data in Table 1 of Jeffe and colleagues’ article reflect these efforts. They show that the annual attrition rate fell from 20.5% among 1995 matriculants, who graduated before 2005, to 13.1% among 2000 matriculants, who graduated after 2005. We believe that this decline reflects improved reporting of graduating MD–PhD students in SRS, not an actual change in attrition rate. The 2000 matriculant attrition rate of 13.1% is similar to the rate found in a study of 24 programs in which we participated.2

Additional evidence for the coding problem’s impact is suggested by the numbers Jeffe and colleagues obtained from the SRS and matriculation questionnaire. They report that, of 2,582 matriculants from 1995 to 2000, only 1,885 had graduated with MD–PhD degrees by 2011. The average time to both degrees is eight years,1,2 but AAMC data show that 2,746 students graduated with MD–PhD degrees in the eight years after the study period (2003–2008).3 Thus, AAMC data show even more MD–PhD graduates than Jeffe and colleagues had identified as matriculants. Therefore, we infer that pre-2005 SRS database inaccuracies due to incorrect degree program status coding prevented the authors from identifying all MD–PhD matriculants and graduates for the study period. This introduced significant problems in their attrition rate calculation.

Finally, attrition from MD–PhD programs is not the biggest factor limiting the size of the physician–scientist workforce. The absolute number of MD–PhD graduates remains small, and the percentage of physicians committed to research is shrinking.4 A much bigger factor is attrition from the pipeline at later stages of training and career.

Disclosures: None reported.


Myles H. Akabas, MD, PhD Professor of physiology and biophysics and director, Medical Scientist Training Program, Albert Einstein College of Medicine, Bronx, New York; myles. [email protected].

Lawrence F. Brass, MD, PhD Professor of medicine and director, Medical Scientist Training Program, University of Pennsylvania School of Medicine, Philadelphia, Pennsylvania; brass@mail.med.upenn.edu.

References
1 Jeffe DB, Andriole DA, Wathington HD, Tai RH. Educational outcomes for students enrolled in MD–PhD programs at medical school matriculation, 1995–2000: A national cohort study. Acad Med. 2014;89:84–93.
2 Brass LF, Akabas MH, Burnley LD, Engman DM, Wiley CA, Andersen OS. Are MD–PhD programs meeting their goals? An analysis of career choices made by graduates of 24 MD–PhD programs. Acad Med. 2010;85:692–701.
3 Association of American Medical Colleges. Table 32: MD–PhD applicants, acceptees, matriculants, and graduates of U.S. medical schools by sex, 2001–2012. https://www.aamc.org/download/321542/data/2012factstable32.pdf. Accessed March 13, 2014.
4 Garrison HH, Deschamps AM. NIH research funding and early career physician scientists: Continuing challenges in the 21st century. FASEB J. 2014;28:1049–1058.

To the Editor: Although Jeffe and colleagues1 were optimistic about the percentage of MD–PhD matriculants in their study who graduated with both degrees, we found their results (a 27% attrition rate) disheartening. At programs without Medical Scientist Training Program (MSTP) funding, such attrition strains institutional budgets; at their MSTP-funded counterparts, it wastes precious National Institutes of Health (NIH) resources. Additionally, students who fail to graduate with both degrees have essentially taken positions away from other deserving individuals who may well have completed the training. We acknowledge the challenges of selecting and nurturing successful MD–PhD candidates, but we offer some suggestions for supporting physician–scientist trainees along the way.

MD–PhD training is not an easy path. A multitude of factors make it difficult to be a dual-degree trainee, including but not limited to the dearth of current and future funding opportunities, the length of time required to complete the degrees, the difficult transitions between medical and graduate school, and the pressure to excel in multiple fields. Despite these challenges, a 27% attrition rate is unacceptably high. Interestingly, the study found that higher Medical College Admission Test (MCAT) scores, attendance at MSTP-funded schools, and greater planned career involvement in research were associated with decreased attrition. What can be done from an institutional standpoint at non-MSTP-funded programs to reduce attrition rates? Are the environments at these schools less supportive of MD–PhD candidates? What role do lower MCAT scores play? Should faculty and administrators at these programs raise their admissions requirements and/or performance expectations?

In addition to the central role that institutions must play in curbing attrition rates, matriculated trainees have a responsibility to support fellow students. The American Physician Scientists Association (APSA)2 is a trainee-led organization, one from which we have benefited and now are privileged to help lead, dedicated to serving the needs of future physician–scientists, from the undergraduate through the resident level. Through its regional and annual meetings, APSA provides peer support, networking, and career development opportunities, as well as mentorship by experienced physician–scientists. APSA also provides advocacy for issues concerning physician–scientist trainees, for example, by pushing for increased numbers of F30 predoctoral fellowships offered by the NIH. The findings of the study by Jeffe and colleagues also suggest that we pay particular attention to students at non-MSTP-funded institutions to reduce attrition rates in the future.

The responsibility to ensure the success of physician–scientists in training falls not only to institutions but also to trainees and their colleagues. It is this support network that perhaps is the most critical ingredient to a blossoming career as a clinician–investigator.

Disclosures: None reported.

Peter N. Mittwede Fifth-year MD–PhD student, University of Mississippi Medical Center, Jackson, Mississippi; pmittwede@umc.edu.

Evan K. Noch, MD, PhD First-year resident, Department of Neurology and Neuroscience, Weill Cornell Medical College–New York Presbyterian Hospital, New York, New York.

Michael H. Guo Fourth-year MD–PhD student, University of Florida College of Medicine, Gainesville, Florida.

References
1 Jeffe DB, Andriole DA, Wathington HD, Tai RH. Educational outcomes for students enrolled in MD–PhD programs at medical school matriculation, 1995–2000: A national cohort study. Acad Med. 2014;89:84–93.
2 American Physician Scientists Association. http://www.physicianscientists.org/default.asp?. Accessed January 25, 2013.

In Reply to Akabas and Brass and to Mittwede et al: We thank Drs. Akabas and Brass for their interest in our work. We are aware of the issue they raised about coding errors in the AAMC SRS,1 and we took this issue into account in our research. We had first included all eligible matriculants from our previously described 1993–2000 cohort in the regression models.2 To examine the reliability of observations made using the SRS indicator for degree program at graduation in identifying independent predictors of MD-only graduation (versus MD–PhD graduation), we compared results from two regression models, one using the SRS indicator and the other using the student-reported degree program at graduation on the AAMC Graduation Questionnaire (GQ). These two models yielded essentially the same results regarding the direction, overall magnitude, and statistical significance of the associations of interest. Then, to measure concordance between GQ and SRS data for degree program at graduation (MD-only versus MD–PhD) within each matriculation year, we used the McNemar test. The McNemar test indicated significant differences in concordance between the GQ and SRS degree program at graduation for matriculants in 1993 (P = .006) and 1994 (P = .031) but not for matriculants in any matriculation year from 1995 to 2000 (each P ≥ .289). Thus, we excluded all 1993 and 1994 matriculants from our study and published findings only for the 1995–2000 matriculants.

Differences in attrition rates between our study and that of Brass and colleagues3 would be fully expected because of substantial differences in study design and sample selection, as we described in our publication.1 That AAMC data4 would show even more MD–PhD graduates than we identified as matriculants also would be fully expected. AAMC data included all MD–PhD program graduates who either enrolled in MD–PhD programs at matriculation or enrolled in MD–PhD programs at some point after matriculation. In our study sample, we included only those MD–PhD program graduates who reported on the Matriculating Student Questionnaire (MSQ) that they were enrolled in MD–PhD programs and who had complete data for analysis; we did not include in our study sample any MD–PhD program graduates who enrolled in MD–PhD programs at any point after matriculation, chose not to respond to the MSQ, or were missing other data for analysis.

We also thank Mittwede and colleagues for their informative discussion of the potential role that the trainee-led American Physician Scientists Association may play in fostering the success of physician–scientists.
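
For readers curious about the mechanics of this concordance check, the following is a minimal sketch of a McNemar test on paired SRS and GQ degree-program classifications. The counts, the 2×2 layout, and the use of Python's statsmodels package are assumptions made for illustration only; they are not the study's data or analysis code.

```python
# Illustrative sketch only -- hypothetical counts, not the study's data or code.
# McNemar test of agreement between two paired classifications of the same
# students (SRS-coded vs. GQ self-reported degree program at graduation)
# for a single matriculation-year cohort.
from statsmodels.stats.contingency_tables import mcnemar

# 2x2 paired table:
#   rows    = SRS degree program at graduation (MD-PhD, MD-only)
#   columns = GQ degree program at graduation  (MD-PhD, MD-only)
# Only the off-diagonal (discordant) cells drive the test.
table = [
    [310, 4],   # SRS MD-PhD : GQ MD-PhD, GQ MD-only
    [15, 60],   # SRS MD-only: GQ MD-PhD, GQ MD-only
]

result = mcnemar(table, exact=True)  # exact binomial test on the discordant pairs
print(f"McNemar P = {result.pvalue:.3f}")
# A small P value would signal significant SRS-GQ disagreement for that cohort,
# as the authors report for the 1993 and 1994 matriculants.
```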

Disclosures: Funded by 2R01 GM085350 and R01 GM094535.

Donna B. Jeffe, PhD Professor of medicine, Washington University School of Medicine, St. Louis, Missouri; [email protected].

Dorothy A. Andriole, MD Assistant dean for medical education and associate professor of surgery, Washington University School of Medicine, St. Louis, Missouri.

References
1 Jeffe DB, Andriole DA, Wathington HD, Tai RH. Educational outcomes for students enrolled in MD–PhD programs at medical school matriculation, 1995–2000: A national cohort study. Acad Med. 2014;89:84–93.
2 Jeffe DB, Andriole DA. A national cohort study of MD–PhD graduates of medical schools with and without funding from the National Institute of General Medical Sciences’ Medical Scientist Training Program. Acad Med. 2011;86:953–961.
3 Brass LF, Akabas MH, Burnley LD, Engman DM, Wiley CA, Andersen OS. Are MD–PhD programs meeting their goals? An analysis of career choices made by graduates of 24 MD–PhD programs. Acad Med. 2010;85:692–701.
4 Association of American Medical Colleges. Table 32: MD–PhD applicants, acceptees, matriculants, and graduates of U.S. medical schools by sex, 2001–2012. https://www.aamc.org/download/321542/data/2012factstable32.pdf. Accessed March 16, 2014.

Reflection in Diagnostic Reasoning: What Really Matters?

To the Editor: Ilgen and colleagues1 recently compared the effects on diagnostic accuracy of automatic and analytic reasoning, the latter induced by instructions described as similar to those that we have employed in our research. In contrast with our studies (e.g., Mamede et al, 2010)2 and their own previous findings,1 analytic reasoning did not improve performance. We believe the instructions laid out in our previous work2 and Ilgen and colleagues’ application of them in their study substantially differ.

Reflective reasoning involves, in our view, critically scrutinizing the initial diagnostic impression of a problem. This usually requires physicians to escape from the point of view created by their first hypothesis and look at the case from a different perspective. Our reflection instructions therefore request physicians to first formulate one diagnostic hypothesis and subsequently identify case findings that speak in favor of

