AAIM is the largest academically focused specialty organization representing departments of internal medicine at medical schools and teaching hospitals in the United States and Canada. As a consortium of five organizations, AAIM represents department chairs and chiefs; clerkship, residency, and fellowship program directors; division chiefs; and academic and business administrators as well as other faculty and staff in departments of internal medicine and their divisions.

Usefulness of the ACGME Resident Survey: A View from Internal Medicine Program Directors

Michael Adams, MD,a Lisa L. Willett, MD,b Sandhya Wahi-Gururaj, MD, MPH,c Andrew J. Halvorsen, MS,d Steven V. Angus, MDe

aDepartment of Medicine, Medstar Georgetown University Hospital, Washington, DC; bUniversity of Alabama at Birmingham School of Medicine; cUniversity of Nevada School of Medicine, Las Vegas; dOffice of Educational Innovations, Internal Medicine Residency, Mayo Clinic, Rochester, Minn; eUniversity of Connecticut School of Medicine, Farmington, Conn.

On July 1, 2013, the Accreditation Council for Graduate Medical Education (ACGME) began implementation of the Next Accreditation System (NAS).1 Annual data reporting for residency programs in the NAS includes 9 reporting components. In addition to 3 new components (Milestones reporting, the Clinical Learning Environment Review, and the annual faculty survey), the ACGME recently made substantial changes to the annual resident survey.2 A program's residents will no longer all receive the same survey; rather, surveys will vary depending upon residents' previous survey responses and level of education. Program directors will no longer be able to preview the survey to help residents understand the questions asked, nor will the aggregate reports program directors receive include the specific questions residents were asked. Program directors have expressed concerns about these changes to the survey instrument and about the perceived weight placed on the survey in accreditation decisions.

Funding: None.

Conflict of Interest: None.

Authorship: The manuscript represents original work, and all authors meet criteria for authorship, including having access to the data, accepting responsibility for the scientific content of the manuscript, and having a significant role in writing the manuscript.

Requests for reprints should be addressed to Michael Adams, MD, Department of Medicine, Georgetown University Hospital, Washington, DC 20007. E-mail address: [email protected]

Before the NAS, program directors in multiple specialties (General Surgery, OB/GYN, Psychiatry) questioned the validity and accuracy of the ACGME resident survey.3-6 Specific areas of perceived misinterpretation on the survey include the definitions of "service" and "education,"3,7-9 "sufficient supervision" by attendings,3 perceived compliance with research requirements,10 and the accuracy of duty-hours reporting.5,11 Program directors and specialty organizations encouraged the ACGME to scientifically evaluate the resident survey and to make changes to improve the validity of the instrument. In response to these concerns, the ACGME and others analyzed the ACGME survey (before the 2013 changes) and found it to have good overall internal consistency12 and to be a reliable and valid tool for evaluating duty hours.13 This study set out to evaluate internal medicine program directors' perceptions of the new ACGME survey and to identify specific areas of real or perceived inconsistency or potential for misinterpretation.

0002-9343/$ -see front matter © 2014 Alliance for Academic Internal Medicine. All rights reserved. http://dx.doi.org/10.1016/j.amjmed.2013.12.010
AAIM Perspectives. The American Journal of Medicine, Vol 127, No 4, April 2014.

PERSPECTIVES VIEWPOINTS

- Recent changes in the Accreditation Council for Graduate Medical Education (ACGME) resident survey generated significant concerns from internal medicine program directors about the validity of the survey for accreditation purposes.
- Clarification of potentially ambiguous survey terms may reduce misinterpretation by residents and reassure program directors that the survey accurately represents their programs.
- Program directors would welcome investigation of the survey for validity and consistency, as the ACGME has carefully done for previous survey versions.

METHODS

The Association of Program Directors in Internal Medicine (APDIM) Survey Committee conducts an annual survey of APDIM member programs to explore the characteristics of programs and the opinions of program directors on issues important to internal medicine and residency training. Three hundred seventy programs/program directors were sent a link to the survey in August 2012, representing 95.6% of all 387 accredited US training programs. The 2012 survey included 13 questions asking program directors' views of the change in the ACGME resident survey. Program directors were asked their opinions of the accuracy of the new survey, their level of agreement with the changes in the survey, and how they helped prepare residents to complete the survey, along with open-ended questions assessing their use of the survey to leverage resources and their general assessment of the validity and use of the survey. The APDIM survey instruments and summary files are available on the APDIM Web site at http://www.im.org/toolbox/surveys/APDIMSurveyData.

RESULTS

Two hundred seventy-two of 370 (73.5%) program directors responded to the survey. The characteristics of responding programs compared with nonresponders are outlined in Table 1. University-based, larger programs were more likely to respond. There were no regional differences in response rates, nor did American Board of Internal Medicine pass rates or program director tenure differ between responder and nonresponder programs.

Table 1 Program Characteristics

| Characteristic | Responders (n = 272) | Nonresponders (n = 115) | P-Value |
| --- | --- | --- | --- |
| Program description, n (%) | | | .02* |
| Community-based, university-affiliated hospital | 144 (52.9) | 57 (49.6) | |
| University-based | 100 (36.8) | 33 (28.7) | |
| Community-based | 23 (8.5) | 21 (18.3) | |
| Military | 5 (1.8) | 4 (3.5) | |
| Region, n (%) | | | .20* |
| Northeast | 89 (32.7) | 45 (39.1) | |
| South | 74 (27.2) | 28 (24.4) | |
| Midwest | 65 (23.9) | 23 (20.0) | |
| West | 41 (15.1) | 14 (12.2) | |
| Unincorporated territory | 3 (1.1) | 5 (4.4) | |
| ABIM pass rate, mean (SD) percent | 86.2 (9.5) | 86.8 (9.7) | .62† |
| Program director tenure, mean (SD) years | 6.6 (6.5) | 6.2 (6.9) | .59† |
| Program size, mean (SD) approved positions | 68.2 (39.2) | 56.6 (35.8) | .005† |

ABIM = American Board of Internal Medicine. *Fisher's Exact Test. †Welch's t test.

One hundred eighty-one program directors (67%) did not agree with the new practice of program-specific surveys, and 202 (74%) were concerned about the transparency of how the survey will be used to accredit programs. Two hundred fifty-nine (95%) agreed that they should receive aggregate responses to the specific questions asked of their residents. Only a minority of program directors (111; 41%) felt that the survey somewhat or very accurately represented the status of their program. In terms of ease of ensuring that residents complete the survey, 127 (47%) found it "difficult" to ensure completion, while 112 (41%) found it "easy."

We asked program directors for open-ended responses about the potential positive and negative consequences of the survey change. The responses are collated and detailed in Tables 2 and 3. In general, program directors perceived the change as negative, with the most common concern being misinterpretation of questions by residents. The questions program directors believe residents most commonly misunderstand are detailed in Table 4, with the definition of "service versus education" and duty hours leading the list.

Table 2 Program Directors' Negative Comments About the Change in the ACGME Survey

| Comment | Number of Times Cited |
| --- | --- |
| Survey is vague/ambiguous/misinterpreted by residents | 63 |
| PDs can't improve program or clarify questions without knowing questions in advance | 33 |
| Survey questions are misleading/misrepresent the program | 13 |
| Survey carries too much weight/high stakes | 8 |
| Survey questions not well written | 6 |
| ACGME does not trust PDs/creates an adversarial environment | 6 |
| The new survey implementation lacks transparency/is inconsistent with professional values | 5 |
| Misinterpretation based upon ESL or cultural differences with IMGs | 4 |
| Unintended consequences of ACGME survey | 4 |
| Survey tool is not meaningful/accurate | 3 |
| Responses are resident perceptions, not reality | 1 |
| Too long | 1 |

Abbreviations: ACGME = Accreditation Council on Graduate Medical Education; ESL = English as a second language; IMG = international medical graduate; PD = program director.

Table 3 Program Directors' Positive Comments About the Change in the ACGME Survey

| Comment | Number of Times Cited |
| --- | --- |
| Prevents "coaching" by PDs | 7 |
| Survey clarity better | 2 |
| Can lead to beneficial changes to a program | 1 |

Abbreviations: ACGME = Accreditation Council on Graduate Medical Education; PD = program director.

Table 4 Items on the Survey Most Commonly Misunderstood by Residents, According to Program Directors

| Item | Number of Times Cited |
| --- | --- |
| Service vs education | 4 |
| Duty hours | 3 |
| Definition of "sometimes" in survey questions | 3 |
| Too many learners | 2 |
| "Intimidation" misinterpreted | 1 |
| Number of attendings of record | 1 |
| Nonmedicine patients on service | 1 |
| "Scholarly activity" | 1 |

One hundred sixty-three program directors (60%) agreed that the survey was useful in helping them leverage their institution for more resources (Table 5). The most common resources program directors leveraged were hiring hospitalists, physician extenders, and administrative support. Program directors were neutral about the effect of the resident survey on their most recent site visit, with 70 (26%) stating there was no effect, 54 (20%) stating the survey affected the site visit negatively, and 69 (25%) stating it affected the site visit positively.

Table 5 Institutional Resources Requested Based on Survey Responses*

| Item | Frequency Count | Percent of Total Frequency |
| --- | --- | --- |
| Hiring more hospitalists | 63 | 38.7 |
| Hiring physician extenders (NPs, PAs) | 55 | 33.7 |
| Increased administrative staff/support | 44 | 27.0 |
| Facility improvements (call rooms, lounges, etc.) | 33 | 20.2 |
| Increased protected time for PD/APDs | 30 | 18.4 |
| Increasing residency positions | 26 | 16.0 |
| Increased salary/salary support for PD/APDs | 15 | 9.2 |
| Additional resources to decrease service demands | 9 | 5.5 |
| Increase in subspecialty support | 6 | 3.7 |
| Improved information technology infrastructure/resources | 6 | 3.7 |

Abbreviations: ACGME = Accreditation Council on Graduate Medical Education; APD = associate program director; NP = nurse practitioner; PA = physician assistant; PD = program director. *Total number who used the ACGME survey to leverage resources = 163.

DISCUSSION

Our study demonstrated a significant level of concern among program directors about the new format of the ACGME resident survey. Most program directors disagreed with the changes to the resident survey, did not believe there was sufficient transparency in its use, and believed they should receive aggregate responses from their residents for each specific question, particularly if the goal is to use survey results to stimulate ongoing program improvement. The most common negative responses from program directors were concerns about residents' misinterpretation of survey questions and program directors' inability to clarify the intent of questions and specific definitions with their residents in advance. Program directors understood the need not to coach residents into providing specific answers, but felt that many of the questions and terms used on the survey were vague and open to misinterpretation. An interesting finding was the concern that the survey had the potential to be misinterpreted by residents who speak English as a second language, suggesting the need for a careful review of the cultural interpretation of the survey questions.

The concerns program directors shared about specific survey questions focused on "service versus education," the accuracy of residents' responses to duty-hours questions, and the interpretation of how many learners constitute "too many" (Table 4). These are consistent with program directors' concerns about the previous version of the ACGME survey and suggest a potential need to investigate the new ACGME survey for validity and consistency. Suggestions program directors provided to improve the survey process included adding more detail to the ACGME's "frequently asked questions" to further explain how the ACGME derives program-specific surveys or determines a threshold of "noncompliance" on the new survey. Program directors also suggested defining terms that may be considered ambiguous, such as "sometimes" or "opportunities for scholarship."

Given that the results of the resident survey are used for accreditation, a "high stakes" decision, it is not surprising that program directors express high levels of concern about the validity of the survey. Test-based accountability, defined as the use of tests to hold individuals or institutions responsible for performance through the application of rewards or sanctions, has become a debated issue. In the past decade the US federal education system has adopted test-based accountability systems, which has generated concerns about superficial solutions and about focusing efforts at excellence on the wrong targets.14 The corollary in internal medicine may be that program directors feel added pressure when making unpopular changes within their programs, even when those changes are made for the right educational reasons. Unpopular rotations, if educationally justified, may make for better physicians in the long run, but may be misinterpreted by trainees as "service versus education."

Survey results can also have positive implications, as a majority of program directors used resident survey data to secure program resources. In addition, some program directors feel that the new format prevents coaching of residents, suggesting a belief that ACGME survey results in the new format will have improved validity. Despite their concerns about the survey, program directors do not seem to find it overly burdensome to have residents complete the survey, which has become notably shorter than previous versions.

In conclusion, program directors' initial reaction to the changes made to the ACGME resident survey is negative, with specific concerns surrounding the proper interpretation of questions, the validity of the survey instrument, and the ramifications of the results for programs. In the past, studies in surgical education have suggested that discrepancies exist between the ACGME survey and actual duty hours.4,6 Nonetheless, the ACGME has carefully analyzed previous versions of the resident survey and found internal reliability and validity, as well as good correlation between duty-hours noncompliance on the ACGME survey and other areas of program noncompliance. A careful analysis of the consistency and validity of the new ACGME survey is warranted given program directors' concerns and the perceived "high stakes" nature of how the results will be used in the NAS.

ACKNOWLEDGMENT

We are grateful for the support of the Association of Program Directors in Internal Medicine and the members of its Survey Committee, and to the residency program directors who completed this survey. This study was supported in part by the Mayo Clinic Internal Medicine Residency Office of Educational Innovations as part of the ACGME Educational Innovations Project. The Mayo Clinic Survey Research Center provided assistance with survey design and data collection.
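As a note on reproducibility, the Welch's t tests reported in Table 1 can be re-derived from the published summary statistics alone (group means, SDs, and sizes), without the raw data. The sketch below is illustrative only, not the authors' analysis code; because the table's means and SDs are rounded, the P-values it produces match the published ones only approximately (the program-size comparison lands near the reported P = .005).

```python
# Re-deriving Table 1's Welch's t tests from published summary statistics.
# Inputs are the rounded means/SDs from the table, so results are approximate.
from scipy.stats import ttest_ind_from_stats

N_RESP, N_NONRESP = 272, 115  # responders vs nonresponders (Table 1)

comparisons = {
    # name: (responder mean, responder SD, nonresponder mean, nonresponder SD)
    "ABIM pass rate, %": (86.2, 9.5, 86.8, 9.7),
    "PD tenure, years": (6.6, 6.5, 6.2, 6.9),
    "Program size, approved positions": (68.2, 39.2, 56.6, 35.8),
}

for name, (m1, s1, m2, s2) in comparisons.items():
    t, p = ttest_ind_from_stats(m1, s1, N_RESP, m2, s2, N_NONRESP,
                                equal_var=False)  # equal_var=False -> Welch
    print(f"{name}: t = {t:.2f}, P = {p:.3f}")
```

Only the program-size difference reaches significance, consistent with the table; the two nonsignificant comparisons come out close to the reported .62 and .59.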

References

1. Accreditation Council for Graduate Medical Education (ACGME). ACGME Next Accreditation System. Available at: http://www.acgme-nas.org. Accessed October 1, 2013.
2. Accreditation Council for Graduate Medical Education (ACGME). ACGME Data Collection System: resident and fellow survey. Available at: http://www.acgme.org/acgmeweb/DataCollectionSystems/ResidentFellowSurvey.aspx. Accessed October 1, 2013.
3. Balon R. The unspoken tyranny of regulatory agencies: a commentary on the ACGME Resident Survey. Acad Psychiatry. 2012;36(5):351-352.
4. Fahy BN, Todd SR, Paukert JL, Johnson ML, Bass BL. How accurate is the Accreditation Council for Graduate Medical Education (ACGME) Resident survey? Comparison between ACGME and in-house GME survey. J Surg Educ. 2010;67(6):387-392.
5. Todd SR, Fahy BN, Paukert JL, Mersinger D, Johnson ML, Bass BL. How accurate are self-reported resident duty hours? J Surg Educ. 2010;67(2):103-107.
6. Sticca RP, Macgregor JM, Szlabick RE. Is the Accreditation Council for Graduate Medical Education (ACGME) Resident/Fellow survey a valid tool to assess general surgery residency programs' compliance with work hours regulations? J Surg Educ. 2010;67(6):406-411.
7. Reines HD, Robinson L, Nitzchke S, Rizzo A. Defining service and education: the first step to developing the correct balance. Surgery. 2007;142(2):303-310.
8. Sanfey H, Cofer J, Hiatt JR, et al. Service or education: in the eye of the beholder. Arch Surg. 2011;146(12):1389-1395.
9. Smith DE, Johnson B, Jones Y. Service versus education, what are we talking about? J Surg Educ. 2012;69(3):432-440.
10. Oakley SH, Crisp CC, Estanol MV, Fellner AN, Kleeman SD, Pauls RN. Attitudes and compliance with research requirements in OB/GYN residencies: a national survey. Gynecol Obstet Invest. 2013;75(4):275-280.
11. Chadaga SR, Keniston A, Casey D, Albert RK. Correlation between self-reported resident duty hours and time-stamped parking data. J Grad Med Educ. 2012;4(2):254-256.
12. Holt KD, Miller RS. The ACGME Resident Survey Aggregate Reports: an analysis and assessment of overall program compliance. J Grad Med Educ. 2009;1(2):327-333.
13. Holt KD, Miller RS, Philibert I, Heard JK, Nasca TJ. Residents' perspectives on the learning environment: data from the Accreditation Council for Graduate Medical Education resident survey. Acad Med. 2010;85(3):512-518.
14. Supovitz J. Is high stakes testing working? Penn Graduate School of Education. Available at: http://www.gse.upenn.edu/review/feature/supovitz. Accessed October 1, 2013.
