ACGME NEWS AND VIEWS

David B. Sweet, MD; Jerry Vasilias, PhD; Lynn Clough, PhD; Felicia Davis, MHA

The Power of Collaboration: Experiences From the Educational Innovations Project and Implications for the Next Accreditation System

Furman S. McDonald, MD, MPH; Eileen E. Reynolds, MD; Cheryl W. O'Malley, MD; Kevin T. Hinchey, MD; Lynne M. Kirk, MD; Andrew S. Gersoff, MD; E. Benjamin Clyburn, MD; John G. Frohna, MD, MPH

Abstract

Background: The Internal Medicine Educational Innovations Project (EIP) is a 10-year pilot project for innovating in accreditation, which involves annual reporting of information and less-restrictive requirements for a group of high-performing programs. The EIP program directors' experiences offer insight into the benefits and challenges of innovative approaches to accreditation as the Accreditation Council for Graduate Medical Education transitions to the Next Accreditation System.

Objective: We assessed participating program directors' perceptions of the EIP at the midpoint of the project's 10-year life span.

Methods: We conducted telephone interviews with 15 of 18 current EIP programs (83% response rate) using a 19-item, open-ended, structured survey. Emerging themes were identified with content analysis.

Results: Respondents identified a number of benefits from the EIP, most prominent among them collaboration between programs (87%, 13 of 15) and culture change around quality improvement (47%, 7 of 15). The greatest benefit for residents was training in quality improvement methods (53%, 8 of 15), enhancing those residents' ability to become change agents in their future careers. Although the requirement for annual data reporting was identified by 60% (9 of 15) of program directors as the biggest challenge, respondents also considered it an important element for achieving progress on innovations. Program directors unanimously reported their ability to sustain innovation projects beyond the 10-year participation in the EIP.

Conclusions: The work of the EIP was viewed not as "more work" but as "different work," which created a new mindset of continuous quality improvement in residency training. Lessons learned offer insight into the value of collaboration and opportunities to use accreditation to foster innovation.

Editor's Note: The ACGME News and Views section of JGME includes data reports, updates, and perspectives from the ACGME and its review committees. The decision to publish the article is made by the ACGME.

David B. Sweet, MD, is Program Director, Internal Medicine Residency Program, Summa Health System/NEOMED, and Professor of Medicine, Northeast Ohio Medical University; Jerry Vasilias, PhD, is Executive Director, Accreditation Council for Graduate Medical Education (ACGME) Residency Review Committee for Internal Medicine; Lynn Clough, PhD, is Administrative Program Director, Internal Medicine Residency Program, Summa Health System/NEOMED, and Assistant Professor of Medicine, Northeast Ohio Medical University; Felicia Davis, MHA, is Executive Director, ACGME Residency Review Committees for Diagnostic Radiology, Emergency Medicine, and Nuclear Medicine; Furman S. McDonald, MD, MPH, is Professor of Medicine, Mayo Clinic, Rochester; Eileen E. Reynolds, MD, is Program Director, Internal Medicine Residency Program, Beth Israel Deaconess Medical Center, and Associate Professor of Medicine, Harvard Medical School; Cheryl W. O'Malley, MD, is Program Director, Internal Medicine Residency Program, Banner Good Samaritan Medical Center, and Associate Professor of Medicine, University of Arizona College of Medicine; Kevin T. Hinchey, MD, is Chief Academic Officer, Baystate Medical Center, and Associate Professor of Medicine, Tufts University School of Medicine; Lynne M. Kirk, MD, is Professor of Medicine, University of Texas Southwestern Medical Center; Andrew S. Gersoff, MD, is Program Director, Internal Medicine Residency Program, Santa Barbara Cottage Hospital, and Clinical Associate Professor of Medicine, Keck USC School of Medicine; E. Benjamin Clyburn, MD, is Program Director, Internal Medicine Residency Program, Medical University of South Carolina, and Associate Professor of Medicine and Vice Chair of Education, Medical University of South Carolina; and John G. Frohna, MD, MPH, is Program Director, Pediatrics Residency Program, and Professor of Pediatrics and Medicine and Vice Chair of Education, University of Wisconsin-Madison.

Although the authors report the results of a survey conducted by the Residency Review Committee for Internal Medicine (RRC-IM), they do not speak for the RRC-IM, and this report does not constitute an official policy statement of the ACGME or any other organization with which the authors may be affiliated.

Corresponding author: David B. Sweet, MD, Summa Health System, 55 Arch Street, Suite 1A, Akron, OH 44304, 330.375.3741, [email protected]

DOI: http://dx.doi.org/10.4300/JGME-D-14-00155.1

Journal of Graduate Medical Education, September 2014 597

Introduction

In the early part of the past decade, the internal medicine community was calling for a redesign of training to reflect the needs of patients and society, particularly related to evolving societal mandates for quality, safety, and accountability in health care.1–4 The Educational Innovations Project (EIP) was developed by the Residency Review Committee for Internal Medicine (RRC-IM) as a significant test of an approach to foster innovation in accreditation. The project emphasized patient care and resident education, while reducing rules and requirements for high-performing programs, especially those relating to "process."5

The EIP began officially in 2006. Internal medicine programs with strong track records in accreditation reviews were invited to apply for inclusion in a cohort of programs that, for the next 10 years, would operate under less-detailed program requirements6 (approximately 40% fewer requirements than those applying to non-EIP programs in 2006). As a guiding principle, EIP programs were directed to use the link between high-quality education and high-quality patient care to transform the learning environment. Seventy-three programs initially submitted letters of intent. Seventeen programs (23%) were selected for entry in July 2006, and another 4 (5%) entered the EIP 1 year later. The EIP cohort encompassed programs of diverse size and geographic distribution, including 11 university (52%), 9 community-based (43%), and 1 municipal (5%) programs (provided as online supplemental material). Each participating program received a 10-year accreditation cycle. Compliance with EIP requirements and progress in planned innovations were assessed through an annual, written report that each program provided to the RRC-IM.

The purpose of this article is to report the qualitative experiences of EIP program directors at the midpoint of the 10-year project's life span.

Methods

Structured interviews were completed by the RRC-IM staff to assess EIP program directors' experiences and to update the Accreditation Council for Graduate Medical Education (ACGME) Board of Directors on the project's status.7 Interviews used a structured, open-ended survey developed by members of the RRC-IM's EIP subcommittee and staff. The draft survey was distributed to EIP program directors for review and comment before administration. The final instrument contained 19 open-ended questions (provided as online supplemental material). Eighteen programs participated in the EIP at the time of the survey (3 programs had withdrawn from the project for various reasons). Program directors received e-mail invitations to participate and an electronic copy of the survey.
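For illustration only, the theme-frequency tabulation used in the content analysis, in which deidentified responses are coded into one or more themes and reported as counts and percentages of the 15 respondents, can be sketched in a few lines of Python. The coded data below are hypothetical stand-ins, arranged so the tallies reproduce two of the reported totals (collaboration, 13 of 15; culture change, 7 of 15); they are not the study's actual data or tooling.

```python
from collections import Counter

# Hypothetical stand-in data: one set of coded themes per deidentified
# interview (a respondent may mention more than one theme). The data are
# arranged so the tallies match two reported totals: collaboration 13/15
# and culture change 7/15.
interviews = (
    [{"collaboration", "culture change"}] * 5  # both themes mentioned
    + [{"collaboration"}] * 8                  # collaboration only
    + [{"culture change"}] * 2                 # culture change only
)

N = len(interviews)  # 15 respondents
theme_counts = Counter(theme for codes in interviews for theme in codes)

# Report each theme in the article's "No. (%)" convention.
summary = {
    theme: f"{count} ({round(100 * count / N)}%)"
    for theme, count in theme_counts.most_common()
}
print(summary)
```

Running this prints counts matching the article's reported frequencies, 13 (87%) and 7 (47%).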

Individual in-depth, structured telephone interviews, lasting approximately 45 minutes, were conducted by RRC-IM staff between November 2011 and January 2012. Interviews were completed with 15 of 18 EIP program directors (83% response rate). The RRC-IM staff completed the preliminary analysis. Responses were deidentified, and content analysis was performed to identify emerging themes, with summary statistics generated to report the frequency of responses by common themes. We distributed the preliminary results to all EIP program directors and used member checking to allow the entire cohort to reach consensus on overarching themes and validate the findings.8

Results

Common themes are summarized in TABLE 1 and are described below with results and direct quotes from program directors to provide additional insight into the experiences reported.

Biggest Benefit to Participating in the EIP

Two common themes emerged around the benefits of EIP participation: (1) the power of collaboration, and (2) a culture change within the programs' sponsoring institutions.

Power of Collaboration

Most (87%, 13 of 15) of the program directors reported that collaboration with a core group of other programs was the biggest benefit of being in the EIP. The EIP conditions of participation required program leaders to meet annually in conjunction with the spring Association of Program Directors in Internal Medicine conference and to provide the internal medicine community with an update on their educational innovations and outcomes. Although not required, most programs also sent representatives to meet at the fall Association of Program Directors in Internal Medicine conference. Collaboration was enhanced in 2008 when programs established a leadership council to facilitate communication with the RRC-IM and provide structure for the group. The perceived benefits of collaboration included "building relationships"; "[providing] opportunities to learn from the other programs . . . what worked and what did not, and why"; "promot[ing] quality improvement . . . by permitting the sharing of ideas"; "[providing] constant motivation for improvement"; "increas[ing] [the] rate of change and innovations"; and "gain[ing] insight in watching programs grow."

Culture Change

Nearly half (47%, 7 of 15) of the participants reported that EIP requirements were important tools for promoting change. Comments to that effect included that "the EIP was an important tool in changing the culture of the institution," and that it provided "a license to innovate," along with "freedom from the rules" that

TABLE 1
Program Directors' Key Perceptions of the EIP

| Experience | Question | Most Frequent Responses | Response, No. (%), N = 15 |
|---|---|---|---|
| Benefits | What was the biggest benefit to being in EIP? | Power of collaboration; Culture change around innovations, QI, and outcomes | 13 (87); 7 (47) |
| | How has being in EIP benefited residents in your program? | Curriculum and teaching in QI; Scholarly activity | 8 (53); 5 (33) |
| | How has being in EIP benefited the faculty? | Improved teaching; Faculty development; Increased scholarly activity | 10 (67); 7 (47); 4 (27) |
| | How has being in EIP benefited your institution? | Recognition/reputation; Improved patient care | 8 (53); 4 (27) |
| Challenges | What was the biggest challenge to being in EIP? | Annual reporting of tracked outcomes | 9 (60) |
| Costs | What were the costs associated with being in EIP? | Additional time from staff; Hired new staff; Travel; Information technology support and computer software and equipment | 13 (87); 9 (60); 7 (47); 5 (33) |
| Resources and institutional oversight | Has being in EIP changed the resources you receive from your institution? | Yes | 14 (93) |
| | Has being in EIP led to your program having more, less, or same oversight from your institution? | More | 11 (73) |
| Dissemination and spread of innovations | Have you incorporated innovations from other EIP programs into your program? | Yes | 15 (100) |
| | Have other EIP programs incorporated innovations from your program? | Yes | 13 (87) |
| | Have other residency programs within your institution adopted innovations that you implemented into your program? | Yes | 15 (100) |
| Sustainability of innovations | Do you think you will be able to continue your innovative projects after the 10-y project is complete? | Yes | 15 (100) |
| Evaluate success of EIP | Do you think EIP has been a success? | Yes | 15 (100) |
| | If you had the option of doing it all over again, would you apply to be part of EIP? | Yes | 15 (100) |

Abbreviations: EIP, Educational Innovations Project; QI, quality improvement.

allowed for transformational changes leading to a new focus on "resident participation in quality improvement." One program director stated: "[W]e are now a residency that asks the outcome question in a real way. We analyze if it [an innovation] works, and if not, we know how to fix it." Respondents referred to the annual reporting requirement as the impetus for culture change, with statements such as "annual reporting helped us to stay on top of our goals" and "we're measured in a better way due to our accountability for outcomes."

Benefits for Residents and Faculty

Curriculum and Teaching in Quality Improvement and Patient Safety

More than half (53%, 8 of 15) of the participants mentioned the required curriculum and teaching in quality improvement (QI) methods as a benefit for residents. Responses included "graduates now have a [QI] skill set more advanced than some fellowships," "[residents have] benefited from learning more about QI and are 'agents of change' in our institution," "[residents have an] enhanced ability to positively impact systems of care for our patients," and "[residents will] continue quality improvement focus postgraduation."

Improved Teaching and Faculty Development

When asked about benefits for faculty, 67% (10 of 15) of the respondents reported teaching improved, and 47% (7 of 15) noted that faculty development increased. Faculty also were seen as more "connected and engaged with the program," and respondents noted that participation in the EIP "improved [faculty] ability and quality to teach" and contributed to "incredible personal growth" for faculty members because of "greater exchange of ideas with other EIP programs," increased focus on "looking at outcomes," and "free reign to grow and innovate."

Scholarly Activity for Residents and Faculty

Thirty-three percent (5 of 15) of respondents reported that residents benefited from increased opportunities to present scholarly work at regional and national meetings, and 27% (4 of 15) reported this benefit for faculty. This is consistent with another report9 in the literature documenting a significant increase in publications among EIP-participating programs relative to other similar non-EIP programs in the 5 years after EIP implementation.

Challenges to Participating in the EIP

The Challenge of Annual Reporting

The most frequently mentioned challenge, reported by 60% (9 of 15) of the program directors, was completing the EIP Annual Report Form, which included tracking of program-specific outcomes. Participants indicated the challenge of completing the annual report was greatest in the initial years of the EIP. They noted that "initially, data were difficult to retrieve from the hospital," but, over time, it became easier as programs established data-collection systems. The annual report form also underwent multiple revisions and was simplified and shortened by the RRC-IM using feedback from program directors regarding redundancy and clarity. Others found annual reporting "not overly burdensome" and noted the annual report was a "form of project management and data collection related to our ongoing initiatives and would be necessary and desirable even if we did not participate in EIP."

Costs Associated With EIP Participation

Travel and Technology

All participants reported additional costs because of participation in the EIP, and 93% (14 of 15) reported that being in EIP increased the level of resources they received from their institution. Costs resulted from additional staff time (87%, 13 of 15), hiring new staff (60%, 9 of 15), additional travel expenses (47%, 7 of 15), and expenses related to information technology support, computer software, and equipment (33%, 5 of 15). Examples of additional staff hired included hospitalists, nocturnists, PhD educators/researchers, a dedicated coordinator, a project data manager, and QI specialists. Most (80%, 12 of 15) of the program directors believed that they would not have been able to obtain the additional resources without participation in the EIP.

Effect of EIP Participation on Institutional Oversight

Mandated Oversight

Most (73%, 11 of 15) of the program directors reported increased oversight from the designated institutional official (DIO) and Graduate Medical Education Committee (GMEC). That oversight was attributed to the EIP requirement for the GMEC and DIO to review the effectiveness of the program every 2 years and the need for the DIO and GMEC to review and sign off on the annual report form. A significant percentage of respondents (67%, 10 of 15) reported that their GMECs treated "concerns" in notification letters they received after the RRC-IM reviewed their annual reports in the same way they treated "citations" in non-EIP notification letters, although the RRC-IM clarified in 2010 that these "concerns" should be viewed differently.

Dissemination of Innovation Across EIP Programs

Collaboration and Dissemination

Every program director reported specific examples of having incorporated innovations from other EIP programs into his or her program. Additionally, all program directors reported that being an EIP program benefited other residency programs in their institution, which also implemented innovations piloted in EIP programs.

Sustainability of EIP Innovations

Permanent Change

All participants agreed that they expect the innovative projects to continue beyond the 10-year life span of the EIP. Reasons included that the projects were "engrained in internal medicine," were "aligned with hospital goals," and were "a part of our environment now . . . we are not going back."

Success of the EIP

All participants agreed that the EIP was a success, and all confirmed that they would do it again. Several themes emerged from the comments. First, the requirement for annual meetings of EIP program directors led to productive collaboration and networking, evolving into "a learning community" that became a valued asset for innovation and lasting change.
Second, the EIP demonstrated the power of educational initiatives driven by patient outcomes. The emphasis on patient outcomes resulted in a culture change focused on QI and patient safety in educational programs. It also necessitated greater resident involvement in QI and patient safety initiatives at the institutional level. Third, the EIP was viewed as a benefit to programs and to the ACGME, and as a test for changes that would be implemented in the Next Accreditation System (NAS). Participants agreed the EIP was a good accreditation model, particularly for programs in good standing, because "when you have fewer site visits, it makes you less reactive, and you can think things through and make better progress."

TABLE 2
Elements of the Educational Innovations Project (EIP) and Comparison With the Next Accreditation System (NAS)

| Elements | EIP | NAS |
|---|---|---|
| Accreditation cycle | Annual reporting; Site visit in 10 y; EIP has 10-y life span | Annual reporting; Self-study visit in 10 y |
| Requirements | 40% less-detailed relative to standard program requirements at time of pilot; Programs in "good standing" eligible | Requirements categorized into "core," "outcome," or "detail"; Programs in good standing can innovate regarding "detail" requirements |
| Reporting process/schedule (outcomes reviewed annually) | Annual report form: program change information, scholarly activity, board pass rate, clinical experience data, resident survey, program-specific innovations, education outcomes, patient outcomes | Annual review of data elements in ADS: program change information, scholarly activity, board pass rate, clinical experience data, resident survey, faculty survey, Milestones |
| Institutional oversight | Annual progress reviewed by GMEC; Review of program effectiveness every other year | Annual review of ADS update; Review of program effectiveness every year; CLER visits; Review of residents' involvement in institution-wide QI and patient safety |
| Collaboration | Annual meetings with other EIP programs; Multisite collaborative projects | Collaboration within institutions; Potential for development of learning communities |

Abbreviations: ADS, Accreditation Data System; GMEC, Graduate Medical Education Committee; CLER, Clinical Learning Environment Review; QI, quality improvement.

Discussion

A key benefit of the requirements of the EIP was promoting collaboration between programs, with program directors sharing successes as well as failures. Process requirements were significantly reduced, and programs were tasked with developing innovative approaches to competency-based education that linked resident education and patient care. The resulting culture change had a profound effect on programs and the individuals in them. Residents and faculty have benefited by increasingly becoming agents of change and champions for quality. It is expected that residents will take this new QI skill set into their careers after graduation, and faculty members will, likewise, be affected throughout the balance of their careers. Based on comments from the program directors, the initiative has proven to be a fertile training ground for outcomes-driven, continuous QI in educational programming and patient care.

The EIP fostered collaboration between programs, extending beyond the cohort in the innovative intervention.10,11 More important, the EIP allowed the ACGME to test key elements of the NAS,12 including annual reporting, less-detailed requirements, a focus on QI and outcomes, and longer intervals between accreditation site visits (TABLE 2). The

experience of participating in the EIP also prepared programs for the structure and mind-set of annual reporting and monitoring in the NAS. Annual reporting has become part of the residency workflow and is viewed as a strategy for continual monitoring of QI and program effectiveness. One participant summed it up as "[You] can't tell the difference between EIP and regular residency programs now [with NAS]. [The] EIP maybe got us to the threshold, but we're past it now, and ACGME has moved in the direction of EIP."

Our study has several limitations. We used content analysis of open-ended responses to identify overarching themes. All responses were coded into categories, and only the most frequent responses were reported as overarching themes, which could have caused disparate viewpoints to be overlooked. Bias also could have been introduced because the interviews were conducted by RRC staff. Despite these limitations, our analysis of programs' experience in the EIP offers insight into the experience of program directors grappling with the challenges of changing the work of residency training, challenges that others may encounter in the NAS. The overwhelming consensus of the group was the value of collaboration during times of change and implementation of educational innovations.

Through participation in the EIP, program directors developed a new Community of Practice, with extensive collaboration promoting refinement and spread of innovations.13 This collaboration, along with accountability and opportunities to innovate, helped to empower EIP programs and contributed to positive and sometimes unexpected outcomes.

Conclusion

The experience of programs participating in the EIP offers the broader medical education community insight into the benefits and difficulties of a new approach focused on reducing burden and on outcomes, along with strategies to promote further innovation within the NAS.

References

1. Fitzgibbons J, Bordley D, Berkowitz L, Miller B, Henderson M; Association of Program Directors in Internal Medicine. Redesigning residency education in internal medicine: a position paper from the Association of Program Directors in Internal Medicine. Ann Intern Med. 2006;144(12):920–926.
2. Fitzgibbons JP, Meyers FJ. Redesigning training for internal medicine. Ann Intern Med. 2006;145(11):865–866.
3. Meyers F, Weinberger S, Fitzgibbons J, Glassroth J, Duffy F, Clayton C, et al. Redesigning residency training in internal medicine: the consensus report of the Alliance for Academic Internal Medicine Education Redesign Task Force. Acad Med. 2007;82(12):1211–1219.


4. Gorroll AH, Sirio C, Duffy FD, LeBlond RF, Alguire P, Blackwell TA, et al. A new model for accreditation of residency programs in internal medicine. Ann Intern Med. 2004;140(11):902–909.
5. Mladenovic J, Bush R, Frohna J. Internal medicine's Educational Innovations Project: improving health care and learning. Am J Med. 2009;122(4):398–404.
6. Accreditation Council for Graduate Medical Education. ACGME program requirements for graduate medical education in internal medicine Educational Innovations Project (EIP). http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/140_EIP_PR205.pdf. Effective 2005. Updated 2011. Accessed January 30, 2014.
7. Accreditation Council for Graduate Medical Education. ACGME policies and procedures. http://www.acgme.org/acgmeweb/Portals/0/PDFs/ab_ACGMEPoliciesProcedures.pdf. 2013. Effective January 31, 2014. Accessed January 31, 2014.
8. Cohen D, Crabtree B; Robert Wood Johnson Foundation. Qualitative research guidelines project. July 2006. http://www.qualres.org/HomeMemb-3696.html. Accessed June 6, 2014.
9. Thomas KG, Halvorsen AJ, West CP, Warm EJ, Vasilias J, Reynolds EE, et al. Educational Innovations Project—program participation and education publications. Am J Med. 2013;126(10):931–936.
10. Francis M, Thomas K, Langan M, Smith A, Drake S, Gwisdalla KL, et al. Clinic design, key practice metrics, and resident satisfaction in internal medicine continuity clinics: findings of the Educational Innovations Project Ambulatory Collaborative. J Grad Med Educ. 2014;5(2):249–255.
11. Meade LB, Caverzagie KJ, Swing SR, Jones RR, O'Malley CW, Yamazaki K, et al. Playing with curricular Milestones in the educational sandbox: Q-sort results from an internal medicine educational collaborative. Acad Med. 2013;88(8):1142–1148.
12. Nasca T, Philibert I, Brigham T, Flynn T. The next GME accreditation system—rationale and benefits. N Engl J Med. 2012;366(11):1051–1056.
13. Warm EJ, Logio LS, Pereira A, Buranosky R, McNeill D. The Educational Innovations Project: a community of practice. Am J Med. 2013;126(12):1145–1149.
