
Article

Trainee and Program Director Perceptions of Quality Improvement and Patient Safety Education: Preparing for the Next Accreditation System

Clinical Pediatrics 2014, Vol. 53(13) 1248-1254 © The Author(s) 2014 Reprints and permissions: sagepub.com/journalsPermissions.nav DOI: 10.1177/0009922814538701 cpj.sagepub.com

Ian S. Zenlea, MD, MPH1,2, Amy Billett, MD1,2,3, Melissa Hazen, MD1,2, Daniel B. Herrick, BA1, Mari M. Nakamura, MD, MPH1,2, Kathy J. Jenkins, MD, MPH1,2, Alan D. Woolf, MD, MPH1,2, and Jennifer C. Kesselheim, MD, MEd1,2,3

Abstract Objective. To assess the current state of quality improvement and patient safety (QIPS) education at a large teaching hospital. Methods. We surveyed 429 trainees (138 residents, 291 clinical fellows) and 38 program directors (PDs; 2 were PDs of >1 program) from 39 Accreditation Council for Graduate Medical Education–accredited training programs. Results. Twenty-nine PDs (76.3%) and 259 trainees (60.3%) responded. Most trainees (68.8%) reported participation in projects culminating in scholarly products (39.9%) or clinical innovations (44%). Most PDs reported that teaching (88.9%) and project supervision (83.3%) are performed by expert faculty. Nearly half of the PDs (45.8%) and trainees (49.6%) perceived project-based learning to be of equal value to formal curricula. Compared with trainees, a greater proportion of PDs reported needs for funding for projects, teaching faculty to provide mentorship, and faculty development (P < .05). Conclusions. Providing additional financial, administrative, and operational support could enhance the value of curricula and projects. Developing expert teaching faculty is paramount. Keywords medical education, quality improvement, patient safety, program director, trainee

Introduction

Teaching quality improvement and patient safety (QIPS) in medical education has become a national priority.1 The Association of American Medical Colleges (AAMC) endorses the integration of QIPS into the full continuum of medical education, from medical school through postgraduate training.2 Evaluating physician competency in QIPS is also a focus of the Next Accreditation System (NAS), which was developed by the Accreditation Council for Graduate Medical Education (ACGME) and implemented in July 2013; pediatrics has been an early adopter of the NAS.3 The Systems-Based Practice competencies within the NAS establish the expectation that trainees achieve proficiency in advocating for optimal patient care delivery systems, identifying medical errors, and working in interprofessional teams to enhance patient safety and care quality.4 As part of the NAS, the ACGME has established the Clinical Learning Environment

Review (CLER) program, which uses interviews with trainees, faculty, GME leadership, nursing staff, and hospital leadership to assess the graduate medical education learning environment of each institution, including efforts to engage trainees in 6 focus areas of health care quality and safety.3,5 In mid-2012, leaders in the Boston Children's Hospital (BCH) Program for Patient Safety and Quality, which provides leadership oversight and coordination for activities involving patient safety and quality within BCH, and the BCH GME Office recognized the need to better connect institutional

1 Boston Children's Hospital, Boston, MA, USA
2 Harvard Medical School, Boston, MA, USA
3 Dana-Farber/Children's Hospital Cancer Center, Boston, MA, USA

Corresponding Author: Ian S. Zenlea, Division of Endocrinology, Boston Children’s Hospital, 333 Longwood Avenue, 6th Floor, Boston, MA 02115, USA. Email: [email protected]


experts in QIPS with the community of trainees and training program directors (PDs). In response, they formed a jointly sponsored committee with diverse representation including trainees, faculty educators, and QIPS experts. In partnership with the hospital's 39 ACGME-accredited training programs, the committee is charged with developing, implementing, and assessing QIPS education for residents and fellows at BCH. An initial priority of this committee was to assess the current state of QIPS education by surveying trainees and PDs across the institution. Our aims were to assess (a) the value of formal QIPS curricula and experiential projects, (b) the need for additional formal curricula and support for experiential projects to enhance educational value, and (c) outcomes related to QIPS training, including scholarly products or clinical innovations.

Methods

Study Population

Eligible subjects were identified with the assistance of the GME Office and included 429 trainees (138 residents and 291 fellows) and 38 PDs from the 39 ACGME-accredited training programs at BCH (2 were PDs of >1 program).

Surveys

Separate surveys were developed for the PDs and trainees, but these were similar in content. The surveys for the trainees and PDs contained 35 and 24 items, respectively, that fell into 4 domains: formal QIPS curriculum, QIPS projects, outcomes of QIPS training, and demographic items. The surveys were pilot-tested among 5 members of each respective target population. The surveys were then revised to enhance item clarity and ease of response. Pilot participants were ineligible to participate in the final survey.

Survey Administration

Surveys were fielded electronically via a link distributed by e-mail in May 2013.6 Survey completion was voluntary and anonymous. Three reminders were sent over the 4-week study period to encourage participation. Trainees who completed the survey were eligible for a raffle to win 1 of 6 $25 gift cards. Survey completion by PDs was not incentivized. The study was deemed quality improvement and therefore exempt from review by the BCH Institutional Review Board.

Statistical Analyses

Descriptive statistics were calculated as proportions and means with standard deviations (SDs), as appropriate. Within-group and between-group differences were analyzed using the Student's t test or the Kruskal-Wallis 1-way analysis of variance for normally and nonnormally distributed continuous variables, respectively. The χ2 test was used to compare the proportions of binary variables between groups. A P value of less than .05 was considered statistically significant.
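For readers who want to reproduce this style of analysis, the sketch below shows how the tests named above map onto standard statistical library calls in Python with SciPy. It is illustrative only: the group sizes, scores, and counts are invented for the example and are not the study data.

# Illustrative sketch only: hypothetical values, not the study data set.
# Shows the tests named above (Student's t test, Kruskal-Wallis 1-way
# analysis of variance, and the chi-square test of proportions) with SciPy.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical continuous measure (eg, a curriculum rating) for two groups.
trainee_scores = rng.normal(loc=3.8, scale=0.6, size=50)
pd_scores = rng.normal(loc=4.0, scale=0.5, size=20)

# Student's t test for a normally distributed continuous variable.
t_stat, p_t = stats.ttest_ind(trainee_scores, pd_scores)

# Kruskal-Wallis 1-way analysis of variance for a nonnormally distributed variable.
h_stat, p_kw = stats.kruskal(trainee_scores, pd_scores)

# Chi-square test comparing the proportion of a binary response between groups,
# arranged as a 2 x 2 table of counts: [[yes, no] for each group].
counts = np.array([[30, 20],   # hypothetical trainee yes/no counts
                   [15, 5]])   # hypothetical program director yes/no counts
chi2, p_chi2, dof, expected = stats.chi2_contingency(counts)

print(f"t test P = {p_t:.3f}; Kruskal-Wallis P = {p_kw:.3f}; chi-square P = {p_chi2:.3f}")

Under the threshold stated above, any of these P values below .05 would be flagged as statistically significant.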

Results

Twenty-nine PDs (76.3%) and 259 trainees (60.3%) responded to the surveys. Respondent characteristics are summarized in Table 1.

Table 1. Characteristics of Respondents.

Characteristic | Trainees,a n (%) | Program Directors,b n (%)
Age, years, mean (SD) | | 46.5 (7.9)d
Program size | |
  <10 trainees | | 7 (24.1)
  10-20 trainees | | 14 (48.3)
  >20 trainees | 49 (18.9) | 1 (3.5)
  Unknown | 82 (31.7) | 7 (24.1)
Specialty | |
  General pediatrics | 23 (8.9) | 0 (0.0)
  Pediatric medical specialty | 104 (40.2) | 18 (62.1)
  Pediatric general surgery | 3 (1.2) | 0 (0.0)
  Pediatric surgical subspecialty | 16 (6.2) | 2 (6.9)
  Unknown | 113 (43.6) | 9 (31.0)
Trainee postgraduate year (PGY) | |
  PGY-1 | 10 (3.9) |
  PGY-2 | 20 (7.7) |
  PGY-3 | 17 (6.6) |
  PGY-4 | 19 (7.3) |
  PGY-5 | 29 (11.2) |
  PGY-6 or greater | 82 (31.7) |
  Unknown | 82 (31.7) |
Time since PD completed training | |
  1-3 years | | 2 (6.9)
  4-5 years | | 2 (6.9)
  6-10 years | | 6 (20.7)
  >10 years | | 12 (41.4)
  Unknown | | 7 (24.1)

a n = 259. b n = 29. c n = 167. d n = 17.
[Blank cells reflect values that could not be recovered from this copy of the article or that were not reported for that group.]

We received responses from 157 trainees and 27 PDs regarding how effectively various topics are addressed in their curriculum (Table 2). Both trainees and PDs reported that communication between team members is taught most effectively and that root cause analysis and systems analysis are addressed least effectively. Compared with PDs, a greater proportion of trainees reported that disclosure of medical errors was addressed effectively (n = 117, 74.5% vs n = 14, 51.9%; P = .05).

Table 2. Curriculum Topics Rated for How Effectively They Are Addressed.

Topic
Communication between team members
Teamwork
Models for quality improvement
Human factors and safety
Disclosure of a medical error
Culture of safety
Fundamentals of patient safety
Root cause analysis and systems analysis

a n = 157. b n = 27. c P < .05.
[The per-topic ratings reported by trainees and PDs could not be recovered from this copy of the article.]

Experiential Learning and Hands-on QIPS Projects

Program directors reported that QIPS projects are supervised most commonly by faculty with expertise in the subject area (n = 20, 83.3%), while a modest proportion reported that faculty with little or no expertise in the subject area also supervise the projects (n = 7, 29.2%). According to PDs, they or associate program directors supervise the QIPS projects in 20.8% (n = 5) and 8.3% (n = 2) of cases, respectively. Most trainees (68.8%, n = 139) reported that they have participated in a QIPS project. Almost half of PDs and trainees reported that project-based, experiential learning was more valuable than their formal didactic curriculum (n = 10, 41.7% and n = 47, 40.2%, respectively); a similar proportion reported that experiential learning was of equal value to the formal curriculum (n = 11, 45.8% and n = 58, 49.6%, respectively). Respondents identified a variety of additional resources that potentially could augment the educational value of projects (Table 3). Compared with trainees, a greater proportion of PDs reported the need for funding to support projects, additional teaching faculty to provide project mentorship, and more faculty development to increase expertise (Table 3).


Table 3. Resources to Improve the Educational Value of Projects.

Response | Trainees,a n (%) | Program Directors,b n (%)
Central list of projects | 63 (57.8) | 12 (54.5)
Additional non-MD personnel (eg, administrative support) | 55 (50.5) | 15 (68.2)
Support from a statistician or research methodology expert | 49 (45.0) | 10 (45.5)
Support from a QI methodology expert | 46 (42.2) | NA
Additional funding | 44 (40.4) | 16 (72.7)c
Additional teaching faculty | 29 (26.6) | 15 (68.2)c
Faculty development to increase expertise among faculty | 15 (13.8) | 11 (50.0)c

Abbreviations: MD, medical doctor; QI, quality improvement; NA, not assessed.
a n = 109. b n = 22. c P < .05.
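As a worked illustration, the χ2 comparison described in the Methods can be applied to the "Additional funding" row of Table 3 (44 of 109 trainees vs 16 of 22 PDs). This is only a sketch: the authors' exact test settings (eg, whether a continuity correction was applied) are not reported, so the computed P value is illustrative rather than a reproduction of the published analysis.

# Sketch: chi-square comparison of one Table 3 row (additional funding),
# using the counts reported in the table. Not the authors' original analysis.
from scipy.stats import chi2_contingency

trainees_yes, trainees_n = 44, 109   # trainees citing a need for additional funding
pds_yes, pds_n = 16, 22              # program directors citing the same need

table = [
    [trainees_yes, trainees_n - trainees_yes],
    [pds_yes, pds_n - pds_yes],
]

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, P = {p:.3f}")  # P < .05, consistent with footnote c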

Importance and Outcomes of Education in QIPS

Trainees and PDs agreed that QIPS education is either important or extremely important for future clinical careers (n = 161, 90.6% and n = 19, 86.4%, respectively). No trainees or PDs rated QIPS education as not important. Nearly all PDs (n = 21, 95.5%) and most trainees (n = 142, 80.2%) reported that QIPS are integral components of the overall culture of clinical medicine. QIPS education has led to the preparation of scholarly products, such as abstracts, posters, and manuscripts, by 38.8% (n = 68) of trainees and to the development of clinical innovations, such as new clinical practice guidelines, by 44% (n = 77).

Discussion

We learned that nearly all PDs and trainees at our institution recognize the importance of QIPS education for career development and future clinical practice. Moreover, they value QIPS learning gained through both formal didactic and experiential project-based approaches, the combination of which has been

Table 4. Required Institute for Healthcare Improvement Open School Modules.

Module PS102:1 "Understanding the Science of Human Factors"
Module PS101:1 "To Err Is Human"
Module QI101:1 "Errors Can Happen Anywhere—and to Anyone"
Module PS103:1 "Why Are Teamwork and Communication Important?"
Module PS103:2 "Basic Tools and Techniques for Teamwork and Communication"
Module PS103:3 "Communication During Times of Transition"
Module PS101:3 "Identifying and Reporting Errors"
Module QI102:1 "An Overview of the Model for Improvement"

suggested as the optimal format for delivering QIPS education.7 Mixed didactic and experiential learning more effectively engages learners in the practice of QIPS by involving them in the entire cycle, from identifying a problem through implementation and measurement.7-10 Our survey also revealed meaningful engagement in QIPS projects by trainees: nearly half reported that their work has culminated in academic presentations and papers or clinical innovations.

However, some trainees perceive gaps in their QIPS training. Starting in 2011, our hospital's GME Committee began to mandate completion of prescribed didactic QIPS training, comprising online teaching modules offered through the Institute for Healthcare Improvement's (IHI) Open School (Table 4),11 for all residents and clinical fellows. Although the Office of GME tracks each trainee's completion of these modules and has documented 100% compliance, some trainees report not having engaged in this training. This discrepancy may be explained by a subset of trainees who simply have not yet reached the stage of their training that includes completion of the Open School modules. It is also possible that some trainees completed the modules in the past but do not remember them. Even so, our findings suggest we could make a larger and more sustained impact on trainees through longitudinal exposure and repeated emphasis on this content throughout training.

In addition, respondents expressed a desire for more educational resources specific to pediatric practice. To address these educational gaps, we are exploring how best to deliver institution-wide, specialty-specific QIPS content, either through additional online modules or through attendance at half-day interactive workshops (Table 5). Successful models for such approaches exist both within and beyond our own institution.12,13


Table 5. Proposed and Ongoing Institutional Efforts to Enhance Quality Improvement and Patient Safety (QIPS) Education.

Efforts targeting faculty
  •  Enhance access to faculty experts in the fields of study design, biostatistics, and survey methodology
  •  Optimize teaching efforts by training faculty in pedagogies likely to be effective in QIPS teaching
  •  Encourage participation in medical simulation
  •  Strengthen our faculty mentoring program
  •  Promote existing grants programs
  •  Provide institutional assistance compiling professional portfolios
  •  Increase efforts from the hospital and medical school to formally recognize and reward QIPS activities
Efforts targeting trainees
  •  Provide administrative and operational support, input from a statistician, and guidance by a QIPS methodologist
  •  Strengthen our trainee mentoring program
  •  Promote existing grants programs
  •  Provide opportunities for medical simulation training
  •  Maintain a central list of QIPS projects
  •  Encourage a team-based approach to project implementation
  •  Develop pediatric-specific educational resources, online modules, or half-day interactive workshops
  •  Provide explicit training in standardized handoffs as part of the I-PASS project14,15

Similarly, the Office of GME monitors trainees' engagement in at least 1 QIPS project during the training experience. However, not all PDs reported formally offering experiential QIPS projects, and not all trainees reported participation in such projects. The PD response could reflect a programmatic misperception that trainees must develop projects de novo rather than participate in existing divisional or departmental QIPS activities. As with the IHI Open School modules, the trainee response might be explained by the cross-sectional sampling, in that more junior trainees might not yet have had the opportunity to participate. Some trainees' experience of QIPS projects is individual and concentrated into a single rotation rather than team-based and longitudinal.8,10,13,16 Based on our findings, we have concluded that future efforts should focus on experiential learning that is more robust, iterative, and longitudinal. These approaches may increase the value and impact of QIPS education.

In addition, trainees indicated that a central list of ongoing QIPS projects would help to facilitate participation. Such projects could be enhanced by committing additional resources, including administrative support, statistical assistance, and guidance by a QIPS methodologist, all of

which were cited by trainees and PDs as potentially helpful. The BCH Program for Patient Safety and Quality currently provides such support to many individual clinical programs. Providing access to such resources to promote new ideas for trainee QIPS projects is therefore an achievable objective. In addition, we have developed a new program, implemented in 2013, that offers small grants for QIPS projects led by trainees; 4 projects have been funded so far, and reported progress by grantees has been excellent.

A practical challenge to QIPS training is the availability of dedicated faculty with the requisite QIPS experience.9,10,17 Trainee-identified gaps in the formal curriculum, including root cause analysis and disclosure of medical errors, may be amenable to educational approaches such as case-based learning and medical simulation, both of which are being actively explored at our institution. However, such strategies require experienced and knowledgeable teachers for optimal delivery.18,19 Increasing efforts from hospitals and medical schools to formally recognize and reward QIPS activities, as well as recognition by national quality improvement and patient safety organizations,20 could foster increased QIPS expertise and participation among both faculty and trainees. It is promising for the pool of future expert faculty that approximately 40% of trainees reported generating a scholarly product or clinical innovation from QIPS projects. Similar to other institutions,21,22 faculty who focus on QIPS activities at our institution can seek promotion through the medical school's Clinical Expertise and Innovation track.23 For both faculty and trainees, increased mentoring and institutional assistance in compiling professional portfolios that document QIPS leadership roles, teaching responsibilities, and curricular development could assist those physician-scholars seeking academic promotion.20,24-26

To promote faculty engagement in QIPS activities, our institution offers a range of support, such as consultation by faculty experts in study design, biostatistics, survey methodology, health economics, and data management; scientific and academic career mentoring; QIPS courses; and internal grant programs for QIPS-related studies. Our findings demonstrate that faculty need improved awareness of and access to these services to nurture development of expertise in QIPS. Still, offering resources and educational opportunities will be only partially effective until faculty members themselves embrace QIPS expertise as essential to fulfilling their professional role, which requires fostering a culture that values and rewards success in this arena.

There are several limitations that should be considered when interpreting the results of this study. First, because the study relied on PD and trainee self-report,


the results are subject to response, recall, and social desirability biases. Additionally, because of the anonymous nature of the surveys, we were not able to evaluate variation in responses among the training programs. Last, our survey was conducted within a single institution. The strengths of our study include the good response rates from both trainees and PDs and our ability to directly compare trainee and PD responses to identify perceived gaps in QIPS education, thereby providing a more comprehensive picture of the current state of QIPS education across diverse pediatric specialties. Given the reported barriers to QIPS education in the extant literature,9,10,13,17,27 we believe that our results provide information that is valuable and relevant for other large academic medical centers.

Conclusions

We discovered that trainees and PDs at our hospital find QIPS training valuable and that it is yielding tangible outcomes for trainees, such as scholarly work, clinical innovation, and intent to integrate QIPS into their future career paths. At the same time, we identified various unmet QIPS educational needs. Although all trainees in our institution complete mandatory, formal QIPS curricula and participate in QIPS projects, a minority remain unaware of their QIPS exposure, indicating a need to reemphasize QIPS during the training experience. Enhanced teaching materials and additional financial, administrative, and operational support for projects could increase the educational value of curricula and projects. Paramount to training future physicians is the need to develop additional dedicated and skilled faculty, which requires both institutional support and academic recognition.

Acknowledgments

The authors thank the Boston Children's Hospital Program for Patient Safety and Quality for supporting the conduct of this study; Terry Noseworthy and Nancy Dunn, CPMSM, for their assistance on this project; and the members of the Graduate Medical Education Committee at Boston Children's Hospital for their thoughtful review and support.

Declaration of Conflicting Interests

The author(s) declared the following potential conflicts of interest with respect to the research, authorship, and/or publication of this article: Dr Zenlea's personal fees from the Risk Management Foundation of the Harvard Medical Institutions, Inc (CRICO RMF) were related to an invitation to speak on an unrelated topic. Dr Woolf's grant funding from Gerber is unrelated to the subject of this article. The other authors have no conflicts of interest to disclose.

Funding

The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: All phases of this study were supported by the Program for Patient Safety and Quality at Boston Children's Hospital. Dr Zenlea was supported by CRICO RMF. Dr Zenlea reports personal fees from CRICO RMF outside the submitted work. Dr Woolf reports grant funding from Gerber outside of the submitted work. The other authors have nothing to disclose.

References

1. Davis NL, Davis DA, Johnson NM, et al. Aligning academic continuing medical education with quality improvement: a model for the 21st century. Acad Med. 2013;88:1437-1441.
2. Association of American Medical Colleges. Teaching for Quality: Integrating Patient Safety and Quality Improvement Across the Continuum of Medical Education: An Expert Report. Washington, DC: Association of American Medical Colleges; 2013.
3. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—rationale and benefits. N Engl J Med. 2012;366:1051-1056.
4. Accreditation Council for Graduate Medical Education. Next Accreditation System (NAS). 2013. http://www.acgme.org/acgmeweb/tabid/435/ProgramandInstitutionalAccreditation/NextAccreditationSystem.aspx. Accessed November 27, 2013.
5. Weiss KB, Wagner R, Nasca TJ. Development, testing, and implementation of the ACGME Clinical Learning Environment Review (CLER) program. J Grad Med Educ. 2012;4:396-398.
6. SurveyMonkey Inc. SurveyMonkey. http://www.surveymonkey.com. Accessed October 15, 2013.
7. Ogrinc G, Headrick LA, Mutha S, Coleman MT, O'Donnell J, Miles PV. A framework for teaching medical students and residents about practice-based learning and improvement, synthesized from a literature review. Acad Med. 2003;78:748-756.
8. Tess AV, Yang JJ, Smith CC, Fawcett CM, Bates CK, Reynolds EE. Combining clinical microsystems and an experiential quality improvement curriculum to improve residency education in internal medicine. Acad Med. 2009;84:326-334.
9. Wong BM, Etchells EE, Kuper A, Levinson W, Shojania KG. Teaching quality improvement and patient safety to trainees: a systematic review. Acad Med. 2010;85:1425-1439.
10. Wong BM, Levinson W, Shojania KG. Quality improvement in medical education: current state and future directions. Med Educ. 2012;46:107-119.
11. Institute for Healthcare Improvement. Open School. http://www.ihi.org/offerings/ihiopenschool/Pages/default.aspx. Accessed January 24, 2014.
12. Wong BM, Goguen J, Shojania KG. Building capacity for quality: a pilot co-learning curriculum in quality improvement for faculty and resident learners. J Grad Med Educ. 2013;5:689-693.
13. Philibert I, Gonzalez Del Rey JA, Lannon C, Lieh-Lai M, Weiss KB. Quality improvement skills for pediatric residents: from lecture to implementation and sustainability. Acad Pediatr. 2014;14:40-46.
14. Starmer AJ, Sectish TC, Simon DW, et al. Rates of medical errors and preventable adverse events among hospitalized children following implementation of a resident handoff bundle. JAMA. 2013;310:2262-2270.
15. Starmer AJ, Spector ND, Srivastava R, Allen AD, Landrigan CP, Sectish TC. I-PASS, a mnemonic to standardize verbal handoffs. Pediatrics. 2012;129:201-204.
16. Neuspiel DR, Hyman D, Lane M. Quality improvement and patient safety in the pediatric ambulatory setting: current knowledge and implications for residency training. Pediatr Clin North Am. 2009;56:935-951.
17. Mann KJ, Craig MS, Moses JM. Quality improvement educational practices in pediatric residency programs: survey of pediatric program directors. Acad Pediatr. 2014;14:23-28.
18. Sukalich S, Elliott JO, Ruffner G. Teaching medical error disclosure to residents using patient-centered simulation training. Acad Med. 2014;89:136-143.
19. Stroud L, McIlroy J, Levinson W. Skills of internal medicine residents in disclosing medical errors: a study using standardized patients. Acad Med. 2009;84:1803-1808.

20. Shojania KG, Levinson W. Clinicians in quality improvement: a new career pathway in academic medicine. JAMA. 2009;301:766-768.
21. University of California, San Francisco, Department of Medicine. Systems innovation, quality improvement & patient safety portfolio. 2011. https://medicine.ucsf.edu/safety/docs/domqiportfolio-201104.pdf. Accessed March 19, 2014.
22. University of Pennsylvania Perelman School of Medicine. Committee on Appointments and Promotions Guidelines. 2013. http://somapps.med.upenn.edu/fapd/documents/pl00030.pdf. Accessed March 19, 2014.
23. Harvard Medical School and Harvard School of Dental Medicine. Areas of excellence: criteria for appointment and promotion. 2008. http://facultypromotions.hms.harvard.edu/index.php?page=AE. Accessed December 2, 2013.
24. Levinson W, Rothman AI, Phillipson E. Creative professional activity: an additional platform for promotion of faculty. Acad Med. 2006;81:568-570.
25. Simpson D, Hafler J, Brown D, Wilkerson L. Documentation systems for educators seeking academic promotion in U.S. medical schools. Acad Med. 2004;79:783-790.
26. Taylor BB, Parekh V, Estrada CA, Schleyer A, Sharpe B. Documenting quality improvement and patient safety efforts: the quality portfolio. A statement from the Academic Hospitalist Taskforce. J Gen Intern Med. 2014;29:214-218.
27. Craig MS, Garfunkel LC, Baldwin CD, et al. Pediatric resident education in quality improvement (QI): a national survey. Acad Pediatr. 2014;14:54-61.
