Innovation Report

The McMaster Modular Assessment Program (McMAP): A Theoretically Grounded Work-Based Assessment System for an Emergency Medicine Residency Program

Teresa Chan, MD, and Jonathan Sherbino, MD, MEd, for the McMAP Collaborators

Abstract

Problem
To assess resident competence, generalist programs such as emergency medicine (EM), which cover a broad content and skills base, require a substantial number of work-based assessments (WBAs) that integrate qualitative and quantitative data.

Approach
The McMaster Modular Assessment Program (McMAP), implemented in McMaster University's Royal College EM residency program in 2011–2012, is a programmatic assessment system that collects and aggregates data from 42 WBA instruments aligned with EM tasks and mapped to the CanMEDS competency framework. These instruments incorporate task-specific checklists, behaviorally anchored task-specific and global performance ratings, and written comments. They are completed by faculty following direct observation of residents during shifts. The rotation preceptor uses aggregated data to complete an end-of-rotation report for each resident in the form of a qualitative global assessment of performance.

Outcomes
The quality of end-of-rotation reports—as measured by comparing report quality one year prior to and one year after McMAP implementation using the Completed Clinical Evaluation Report Rating tool—has improved significantly (P < .001). This may be a result of basing McMAP's end-of-rotation reports on robust documentation of performance by multiple raters throughout a rotation rather than relying on a single faculty member's recall at rotation's end as in the previous system.

Next Steps
By aligning theory-based assessment instruments with authentic EM work-based tasks, McMAP has changed the residency program's culture to normalize daily feedback. Next steps include determining how to handle "big data" in assessment and delineating policies for promotion decisions.

T. Chan is assistant professor, Division of Emergency Medicine, Department of Medicine, McMaster University Michael G. DeGroote School of Medicine, Hamilton, Ontario, Canada, and student, Master of Health Professions Education Program, University of Illinois College of Medicine, Chicago, Illinois. J. Sherbino is associate professor, Division of Emergency Medicine, Department of Medicine, McMaster University Michael G. DeGroote School of Medicine, and adjunct scientist, Program for Educational Research and Development, McMaster University, Hamilton, Ontario, Canada.

Correspondence should be addressed to Teresa Chan, 237 Barton St. E, Hamilton General Hospital, McMaster Clinic, Room 254, Hamilton, ON, L8L 2X2, Canada; telephone: (905) 521-2100, ext. 76207; e-mail: [email protected].

Acad Med. 2015;90:900–905. First published online April 15, 2015. doi: 10.1097/ACM.0000000000000707

Supplemental digital content for this article is available at http://links.lww.com/ACADMED/A271.

Problem

The shift to competency-based medical education (CBME) is ushering in a need for frequent, criterion-based, authentic assessments of learners that incorporate qualitative measures.1 These qualitative measures will play significant roles in elucidating resident performance for assessment.2 To assess resident competence, generalist programs such as emergency medicine (EM)—which must cover a broad content and skills base—require a substantial number of work-based assessment (WBA) instruments that may include qualitative as well as quantitative measures. In this report, we describe the development of the McMaster Modular Assessment Program (McMAP), a novel WBA system designed to integrate both quantitative and qualitative measures to generate robust reports on resident performance in the McMaster University Royal College EM residency program. We report our initial experience using McMAP to assess the performance of residents at the junior and intermediate levels (roughly postgraduate years 1 and 2).

Approach

The junior and intermediate levels of McMAP were developed in 2010–2011 by a team of 27 medical educators, education scientists, and residents from six institutions in Canada and the United States. This project was granted an ethics exemption by the Hamilton Integrated Research Ethics Board.

McMAP development

The McMaster University EM residency program's previous resident assessment system consisted of end-of-rotation reports that contained 67 Likert-scaled items and 2 optional narrative comment fields. These reports were filled out by a single faculty member at the end of a monthlong rotation. There was no system in place to record day-to-day observations to inform these end-of-rotation reports.

In 2010, our residency program completed a targeted needs assessment to define perceived and identified needs for our resident assessment system. As part of this quality improvement effort, 36/53 faculty members (68%) and 30/31 residents (97%) participated in a series of focus groups and completed a survey. The themes that were identified and triangulated via cross-referencing with the assessment literature are included in Supplemental Digital Appendix 1, available at http://links.lww.com/ACADMED/A271.


With the results of this needs assessment in mind, a group of educators at the McMaster University EM residency program collaborated with educators in the EM residency programs at the University of Alberta and the University of Saskatchewan to develop an assessment program organized around the CanMEDS physician competency framework and based on educational theory from the assessment literature. Two-person teams were tasked with developing eight EM-specific WBA instruments. These instruments were structured as focused, partial mini-clinical evaluation exercises (CEXs)—essentially “micro”-CEXs—that could be used in a busy emergency department (ED) environment. Each instrument was mapped to a physician role from the CanMEDS competency framework at the junior level and the intermediate level. Each instrument was designed to provide a template for clinical faculty to assess resident competence in a key EM clinical task (e.g., performing a history, charting, obtaining consent for therapy) via direct observation. All instruments were peer reviewed and refined based on feedback.

In total, 52 WBA instruments were originally created or adapted from existing assessment instruments. All of these instruments were reviewed by an international panel consisting of four clinical content experts (two American and two Canadian attending EM physicians) and two EM residents (one American and one Canadian). The Americans were from the EM residency programs of Louisiana State University, Michigan State University, and Oregon Health & Science University. Each of these reviewers completed a Q-sort, matching the WBA instruments to either the CanMEDS or the Accreditation Council for Graduate Medical Education competency framework. This method was used to check the system for its relevance (i.e., adequate sampling of competencies), as well as its representativeness across postgraduate years and competency frameworks. Ten instruments were removed during this peer-review process.

The final 42 WBA instruments comprising McMAP were bundled along common themes (i.e., CanMEDS roles) for use during our residency program's four-week rotations. Our residents follow a preset, annual schedule with different CanMEDS roles emphasized each month. The instruments are delivered in a deliberate manner over two years. The 42 WBA instruments are divided into groups of 7 to 8 (i.e., blocks) that emphasize two CanMEDS roles at a time to accommodate focused assessment of practice within a rotation. Each instrument is repeated at least twice per year to allow for convenience sampling based on case presentations that arise each day. This approach facilitates a spiral curriculum with a return to common tasks for greater depth of learning.
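The block structure just described can be pictured as a simple schedule data structure. The sketch below is purely illustrative: the role pairings and instrument names are invented placeholders rather than the actual McMAP blocks, and the article does not describe any software representation of the schedule. It shows only the kind of bookkeeping the description implies, including a check that each instrument recurs at least twice per year.

```python
# Illustrative sketch of the block structure described above; the role pairings and
# instrument names are placeholders, not the actual McMAP blocks.
from collections import Counter

blocks = [
    {"block": 1, "roles": ("Medical Expert", "Communicator"),
     "instruments": ["Focused history", "Discharge instructions", "Charting",
                     "Obtaining consent", "Handover", "Breaking bad news", "ECG interpretation"]},
    {"block": 2, "roles": ("Collaborator", "Manager"),
     "instruments": ["Consultation request", "Charting", "Flow management",
                     "Handover", "Team leadership", "Disposition planning", "Focused history"]},
    # ... the remaining blocks would follow the same pattern until all 42 instruments are covered
]

def check_minimum_repetition(blocks, minimum=2):
    """Return any instruments scheduled fewer than `minimum` times across the year."""
    counts = Counter(name for b in blocks for name in b["instruments"])
    return {name: n for name, n in counts.items() if n < minimum}

# With only two illustrative blocks listed, several placeholders will be flagged here;
# a full year's schedule would be expected to return an empty dictionary.
under_sampled = check_minimum_repetition(blocks)
if under_sampled:
    print("Instruments scheduled fewer than twice:", under_sampled)
```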

McMAP components

Every shift, residents are observed by an attending physician, who rates their performance of a specific, defined task and their global performance. The observation and documentation takes the faculty member 5 to 10 minutes per resident per shift. (For a sample daily task checklist and rating instrument, see Appendix 1. For a sample daily global rating instrument, see Appendix 2.)

Specific tasks. Assessment tasks and criterion standards are mapped to level of training, providing milestones for performance. Most of the instruments (38/42; 90%) include a structured task checklist, and all instruments use behaviorally anchored scales, which guide faculty assessors to ensure a shared mental model among faculty raters.3 The criterion-based, standardized anchors that define each level of achievement (i.e., "needs assistance" through "ready for the next level") assist faculty members in rating residents' performance of tasks in a consistent manner. Nearly all of the instruments (40/42; 95%) facilitate opportunistic direct observation; only 2 (5%) allow for a simulation option (e.g., a hypothetical response in lieu of a real patient case). Figure 1 shows the distribution of instruments by their primary CanMEDS roles for postgraduate years 1 and 2.

Global performance. The daily global rating instrument is completed to capture the resident's global performance of all tasks during that shift. It uses milestone-based behavioral anchors that align to a progression of competence (ranging from "needs assistance" to "ready to be an intermediate resident" or "ready to be a senior resident"). This global rating allows the faculty member to assess the entirety of a learner's behavior during a shift, beyond the specific task emphasized that day.



Figure 1 (bar chart; x-axis: CanMEDS roles, y-axis: percentage distribution). Distribution of the 42 McMaster Modular Assessment Program (McMAP) work-based assessment instruments by the primary CanMEDS roles to which they map, by level of training, McMaster University emergency medicine residency program. PGY indicates postgraduate year.


Exceptional events. In 2012, an exceptional events reporting system was added, separate from the daily WBA instrument, to increase documentation of exceptional performance. This facet of the assessment system allows an attending physician to confidentially submit information about exceptionally good and bad performance to a third-party mediator. Thus far, this reporting system has shown promise in increasing the sensitivity of our assessment system to detect outlier behaviors from residents.

The role of qualitative data

For a McMAP task or global rating instrument to be complete, the rater is required to provide narrative comments to augment the numerical scores. These qualitative data provide a "thick" description of resident performance and prompt formative feedback at the end of the shift.2

The role of "choice architecture"

To ensure that we continually evaluate and improve McMAP, we are using a continuous quality improvement (CQI) process. Via surveys and focus groups of residents and faculty, we have found that the biggest draw of McMAP is the translation of physician competencies (e.g., CanMEDS roles) into clinically identifiable, EM-specific tasks. The use of behaviorally anchored scales that guide faculty assessors, the use of checklists that deconstruct tasks for residents (and junior faculty) into simpler subelements, and the inclusion of mandatory qualitative comments are examples of the use of choice architecture in McMAP. These features focus faculty members and residents toward best assessment practices. In essence, these assessment tasks serve as a form of "just-in-time" faculty development, providing guidance to clinical teachers to help them diagnose areas of concern and achievement for residents.

In addition, the alignment of our WBAs to authentic EM tasks, rather than to generic physician competencies, helps faculty generate more specific and actionable feedback. Whereas our previous system was organized around general CanMEDS roles (e.g., Communicator), McMAP focuses on observable EM tasks (e.g., charting) that can be reliably mapped back to a specific CanMEDS role. Finally, multiple assessments of the components of each CanMEDS role allow for more reliable and specific assessment of competence across the entire physician competency framework.


Reports

All assessments are directly entered into McMAP's online portal (http://mcmapevents.wix.com/portal), which aggregates the information into password-protected personalized databases that residents can review as desired. At the end of each monthlong rotation, all data from the instruments completed for each resident (approximately 15 task-specific ratings and 15 global performance ratings, with 30 narrative comments) are compiled from this electronic data collection system to generate a draft report of each resident's performance. Incomplete assessments and borderline/failing ratings are highlighted. Taken together, these reports result in a sampling of different skill sets throughout the year, with each block emphasizing different tasks, which in turn are assembled to provide an overall picture of each resident's performance over multiple months.

Each resident's aggregated data report from the end of a monthlong rotation is provided to the rotation's preceptor, who performs a thematic analysis. This results in a qualitative end-of-rotation report that discusses (1) the resident's performance on the specific tasks that map to a specific CanMEDS role emphasized during the rotation; (2) the resident's global performance; and (3) tailored advice for the resident for continuous learner improvement. The preceptor flags marginal performances for review by the residency education committee's assessment subcommittee, which recommends remediation plans.

Decision-making processes

To make promotion and remediation decisions, we use a mixed model that incorporates both cut points (i.e., preestablished minimal standards for resident performance) and jury-based review (in which major stakeholders, including residents and faculty, assess the aggregated evidence from the successive months to inform decisions about promotion or remediation). This mixed model is new and replaces a system wherein only the program director and assistant program director would periodically review resident reports and/or exam scores. In 2015, our residency program adopted a continuous review system (overseen by the assessment subcommittee) that flags anomalies in resident performance and advises the main decision-making body (the residency education committee).
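The report-generation step described above can be read as a straightforward aggregation pipeline. The sketch below is a minimal illustration under assumed names (ShiftAssessment, BORDERLINE_CUTOFF, and the flagging rule are hypothetical); the article does not specify the portal's data model or the exact cut points used to highlight incomplete or borderline ratings.

```python
# Minimal sketch of aggregating shift-level WBA records into a draft end-of-rotation
# summary. Field names and the flagging threshold are illustrative assumptions only.
from dataclasses import dataclass
from statistics import mean
from typing import Optional

@dataclass
class ShiftAssessment:
    resident: str
    task: str                       # e.g., "Discharge instructions"
    canmeds_role: str               # primary CanMEDS role the task maps to
    task_rating: Optional[int]      # 1-7 behaviorally anchored scale (None if not completed)
    global_rating: Optional[int]    # 1-7 daily global rating (None if not completed)
    comment: str                    # mandatory narrative comment

BORDERLINE_CUTOFF = 3  # assumed threshold for flagging borderline/failing ratings

def draft_rotation_report(records: list) -> dict:
    """Compile one resident's shift records into a draft end-of-rotation summary."""
    complete = [r for r in records if r.task_rating is not None and r.global_rating is not None]
    flagged = [r for r in records
               if r.task_rating is None or r.global_rating is None
               or min(r.task_rating or 7, r.global_rating or 7) <= BORDERLINE_CUTOFF]
    by_role = {}
    for r in complete:
        by_role.setdefault(r.canmeds_role, []).append(r.task_rating)
    return {
        "n_shifts": len(records),
        "n_complete": len(complete),
        "mean_task_rating_by_role": {role: round(mean(v), 1) for role, v in by_role.items()},
        "mean_global_rating": round(mean(r.global_rating for r in complete), 1) if complete else None,
        "narrative_comments": [r.comment for r in records if r.comment],
        "flagged_for_review": flagged,  # incomplete or borderline/failing entries
    }
```

In the program as described, the numeric summary is only a starting point; it is the preceptor's thematic analysis of the narrative comments that produces the qualitative end-of-rotation report.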

Outcomes

McMAP was piloted for junior- and intermediate-level residents in academic year 2011–2012. In the first nine months, we gathered more than 4,000 data points for 15 residents in postgraduate years 1 and 2. These data points were 38% qualitative (written comments) and 62% quantitative (completed checklists, ratings of tasks or daily global performance). Our system generated 64 aggregated score reports and 64 end-of-rotation reports.

To determine the efficacy of McMAP, we audited the quality of the end-of-rotation reports using the Completed Clinical Evaluation Report Rating (CCERR) tool.4 The CCERR tool, a nine-item scoring system to evaluate the quality of end-of-rotation reports, has been previously validated across a wide range of specialties and has demonstrated high reliability.4

We compared CCERR scores of end-of-rotation reports from before and after the introduction of McMAP. We randomly selected 25 end-of-rotation reports for postgraduate year 1 and 2 residents from a pre-McMAP year (2010–2011, the year before McMAP was introduced) and an early McMAP year (2012–2013, the year after McMAP was piloted). Unique identifiers were redacted from all of these reports. All 50 reports were independently scored by two investigators (T.C., J.S.) using the CCERR tool. The level of agreement between the two raters on the CCERR scale was high (Cronbach alpha = 0.92; df = 49, P < .001). There was a doubling of median CCERR scores from the pre-McMAP year to the early McMAP year (13.8/45 [interquartile range = 11.3–15.8] versus 27.5/45 [interquartile range = 20.5–23.5]; P < .001).

Next Steps

Handling "big data" in assessment

McMAP generates a large volume of mixed assessment data (> 4,000 data points for 15 residents in nine months). The medical education literature, however, mainly describes processes for handling big data in relation to standardized examinations, which are not intended for either longitudinal assessment or mixed forms of data. Effective promotion or remediation decisions will require credible interpretation of data, which is incumbent on (1) data representation and score compilation and (2) policies based on defined markers of competence. McMAP data lend themselves to analysis of an individual resident's learning trajectory, but there are only a few published processes (none of them validated) in medical education to guide the use of such metrics.

Policies for promotion decisions

Without policies to guide promotion decisions, McMAP is merely a comprehensive data collection system. McMAP end-of-rotation reports provide the residency education committee's assessment subcommittee with robust data to inform decisions about promotion or remediation. However, to convert McMAP into a true CBME system, policies that allow for the tailored progression of learners as a function of their assessments and learning trajectories are required. Recognizing that competence is often task specific, and not generalizable across all EM domains, we must determine at what point a resident should be permitted to advance to a more senior role. For example, does every milestone need to be achieved for a particular stage of training to justify promotion, or must a resident achieve only a key, representative sampling of milestones? These questions have yet to be answered.
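The two policy options posed above could each be made concrete as an explicit decision rule. The sketch below contrasts them under invented milestone names, an assumed 1 to 7 scale, and arbitrary thresholds; it is a thought experiment, not the program's adopted promotion policy.

```python
# Two hypothetical promotion rules contrasting the questions raised above.
# Milestone names, the 1-7 scale, and the thresholds are illustrative assumptions.

READY_THRESHOLD = 6  # assumed "ready for the next level" region of a 1-7 anchored scale

def promote_if_all_milestones(ratings: dict, threshold: float = READY_THRESHOLD) -> bool:
    """Strict rule: every assessed milestone must meet the threshold."""
    return bool(ratings) and all(score >= threshold for score in ratings.values())

def promote_if_representative_sample(ratings: dict, key_milestones: set,
                                     threshold: float = READY_THRESHOLD,
                                     proportion: float = 0.8) -> bool:
    """Sampling rule: all key milestones, plus a set proportion of the rest, meet the threshold."""
    if not all(ratings.get(m, 0) >= threshold for m in key_milestones):
        return False
    others = [score for m, score in ratings.items() if m not in key_milestones]
    return not others or sum(score >= threshold for score in others) / len(others) >= proportion

# Example: mean end-of-year ratings per milestone for one fictional resident
ratings = {"Discharge instructions": 6.4, "Handover": 6.1, "Charting": 5.8, "Consent": 6.5}
print(promote_if_all_milestones(ratings))   # False: "Charting" falls below the assumed threshold
print(promote_if_representative_sample(
    ratings, {"Handover", "Consent"}, proportion=0.5))  # True: key milestones met, half of the rest at threshold
```

Either rule is easy to compute from aggregated McMAP data; the open question the authors raise is which rule (and which thresholds) a residency program can defend.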

Conclusions

McMAP provides a functioning model for a WBA system that incorporates both task-specific and global assessments of resident performance and generates a significant amount of specific and informative qualitative and quantitative data on each trainee. The assessment instruments provide faculty assessors with rubrics to align their frames of reference and provide just-in-time faculty development. By aligning assessment instruments with authentic EM work-based tasks, this novel system has changed our residency program's culture to normalize daily feedback.

Acknowledgments: The McMaster Modular Assessment Program (McMAP) Collaborators are a team of 25 educators and education scientists and 2 residents from three Canadian universities (McMaster University, the University of Alberta, and the University of Saskatchewan) and three U.S. universities (Louisiana State University, Michigan State University, and Oregon Health & Science University) who developed and reviewed the McMAP instruments. The authors would like to acknowledge the hard work of their fellow McMAP Collaborators (M. Ackerman, J. Cherian, N. Delbel, K. Dong, S. Dong, K. Hawley, M. Jalayer, B. Judge, R. Kerr, A. Kirkham, N. Lalani, A.R. Mallin, S. McClennan, P. Miller, A. Pardhan, G. Rutledge, K. Schiff, D. Sehdev, T. Swoboda, S. Upadhye, R. Valani, C. Wallner, M. Welsford, R. Woods, and A. Zaki). They also wish to thank the McMaster University Division of Emergency Medicine administrators (Teresa Vallera, Melissa Hymers, Neha Dharwan, and Amanda Li). In addition, the authors thank their friends and research colleagues, Dr. Kelly Dore, Dr. Geoff Norman, and Dr. Meghan McConnell, for their advice on this project. Finally, the authors would like to thank Dr. Ian Preyra (former program director of the Royal College Emergency Medicine Program), Dr. Alim Pardhan (program director of the Royal College Emergency Medicine Program), and Dr. Karen Schiff (associate program director of the Royal College Emergency Medicine Program) for providing the support, time, and mandate to implement McMAP.

Funding/Support: This program was supported by the McMaster University Division of Emergency Medicine within the Department of Medicine.

Other disclosures: T. Chan has been supported by the Royal College of Physicians and Surgeons of Canada's Fellowship for Studies in Medical Education. J. Sherbino is a clinician educator at the Royal College of Physicians and Surgeons of Canada. They have both received grants from the Royal College of Physicians and Surgeons of Canada.


Ethical approval: This project and all of its component analyses received an exemption from the Hamilton Integrated Research Ethics Board.

References

1 Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR. The role of assessment in competency-based medical education. Med Teach. 2010;32:676–682.
2 Hodges B. Assessment in the post-psychometric era: Learning to love the subjective and collective. Med Teach. 2013;35:564–568.
3 Kogan JR, Conforti L, Bernabeo E, Iobst W, Holmboe E. Opening the black box of clinical skills assessment via observation: A conceptual model. Med Educ. 2011;45:1048–1060.
4 Dudek NL, Marks MB, Wood TJ, Lee AC. Assessing the quality of supervisors' completed clinical evaluation reports. Med Educ. 2008;42:816–822.
5 Moonen-van Loon JM, Overeem K, Donkers HH, van der Vleuten CP, Driessen EW. Composite reliability of a workplace-based assessment toolbox for postgraduate medical education. Adv Health Sci Educ Theory Pract. 2013;18:1087–1102.

Appendix 1
Sample Intermediate-Level McMaster Modular Assessment Program (McMAP) Daily Task Checklist and Rating Instrument, McMaster University Emergency Medicine Residency Program (a,b)

Name of Assessor: ___________________ Date: _________________

Minor Task | Discharge Instructions
Today's focus is on discharge instructions.

Checklist (mark each item: Done / Done but needs attention / Not done / N/A for case)
•  Informs patient of results of any investigations in simple language
•  Informs patient of diagnosis (if possible), other possible diagnoses, and describes prognosis (if possible)
•  Informs patient of care plan (overall)
   (a) explains any prescriptions (rationale for use, potential side effects)
   (b) logistics of follow-up (confirm phone number, give consultant contact info, explains how to return for next-day testing)
   (c) contingency plan (return instructions, symptoms of serious diagnosis or complication)
•  Ensures patient understands diagnosis and care plan

Rate this task | Circle the number (1–7) that best describes level of proficiency

1 = Needs assistance. Resident displays any of the below:
•  Significant gaps in discharge instruction (see checklist)
•  Overly medicalized jargon
•  Confusing to patient
•  Conflict arose and escalated

Midscale. Resident displays most of the below:
•  Inefficient with time
•  Used complicated concepts or jargon at times
•  Patient's questions were answered most of the time
•  Eventually arrived at a plan that was amenable to all parties

7 = Ready for the next level. Resident displays ALL of the below:
•  Time efficient
•  Catered to patient's level of understanding and needs
•  Answered questions from patient and/or family
•  Arrived at a plan that was amenable to all parties easily

The evidence: Please provide an example with an explanation that supports your rating (mandatory).

The next step: Based on the above evidence, please give one specific suggestion (educational prescription) for the resident to attempt during his/her next shift. (You do not need to record this.)

a The task checklist assists with creating a shared mental model of the ideal; it also provides guides for raters and helps diagnose a learner's areas for improvement. (Some tasks do not have checklists.) The task rating's behaviorally anchored rubric contains specific explanations that allow raters to home in on expectations tailored to the learner's level of training. The comments area requires raters to provide qualitative commentary on performance ("the evidence") and advice for proceeding forward.
b The daily task checklist and task rating are completed in conjunction with an end-of-shift assessment of global performance. For the global rating instrument, see Appendix 2.



Appendix 2
Sample Intermediate-Level McMaster Modular Assessment Program (McMAP) Daily Global Rating Instrument, McMaster University Emergency Medicine Residency Program (a,b)

Intermediate Daily Global Rating | Circle the number (1–7) that best describes level of proficiency

1 = Needs assistance. Any of the following apply to the intermediate resident:
•  Displays major areas of knowledge deficit*
•  Displays major weaknesses with functioning in the ED environment (culture, logistics, collaboration)
•  Requires input, revision, intervention, or attentive supervision from attending throughout shift
•  Performs actions that place patients at risk*
•  Has lapses in professional behavior*
•  Ineffectively or offensively communicates with patient(s) or colleague(s)*
•  Shows lack of insight into own limitations or knowledge gaps*

Midscale. Most of the following apply to the intermediate resident:
•  Integrates well within the ED environment (culture, logistics, collaboration)
•  Has appropriate intermediate-level knowledge of EM evidence and basic science
•  Independently and accurately examines, diagnoses, and determines case plan for noncritically ill patient(s)
•  Performs basic procedures safely with minimal supervision
•  Effectively communicates with patient and colleagues (e.g., forms effective working relationships)
•  Is consistently professional
•  Develops a plan to begin remedying knowledge gaps, limitations, deficits in exposure

7 = Ready to be a senior resident. The intermediate resident displays ALL of the following:
•  Functioning proficiently and efficiently in the ED environment (culture, logistics, collaboration)
•  Displays exceptional and nuanced knowledge of EM evidence and basic science*
•  Able to independently and accurately examine, diagnose, and determine care plan for most patients (including the critically ill)
•  Able to perform procedures safely with minimal supervision
•  Communicates efficiently with patient and colleagues (displays empathy and forms good rapport)
•  Role models exceptional professional behavior*
•  Skilled at reflective practice and insight into own limitations, knowledge gaps; able to self-identify and plan for continued improvement

*Must comment below or flag this through the exceptional events system

Comments/Concerns/Feedback: ________________________________________

Date: _______________  Assessor Initials: _______  Resident Initials: _______

Abbreviations: ED indicates emergency department; EM, emergency medicine.
a The global rating, or shift rating, encompasses all aspects of the CanMEDS competency framework.5 The behaviorally anchored rubric contains specific explanations tailored to the learner's level of training. The comments area requires raters to provide qualitative commentary on overall performance during the whole shift to allow tracking of other competencies not specifically observed in the day's task.
b The daily global rating is completed by the faculty rater at the end of the shift. Earlier in the shift, the faculty rater would have completed the specific task worksheet. For a sample daily task checklist and rating instrument, see Appendix 1.
