LETTER TO THE EDITOR

Letter to the Editor Response

Dear Editor,

We would like to thank Jones and Phillips for their response to our review article entitled "Reflections on Competency-Based Education and Training for Surgical Residents."1 We largely agree with many of the points they raise. In particular, we agree that there can be a tendency to use workplace-based assessments (WPBAs) as summative evaluations that form the basis for decisions, rather than as low-stakes learning opportunities.

Although it is true that some WPBAs may have been "adapted to surgical procedures without any real evidence of their validity," this is not the case for all of them, and it is important to note that there are many different types of WPBAs. Although many were developed primarily to provide formative feedback, some were intended (or have been adapted) to help inform higher-stakes decisions.2-4 Case-based discussions were created to assess aspects of clinical judgment and decision making through a structured discussion between trainees and their supervisors.5 This conversation is intended to allow immediate, formative feedback to be given and to help trainees improve their performance. By contrast, although procedure-based assessments (PBAs) can facilitate formative feedback, they were originally developed to assess competence in performing specific procedures.4 PBAs were intended to be used frequently over a period of several years to build a portfolio of evidence that could guide summative decisions regarding proficiency, certification, and preparedness for independent practice.3,4,6 The direct observation of procedural skills is another tool that assesses trainees' competence in performing procedures and is meant to facilitate formative feedback.3 However, unlike PBAs, the direct observation of procedural skills is not well suited to informing high-stakes decisions, as it is not criterion referenced.3 Other WPBAs include the clinical evaluation exercise and multi-source feedback. The former is intended to provide immediate verbal feedback for learning. Although the latter also generates feedback, it may be used to determine whether there are serious concerns about a trainee's morality and whether further action is required.2

It is therefore important to recognize that WPBAs are not homogeneous. They should be considered separately, and educators must ensure that the correct tools are being used in the proper contexts, whether to provide feedback or to help inform summative judgments.


Although a variety of such tools exist, at least in Canada there is currently a paucity of low-stakes assessments being conducted throughout training. Supervisors often view assessments as time consuming, burdensome, and not essential to learning.7,8 In response to these issues, the Royal College of Physicians and Surgeons of Canada has recently called for a focus on improving formative assessment practices in postgraduate medical education, suggesting that low-stakes assessments become fully integrated into all aspects of clinical teaching.9 Although this need has been widely recognized, it may be challenging to implement in practice.

We believe there is a great need for a cultural shift in surgery (and in medicine in general) toward the use of assessment instruments and feedback as a primary mechanism for learning and skills development, a "stepping stone" to help trainees along their journey rather than a barrier to the next step in their training. Viewed through this lens, WPBAs may best be used as part of a portfolio-based approach to assessment, in which they are but one piece of an assessment puzzle that spans the entire length of training. Van der Vleuten et al.10 suggest a programmatic approach whereby frequent assessments are carried out longitudinally and used both to enhance learning and to inform certification decisions. They propose that individual assessments might be aggregated to provide a body of evidence to support higher-stakes pass or fail decisions. This approach has certainly been effective in the pilot competency-based curriculum in orthopedics currently being tested at the University of Toronto. It is our hope that more widespread adoption of a model that considers performance over an extended period of time may help lessen reliance on terminal, summative examinations while providing essential feedback to trainees.9 This might enable a paradigm shift among both trainees and training programs toward one that has meaningful formative assessment as a core learning mechanism.

Yours sincerely,

Ranil R. Sonnadara, PhD,*,† Carween Mui, MD(c),‡ Sydney McQueen, MSc(c),* Polina Mironova, MSc(c),†,§ Markku Nousiainen, MD,† Oleg Safir, MD,†,§ William Kraemer, MD,† Peter Ferguson, MD,†,§ Benjamin Alman, MD,†,‖ Richard Reznick, MD¶

* Department of Surgery, McMaster University, Hamilton, Ontario, Canada




† Department of Surgery, University of Toronto, Toronto, Ontario, Canada
‡ The University of British Columbia, Vancouver, British Columbia, Canada
§ Mount Sinai Hospital, Toronto, Ontario, Canada
‖ Department of Surgery, Duke University, Durham, North Carolina, USA
¶ Department of Surgery, Queen's University, Kingston, Ontario, Canada

Correspondence: Inquiries to Dr. Ranil Sonnadara, Department of Surgery, McMaster University, A. N. Bourns Science Building, Room 131, 1280 Main Street West, Hamilton, Ontario, L8S 4K1, Canada; E-mail: [email protected]

REFERENCES

1. Sonnadara RR, Mui C, McQueen S, et al. Reflections on competency-based education and training for surgical residents. J Surg Educ. 2014;71(1):151-158.
2. Workplace Based Assessments. Intercollegiate Surgical Curriculum Programme. Website of The Royal College of Surgeons of England. Available at: 〈http://www.iscp.ac.uk/surgical/assessment_wba.aspx/〉. Accessed 28.03.14.
3. Memon MA, Brigden D, Subramanya MS, Memon B. Assessing the surgeon's technical skills: analysis of the available tools. Acad Med. 2010;85(5):869-880.
4. Pitts D, Rowley D. Competence evaluation in orthopaedics—a "bottom up" approach. In: Flin R, Mitchell L, editors. Safer Surgery: Analysing Behaviour in the Operating Theatre. UK: Ashgate; 2009. p. 27-46.
5. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE guide no. 31. Med Teach. 2007;29(9-10):855-871.
6. Marriott J, Purdie H, Crossley J, Beard JD. Evaluation of procedure-based assessment for assessing trainees' skills in the operating theatre. Br J Surg. 2011;98(3):450-457.
7. Wanzel KR, Ward M, Reznick RK. Teaching the surgical craft: from selection to certification. Curr Prob Surg. 2002;39(6):573-659.
8. Gosman GG, Simhan HN, Guido RS, Lee TTM, Mansuria SM, Sanfilippo JS. Focused assessment of surgical performance: difficulty with faculty compliance. Am J Obstet Gynecol. 2005;193(5):1811-1816.
9. Future of Medical Education in Canada: Assessment. Royal College White Paper Series. In: Royal College of Physicians and Surgeons of Canada [website of the RCPSC]. Available at: 〈http://www.royalcollege.ca/portal/page/portal/rc/common/documents/educational_initiatives/assessment.pdf/〉. Accessed 10.11.13.
10. Van der Vleuten CPM, Schuwirth LWT, Driessen EW, et al. A model for programmatic assessment fit for purpose. Med Teach. 2012;34(3):205-214.
