Systematic review

Systematic review of skills transfer after surgical simulation-based training

S. R. Dawe1, G. N. Pena1,2, J. A. Windsor4, J. A. J. L. Broeders2, P. C. Cregan3, P. J. Hewett2 and G. J. Maddern1,2

1Australian Safety and Efficacy Register of New Interventional Procedures – Surgical (ASERNIP-S), Royal Australasian College of Surgeons, and 2Discipline of Surgery, University of Adelaide, Queen Elizabeth Hospital, Adelaide, South Australia, and 3Department of Surgery, University of Sydney, Nepean Clinical School, Penrith, New South Wales, Australia, and 4Department of Surgery, University of Auckland, Auckland City Hospital, Auckland, New Zealand

Correspondence to: Professor G. J. Maddern, Australian Safety and Efficacy Register of New Interventional Procedures – Surgical (ASERNIP-S), Royal Australasian College of Surgeons, 199 Ward Street, North Adelaide, South Australia 5006, Australia (e-mail: [email protected])

Background: Simulation-based training assumes that skills are directly transferable to the patient-based setting, but few studies have correlated simulated performance with surgical performance.

Methods: A systematic search strategy was undertaken to find studies published since the last systematic review in 2007. Inclusion of articles was determined using a predetermined protocol, independent assessment by two reviewers and a final consensus decision. Studies that reported on the use of surgical simulation-based training and assessed the transferability of the acquired skills to a patient-based setting were included.

Results: Twenty-seven randomized clinical trials and seven non-randomized comparative studies were included. Fourteen studies investigated laparoscopic procedures, 13 endoscopic procedures and seven other procedures. These studies provided strong evidence that participants who reached proficiency in simulation-based training performed better in the patient-based setting than their counterparts who did not have simulation-based training. Simulation-based training was as effective as patient-based training for colonoscopy, laparoscopic camera navigation and endoscopic sinus surgery in the patient-based setting.

Conclusion: These studies strengthen the evidence that simulation-based training, as part of a structured programme and incorporating predetermined proficiency levels, results in skills transfer to the operative setting.

Paper accepted 30 January 2014. Published online 15 May 2014 in Wiley Online Library (www.bjs.co.uk). DOI: 10.1002/bjs.9482. BJS 2014; 101: 1063–1076. © 2014 BJS Society Ltd. Published by John Wiley & Sons Ltd.

Introduction

Simulation-based training allows trainees to learn technical and non-technical skills without risking patient safety1,2. It is increasingly being incorporated into surgical training or mandated by registration bodies3. Before simulation is incorporated into training curricula, important questions must be answered: does it work (train the appropriate skills) and how well does it work (does it improve skills performance in the patient-based setting, and how strong is the evidence4,5)?

A systematic review6 published in 2007, which included 12 randomized clinical trials (RCTs) and two non-randomized comparative studies, found that surgical simulation-based training appeared to result in skills transfer to the patient-based setting. However, the included studies were limited by design and variable quality, which reduced the strength of the conclusions. The emergence of new information is important7, and the publication of studies in the past 6 years, during which there have been improvements in simulation-based training programmes, may have modified this conclusion. An important aspect has been the requirement to reach a predetermined level of proficiency in simulation-based training (often related to the ‘expert’ surgeon’s level)8–11 before the trainee proceeds to the patient-based setting. Whether the skills learnt by a trainee in surgical simulation-based training transfer to performance in the patient-based setting must therefore be tested against standards that meet patient need. Recent studies acknowledge the importance of objective assessment for measuring skills competence in the patient-based setting, which will enable the definition of proficiency thresholds and comparison between simulator and clinical training. A number of procedure-specific objective global assessment scales have been developed and validated, such as the Global Operative Assessment of Laparoscopic Skills (GOALS)12 and the Objective Structured Assessment of Technical Skills (OSATS)13. Additional measures of competence that have been applied to the determination of technical skills transfer include time to complete the procedure, error rates, degree of bimanual dexterity, whether patient discomfort is avoided, level of operator confidence, and patient complication rates during and after the procedure.

This systematic review aimed to determine whether skills acquired through surgical simulation-based training transfer effectively to the patient-based setting.

Methods

This updated systematic review aimed to search for and identify new evidence to incorporate into the previously completed systematic review6. The update was designed to meet the same five criteria as the original systematic review: a focused clinical question; an explicit search strategy; the use of explicit, reproducible and uniformly applied criteria for article selection; critical appraisal of the included studies; and qualitative or quantitative data synthesis14. The literature search strategy, inclusion and exclusion criteria, and data extraction and analysis methods followed exactly those in the original systematic review. Studies included in the present review were published after the previous search had been completed. In this review, ‘patient-based setting’ covers terms such as the operating room (OR), the procedure room, clinical setting and operative setting.

All RCTs and non-randomized comparative studies (non-RCTs) reporting the use of surgical simulation-based training and assessing the transfer of surgical skills to the patient-based setting were included for review. A detailed electronic search was carried out in the following databases: Embase, Cumulative Index to Nursing and Allied Health Literature (CINAHL), PubMed, the Cochrane Library and Current Contents, Clinical Trials Database (US), NHS Centre for Research and Dissemination Databases (UK), National Research Register (UK), Meta Register of Controlled Trials, and the Australian Clinical Trials Registry. The search terms used were (surg* and simulat*) and (skill* or train*). No language limitation was applied to the search. Foreign-language papers were subsequently excluded unless the findings provided additional information over that reported in well designed studies published in the English language. All studies published from January 2007 to August 2013 were considered (studies published earlier were included in the previous review6). Hand-searching was then undertaken to locate articles that may have been missed by electronic database searches.

Inclusion and exclusion criteria

Included studies contained information on measures of task performance in the patient-based setting. Assessment methods comprised global rating score, pass/fail score, time to complete task/procedure and errors/patient discomfort. No restriction on the types of skill trained or assessed was imposed. Studies that measured performance only via simulation or were restricted to animals were excluded. Studies in which simulation was used for warm-up or procedure rehearsal were not included. Two reviewers independently examined all retrieved studies, and any disagreement over inclusion or exclusion was discussed and a consensus reached.

Data extraction and analysis

Data from all included studies were extracted by one researcher and checked by a second using standard data extraction tables developed a priori. Each included study was appraised critically for study quality and assigned a level of evidence according to the hierarchy of evidence developed by the National Health and Medical Research Council of Australia15. Study quality was assessed according to the methods given in the Cochrane reviewers’ handbook16 on a number of parameters, including quality of the study methodology reporting, methods of randomization and allocation concealment (for RCTs), blinding of trainers and outcomes assessors, and sample sizes. It was judged that no data were suitable for statistical pooling owing to the heterogeneity of the results.

Results

From 5450 potentially relevant articles, a total of 34 studies were identified: 27 RCTs and seven non-RCTs that investigated skills transfer after simulation-based training for 19 procedures/tasks were included in this review (Fig. 1). Fourteen studies17–30 investigated laparoscopic procedures (Table 1). The procedures/tasks studied were laparoscopic cholecystectomy17,19,20,22,25,30, laparoscopic tubal ligation18,21, salpingectomy23, total extraperitoneal inguinal hernia repair27, intracorporeal knot suturing during Nissen fundoplication24,26, right hemicolectomy29 and camera navigation28. Thirteen studies31–43 investigated endoscopic procedures, including colonoscopy36,40,43, oesophagogastroduodenoscopy32,38,39,41, cystourethroscopy37, flexible transnasal laryngoscopy31,35, endoscopic sinus surgery33,42 and transurethral resection of prostate (TURP)34 (Table 2). Seven studies44–50 investigated other procedures, namely knee arthroscopy46, abdominal fascial closure49, cardiac catheterization44, superficial femoral artery angioplasty47 and cataract surgery45,48,50 (Table 3).

Fig. 1 Flow chart showing selection of articles for review:
  Potentially relevant articles identified and screened for retrieval: n = 5450 (plus n = 8 identified from pearling)
  Excluded after application of exclusion criteria: n = 5203 (duplicates n = 1652; title/abstract not relevant n = 3551)
  Articles retrieved for more detailed evaluation: n = 255
  Excluded: n = 221 (assessed via simulation n = 59; reviews/editorials/comments n = 98; assessed in animals n = 32; no control group n = 10; not surgical n = 7; no paper available (abstract only) n = 6; duplicates (before 2007) n = 2; use of simulation for warm-up or rehearsal n = 6; not in English n = 1)
  Potentially appropriate articles to be included with usable information: n = 34 (randomized clinical trials n = 27; non-randomized comparative studies n = 7)

Participants included interns and trainees in general surgery, gynaecology, urology, ophthalmology, otolaryngology, orthopaedics, internal medicine and gastroenterology. Medical students were the novices for three studies involving camera navigation28 and transnasal flexible endoscopy31,35. Six20,31,33,34,36,38 of the 27 RCTs did not report the method of randomization, and only three21,23,27 reported clearly how the randomization process, including sequence generation, allocation concealment and implementation, occurred. Power calculation was reported by 15 RCTs21,25,27–32,34,36,37,41,43,44,49. The sample sizes within the included studies were generally small, but four RCTs21,27,31,37 had samples of more than 25 participants per group. Main findings of the studies are summarized in Tables 1–3. More detailed information is available online (Tables S1–S3, supporting information).

Studies reporting overall performance

Eighteen18–21,23–25,27,29,30,34,36,37,43,44,46,47,49 of the 34 studies reported an overall performance parameter. This was a global summary of all objective performance parameters measured during patient-based assessment procedures or the assessor’s evaluation of overall performance. One additional study22 used a validated global rating scale for individual parameters but did not report an overall performance score (or sum the individual scores). Although the majority of studies used a validated global rating scale score, the scales differed for each procedure, and reporting differed even within procedures (for example mean or median score). One study40 reported an overall performance accuracy score, another38 reported a competence score and one41 described a skills score on a visual analogue scale rather than a global rating score; these are also reported below.

Laparoscopic procedures (11 studies)

Nine RCTs18,20,21,23–25,27,29,30 and one non-RCT19 reported that simulator-trained participants scored significantly higher than control participants in global performance for the assessment procedures. Eight18–21,23–25,27 of these studies compared simulator-trained participants with controls who did not have this training. Two studies29,30 compared the performance of participants who had simulation training, as part of a comprehensive curriculum, in addition to residency training with participants who had only conventional residency training.

In addition to comparing simulator-trained participants with controls without simulation training, two studies19,24 also compared participants trained in differing simulator modalities. One study19 reported that simulator-trained participants scored significantly higher in the GOALS assessment than those with no simulator training, and found no significant difference between groups trained with different simulators (McGill Inanimate System for Training and Evaluation of Laparoscopic Skills (MISTELS) – Fundamentals of Laparoscopic Surgery (FLS); LAP Mentor™, Simbionix, Cleveland, Ohio, USA). A study24 comprising a randomized arm (comparing simulation-trained groups) and a non-randomized arm (comparing simulation-trained groups with a control group) compared intraoperative performance for intracorporeal knot-tying between participants trained using a LapSim® virtual reality (VR) simulator (Surgical Science, Gothenburg, Sweden), participants trained using a laparoscopic box trainer, and a control group without simulation training; it found no significant difference between the VR-trained group and the box-trained group using the OSATS global rating scale. Both simulator-trained groups performed significantly better than the control group. Participants who had not had simulation training achieved the proficiency level equivalent to that of simulator-trained participants after six repetitions in the OR. One RCT22 reported no difference in the objective assessment of technical skills in the OR for cholecystectomy between simulator-trained participants and controls who had no simulation training in any of the five GOALS domains (Table 1).

Table 1 Laparoscopic procedures

Simulation training versus no simulation training

Ahlberg et al.17 (2007), Sweden; RCT, LOE II. Participants: surgical residents PGY 1–2; IG 7, LapSim® VR simulator; CG 6. Assessment: cholecystectomy. Results: IG made fewer errors for entire procedure (P = 0·004), exposure (P = 0·040), clipping and tissue division (P < 0·008) and dissection (P < 0·031) compared with CG. Procedure time shorter for IG than CG but not statistically different (P = 0·059).

Banks et al.18 (2007), USA; RCT, LOE II. Participants: interns PGY 1; IG 10, Limbs and Things laparoscopic simulator; CG 10. Assessment: bilateral tubal ligation. Results: IG scored higher than CG with all 3 evaluation tools: task-specific checklist (P = 0·002), OSATS (P = 0·003), pass–fail grade (P = 0·003).

Beyer et al.19 (2011), France; non-RCT, LOE III-2. Participants: general surgery or gynaecology residents; IG1 6, FLS Training Box simulator; IG2 6, LAP Mentor™ VR simulator; CG 7. Assessment: cholecystectomy (dissection of vesicular bed). Results: improvement in GOALS scores in IG1 (P = 0·04) and IG2 (P = 0·03) but not in CG in 2nd evaluation (P = 0·35). No significant difference between IG1 and IG2 (P = 0·28).

Cosman et al.20 (2007), Australia; RCT, LOE II. Participants: basic surgical trainees; IG 5, LapSim® VR simulator; CG 5. Assessment: cholecystectomy (clip application and division of cystic artery). Results: IG had fewer intraoperative errors for entire procedure (P = 0·05), better bimanual coordination (P = 0·05) and higher global score (P = 0·04) than CG. Procedure time shorter for IG than CG but not statistically different (P = 0·075).

Gala et al.21 (2013), USA; RCT, LOE II. Participants: gynaecology residents; IG 48, FLS Training Box simulator; CG 54. Assessment: Pomeroy bilateral tubal ligation. Results: IG had higher OSATS progression score than CG (P = 0·03). Controlling for baseline OSATS score, being in IG was associated with an average increase of 2·2 (95% c.i. 1·29 to 3·2) points in final score.

Hogle et al.22 (2009) (Study 1), USA; RCT, LOE II. Participants: surgical residents PGY 1; IG 6, LapSim® VR simulator; CG 6. Assessment: cholecystectomy. Results: no significant difference between IG and CG in 5 GOALS domains: depth perception (P = 0·99), bimanual dexterity (P = 0·55), efficiency (P = 0·93), tissue handling (P = 0·56), autonomy (P = 0·85).

Larsen et al.23 (2009), Denmark; RCT, LOE II. Participants: 1st and 2nd year registrars in gynaecology and obstetrics; IG 13, LapSim® VR simulator; CG 11. Assessment: salpingectomy. Results: IG had higher score than CG on OSA-LS scale (P < 0·001). IG completed procedure in half the time compared with CG (P < 0·001).

Orzech et al.24 (2012), Canada; non-RCT*, LOE III-2. Participants: general surgery residents; IG1 10, FLS Training Box simulator; IG2 10, LapSim® VR simulator; CG 6. Assessment: Nissen fundoplication (placement of intracorporeally knotted sutures). Results: no significant differences between IG1 and IG2 for time (P = 0·74), global rating score (P = 0·65) or checklist score (P = 0·97); however, both IGs performed significantly faster and better (OSATS and checklist) than CG (P not reported). The transfer effectiveness ratio was 1·13 for IG1 and 2·31 for IG2.

Sroka et al.25 (2010), Canada; RCT, LOE II. Participants: general surgical residents PGY 1–3; IG 8, FLS Training Box simulator; CG 8. Assessment: cholecystectomy (dissection of vesicular bed). Results: IG attained higher total GOALS score than CG (P < 0·001); GOALS domains: depth perception (P = 0·08), bimanual dexterity (P = 0·04), efficiency (P = 0·24), tissue handling (P = 0·04), autonomy (P = 0·58).

Van Sickle et al.26 (2008), USA; RCT, LOE II. Participants: senior surgical residents PGY 3, 5–6; IG 11, MIST-VR™ simulator; CG 11. Assessment: Nissen fundoplication (placement of intracorporeal sutures). Results: IG completed task in less time (P < 0·003), committed fewer suturing errors (P < 0·01) and had fewer excess needle manipulations (P < 0·05) than CG.

Zendejas et al.27 (2011), USA; RCT, LOE II. Participants: general surgery residents; IG 26, Guildford MATTU TEP hernia task trainer; CG 24. Assessment: TEP inguinal hernia repair. Results: IG performed faster in 1st TEP procedure after randomization (P < 0·001) and had higher participation rates (P < 0·001). In subsequent repairs IG remained faster than CG. GOALS score higher in IG for 1st procedure (95% c.i. 2·1 to 5·1; P = 0·001) and for all TEP procedures after randomization (95% c.i. 2·1 to 5·1; P < 0·001). Complications and overnight stay less likely for 1st TEP procedure after randomization in IG (P < 0·05). No statistical difference for subsequent procedures.

Simulation training versus patient-based training

Franzeck et al.28 (2012), Switzerland; RCT, LOE II. Participants: medical students; IG 12, LAP Mentor™ and ProMIS™ surgical hybrid simulator; CG 12, traditional training in OR. Assessment: camera navigation in OR during procedure. Results: no significant difference between groups in any parameter after training: organ visualization (P = 0·45), horizon alignment (P = 0·08), time to completion (P = 0·12) and correct scope rotation handling (P = 0·60). Participants in both groups spent equal time actually training on camera navigation (P = 0·20). However, CG spent significantly more overall time in OR than IG spent in skills laboratory (P < 0·01).

Simulation training as part of comprehensive curriculum in addition to residency training versus conventional residency training

Palter and Grantcharov29 (2012), Canada; RCT, LOE II. Participants: general surgery residents PGY 2–4; IG 9, curriculum including simulation training on LapSim® VR simulator; CG 9. Assessment: right hemicolectomy. Results: IG attained higher level of technical proficiency than CG: OSATS score (P = 0·030), procedure-specific score (P = 0·122). IG residents able to perform more operative steps than CG residents (P = 0·021).

Palter et al.30 (2013), Canada; RCT, LOE II. Participants: general surgery residents PGY 1–2; IG 9, curriculum including simulation training on LapSim® VR simulator and on FLS Training Box simulator; CG 9. Assessment: cholecystectomy. Results: IG outperformed CG in the first 4 laparoscopic cholecystectomies measured on OSATS rating scale (P = 0·004, P = 0·036, P = 0·021, P = 0·023). No significant difference in score between groups for 5th procedure (P = 0·065).

*This study comprised a randomized arm (comparing simulation-trained groups), for which sample size analysis was done, and a non-randomized arm (comparing simulation-trained groups with control group). LOE, level of evidence according to National Health and Medical Research Council of Australia15; RCT, randomized clinical trial; PGY, postgraduate year; IG, intervention group; CG, control group; VR, virtual reality; OSATS, Objective Structured Assessment of Technical Skills; GOALS, Global Operative Assessment of Laparoscopic Skills; c.i., confidence interval; OSA-LS, Objective Structured Assessment of Laparoscopic Salpingectomy; TEP, totally extraperitoneal; OR, operating room. Simulators: LapSim® VR simulator (Surgical Science, Gothenburg, Sweden); laparoscopic simulator and Minimal Access Therapy Unit (MATTU) (Limbs and Things, Bristol, UK); Fundamentals of Laparoscopic Surgery (FLS) Training Box simulator (SAGES, Los Angeles, California, USA); LAP Mentor™ VR simulator (Simbionix, Cleveland, Ohio, USA); Minimally Invasive Surgical Trainer – Virtual Reality (MIST-VR™; Mentice, Gothenburg, Sweden); ProMIS™ surgical hybrid simulator (Haptica, Dublin, Ireland).

Table 2 Endoscopic procedures

Simulation training versus no simulation training

Deutschmann et al.31 (2013), Canada; RCT, LOE II. Participants: medical students and junior residents on otolaryngology rotation; IG 32, low-fidelity transnasal fibreoptic flexible laryngoscopy simulator; CG 33. Assessment: transnasal fibreoptic flexible laryngoscopy. Results: no significant differences in any quantitative or qualitative measures between groups (P varied from 0·36 for time to visualization of glottis to 1·0 for patient willingness to repeat examination). Two-way ANOVA with interaction used to explore effect of both simulation and endoscopy. No additional benefit, above and beyond repeat endoscopy, was conferred by use of simulator (P varied from 0·10 for time to visualization of glottis to 0·92 for patient comfort assessed by investigator).

Ferlitsch et al.32 (2010), Austria; RCT, LOE II. Participants: residents in internal medicine (at least 3rd year residents); IG 14, GI Mentor™ VR endoscopy simulator; CG 14. Assessment: oesophagogastroduodenoscopy. Results: IG took less time to reach duodenum (P < 0·001) and to complete examination (P = 0·012) and had better technical accuracy (P < 0·02) than CG in the first 10 endoscopic examinations in patients. Diagnostic accuracy did not differ between groups. At advanced training (51st to 60th examination) investigation times were still shorter in IG (P < 0·003 for time to reach duodenum and total time). Technical and diagnostic accuracy differences between groups no longer found. No significant differences in discomfort and pain scores between groups for first 10 and for 51st to 60th endoscopies.

Fried et al.33 (2010), USA; RCT, LOE II. Participants: otolaryngology residents PGY 1–2; IG 12, ES3 Endoscopic Sinus Surgery Simulator; CG 13. Assessment: endoscopic sinus surgery. Results: IG faster than CG for mucosal injection time (P = 0·003) and dissection time (P < 0·001). IG had fewer injection errors (P = 0·048), greater surgical confidence during dissection (P = 0·009) and higher level of dexterity with instrument manipulation (P = 0·011) than CG. CG had lower average error count per min during navigation (P = 0·032).

Källström et al.34 (2010), Sweden; RCT, LOE II. Participants: urology residents; IG 11, PelvicVision® VR simulator; CG 12. Assessment: transurethral resection of prostate. Results: analysis of effect of simulation done by comparing change in skills for each participant. IG had longer operating time than CG (P = 0·025). No significant difference for all other parameters. Greater number of participants improved after simulation practice compared with those without simulation practice (P = 0·021).

Ossowski et al.35 (2008), USA; RCT, LOE II. Participants: medical students; IG 10, nasal model endoscopic simulator; CG 10. Assessment: flexible laryngoscopy on a single standardized patient. Results: no significant difference between groups in flexible laryngoscopy procedure time on standardized patient (P = 0·315) or discomfort score assigned by that patient (P = 0·448).

Park et al.36 (2007), Canada; RCT, LOE II. Participants: general surgery and internal medicine residents PGY 1–3; IG 12, AccuTouch® VR colonoscopy simulator; CG 12. Assessment: colonoscopy. Results: IG scored significantly higher than CG in global rating of performance (P = 0·04). One resident from CG and none from IG independently reached the caecum in the allotted time. No critical flaws in either group.

Schout et al.37 (2010), The Netherlands; RCT, LOE II. Participants: interns; IG 50, URO Mentor™ VR endourological simulator; CG 50. Assessment: flexible cystourethroscopy. Results: IG scored higher than CG in global rating scale (P < 0·01). IG scored higher in all 5 domains (respect for tissue, time and motion, handling endoscope, flow of procedure, and knowledge of procedure).

Sedlack38 (2007), USA; RCT, LOE II. Participants: first year gastroenterology fellows; IG 4, GI Mentor™ II VR endoscopy simulator; CG 4. Assessment: oesophagogastroduodenoscopy. Results: CG attained superior scores in recognizing patient discomfort than IG (P = 0·015); no significant difference in all other scores. On days 1–5 of patient-based evaluation, median performance scores were significantly better in CG than IG for use of sedation (P = 0·019) and recognizing patient discomfort (P < 0·005). During days 6–10, CG had higher ratings than IG in independence (P = 0·033) and competence (P = 0·009). By days 11–15, no significant difference in performance ratings between groups.

Shirai et al.39 (2008), Japan; RCT, LOE II. Participants: medical residents PGY 1 or 2; IG 10, GI Mentor™ II VR endoscopic simulator; CG 10. Assessment: oesophagogastroduodenoscopy. Results: IG scores significantly higher than CG scores for 5 of 11 tasks investigated: insertion of endoscope into oesophagus (P < 0·05), passing OGJ (P < 0·01), passing through pyloric ring (P < 0·05), examination of duodenal bulb (P < 0·05) and viewing fornix (P < 0·05). Direct assistance by supervisor required significantly less for IG than CG (P = 0·002). No difference in total time between groups.

Yi et al.40 (2008), South Korea; non-RCT, LOE III-2. Participants: 6 fellows and 5 residents; IG 5, KAIST-Ewha Colonoscopy Simulator II; CG 6. Assessment: colonoscopy. Results: IG significantly outperformed CG in terms of insertion time (P = 0·028), success rate (P = 0·006), number of red-outs (P = 0·002), number of air inflations (P = 0·043), mucosal visualization (P = 0·002) and overall performance accuracy (P < 0·001). No statistical difference in numbers of loop formations, abdominal pressure applications, and changes in patient’s posture. Less anal discomfort in IG than CG (P = 0·002).

Simulation training versus patient-based training versus simulation plus patient-based training

Ende et al.41 (2012), Germany; RCT, LOE II. Participants: medicine and surgery residents; IG1 (clinical plus simulator training) 10, plastic phantom, compactEASIE® and GI Mentor™; IG2 (simulator training) 9, simulator training as for IG1 but no clinical training in endoscopy; CG (clinical training alone) 9. Assessment: oesophagogastroduodenoscopy. Results: IG1 needed less time to intubate the oesophagus than other groups (P = 0·02). Visualized surface during procedure similar in all groups (P = 0·211). Skills score IG1 better than for IG2 (P = 0·035 and P = 0·004 respectively for blind and unblinded assessment); no statistical difference for comparison of these groups with CG.

Simulation training versus patient-based training

Fried et al.42 (2012), USA; non-RCT, LOE III-2. Participants: otorhinolaryngology residents PGY 1–3; IG 8, ES3 Endoscopic Sinus Surgery Simulator; CG 6. Assessment: endoscopic sinus surgery. Results: no significant difference between groups in initial (pretraining) scores. IG had statistically significant better scores in all final (post-training) navigation tasks (P not reported).

Haycock et al.43 (2010), UK; RCT, LOE II. Participants: novice colonoscopists (physicians, surgeons, nurses); IG 18, Olympus colonoscopy simulator Endo TS-1; CG 18. Assessment: colonoscopy. Results: no significant differences between simulator- and patient-trained groups in case completion (P = 0·51), maximum tip position (P = 0·73), time taken (P = 0·11), straight insertion depth (P = 0·35), JAG DOPS score (P = 0·92) and global score (P = 0·35).

LOE, level of evidence according to National Health and Medical Research Council of Australia15; RCT, randomized clinical trial; IG, intervention group; CG, control group; VR, virtual reality; PGY, postgraduate year; OGJ, oesophagogastric junction; JAG DOPS, UK Joint Advisory Group on Gastrointestinal Endoscopy Direct Observation of Procedural Skills. Simulators: GI Mentor™ VR endoscopic simulator and URO Mentor™ VR endourological simulator (Simbionix, Cleveland, Ohio, USA); ES3 Endoscopic Sinus Surgery Simulator (Lockheed Martin, Akron, Ohio, USA); PelvicVision® VR simulator (Melerit Medical, Linköping, Sweden); AccuTouch® colonoscopy VR simulator (CAE Healthcare, Montreal, Quebec, Canada); KAIST-Ewha Colonoscopy Simulator II (Korea Advanced Institute of Science and Technology – Ewha Woman’s University, Seoul, South Korea); compact Erlangen Active Simulator for Interventional Endoscopy (compactEASIE®; ECE Training, Erlangen, Germany); Endo TS-1 colonoscopy simulator (Olympus KeyMed, Southend-on-Sea, UK).

Endoscopic procedures (7 studies) Two RCTs36,37 reported higher global rating scores for simulator-trained participants than participants with no simulation training for colonoscopy36 and cystourethroscopy37 . Yi et al.40 reported higher overall performance accuracy during colonoscopy for participants who received simulation training. Three studies34,41,43 reported no overall score difference between a simulation-trained group and a control group for endoscopic procedures, although in two of these41,43 the control group was made up of participants who had patientbased training, suggesting that simulation had provided an equivalent skills transfer. One of the above41 , which compared skills scores for oesophagogastroduodenoscopy, found no difference between participants who received simulation-based training (either alone or in addition to

patient-based training) and those who had patient-based training alone. A further study38 assessed competence in performing oesophagogastroduodenoscopy and reported no overall difference between groups. However, during patient-based assessment on days 6–10, participants who did not receive simulation-based training outperformed those with training (Table 2).

© 2014 BJS Society Ltd Published by John Wiley & Sons Ltd

www.bjs.co.uk

Other procedures (4 studies)

Three RCTs46,47,49 reported that simulation-based training before patient-based assessment resulted in higher scores than no training on global rating scales, for diagnostic knee arthroscopy46, abdominal fascial closure49 and superficial femoral artery angioplasty47. In one study investigating cardiac catheterization44, simulator-trained participants achieved a greater improvement in global rating score compared with the control group, but the


Table 3  Other procedures

Simulation training versus no simulation training

Bagai et al.44 (2012), Canada. RCT; LOE II.
Participants and simulator: cardiology trainees; IG 11, CG 15; VIST® VR simulator.
Assessment procedure: diagnostic cardiac catheterization.
Results: IG had a greater change in technical performance score, assessed by task-specific checklist, from baseline to 1 week compared with CG (P = 0·04). IG had a greater improvement in global rating score than CG, but the difference was not significant (P = 0·11).

Belyea et al.45 (2011), USA. Non-RCT; LOE III-3.
Participants and simulator: 3rd-year ophthalmology residents; IG 17, CG 25; Eyesi VR surgical simulator.
Assessment procedure: phacoemulsification for cataract surgery.
Results: IG had lower mean phaco time (P = 0·002), phaco power (P < 0·001) and adjusted phaco time (P < 0·001) than CG. No difference between groups in complication rate or grade.

Howells et al.46 (2008), UK. RCT; LOE II.
Participants and simulator: junior orthopaedic surgeons; IG 10, CG 10; Sawbones® arthroscopic knee simulator.
Assessment procedure: diagnostic knee arthroscopy.
Results: IG had a higher proportion of satisfactory scores on the OCAP checklist (P < 0·007) and scored higher on the OSATS global rating scale (P = 0·001).

Hseino et al.47 (2012), Ireland. RCT; LOE II.
Participants and simulator: 1st-year general surgery registrars; IG 5, CG 5; VIST® VR simulator.
Assessment procedure: superficial femoral artery angioplasty.
Results: IG scored higher than CG on a procedure-specific checklist (P = 0·001) and a 12-item global rating scale (P = 0·003).

McCannel et al.48 (2013), USA. Non-RCT; LOE III-3.
Participants and simulator: ophthalmology residents (evaluation of procedures performed before and after introduction of CITC); IG 23, CG 25; Eyesi VR surgical simulator.
Assessment procedure: phacoemulsification for cataract surgery.
Results: IG had a lower proportion of errant CCC than CG (P < 0·001). The proportion of errant CCC was also analysed by simulator use during the whole study period: absence of Eyesi use and Eyesi use in an unstructured manner had similar errant CCC rates (P = 0·75), whereas structured use of Eyesi with a proficiency curriculum resulted in significantly lower rates of errant CCC compared with either of these situations (P < 0·001).

Palter et al.49 (2011), Canada. RCT; LOE II.
Participants and simulator: general surgery and gynaecology residents, PGY 1; IG 9, CG 9; synthetic abdominal wall model.
Assessment procedure: abdominal fascial closure while listening to a script.
Results: IG had a higher OSATS score than CG (P = 0·04). Cognitive skills were higher for IG than CG (P = 0·03).

Pokroy et al.50 (2013), USA. Non-RCT; LOE III-3.
Participants and simulator: ophthalmology residents (evaluation of procedures performed before and after introduction of simulation training); IG 10, CG 10; Eyesi VR surgical simulator.
Assessment procedure: phacoemulsification for cataract surgery.
Results: no statistically significant differences between groups in mean operating time (P = 0·24) or number of posterior capsule ruptures (P = 0·63). IG had shorter mean surgical times than CG beyond the first 10 procedures (P = 0·005).

LOE, level of evidence according to National Health and Medical Research Council of Australia15; RCT, randomized clinical trial; IG, intervention group; VR, virtual reality; CG, control group; OCAP, Orthopaedic Competence Assessment Project; OSATS, Objective Structured Assessment of Technical Skills; CITC, capsulorrhexis intensive training curriculum; CCC, continuous curvilinear capsulorrhexis; PGY, postgraduate year. Simulators: VIST® VR vascular intervention simulation trainer (Mentice, Gothenburg, Sweden); Eyesi VR surgical simulator (VR Magic, Mannheim, Germany); Sawbones® arthroscopic knee bench-top simulator (Sawbones, Malmö, Sweden).

difference did not reach statistical significance. Hseino and colleagues47 and Bagai et al.44 also assessed participants using a procedure-specific checklist; the difference between groups was significant in favour of the participants who trained on the vascular intervention VR simulator (Table 3).

Studies reporting performance time

Eighteen17,20,23,24,26–28,31–35,39–41,43,45,50 of the 34 studies reported performance time, defined as the time taken, in minutes or seconds, to conduct a patient-based assessment procedure/task.

Laparoscopic procedures (7 studies)

Three RCTs23,26,27 and one observational study24 reported that simulator-trained participants completed laparoscopic procedures/tasks in the OR in significantly less time than those with no previous simulation training. One study28 found no significant difference between simulator-trained and patient-trained participants in the time taken to complete post-training camera navigation. Participants in both groups spent equal time training



on camera navigation, although the patient-trained participants spent significantly more overall time in the OR than the simulator-trained participants spent in the skills laboratory/OR. The authors concluded that traditional training in the OR was not as time-efficient as simulator-based training. Two studies17,20 reported that simulator-trained participants performed a cholecystectomy task in the OR in less time than control participants, although this was not statistically significant (Table 1).

Endoscopic procedures (9 studies)

Two RCTs32,33 and one observational study40 reported that simulator-trained participants completed endoscopic procedures/tasks in significantly less time than participants with no simulation training. One study32 found that simulator-trained participants performed upper gastrointestinal endoscopy significantly faster than participants without training in their first ten examinations, and that this difference persisted at the advanced training stage (51st to 60th patient examinations). Ende and co-workers41 reported no significant difference in overall time to perform an oesophagogastroduodenoscopy between simulator-trained participants, participants who had received patient-based training and those who had received simulator- plus patient-based training (both interventions). Those who received both interventions, however, required significantly less time to intubate the oesophagus than those who had only one type of training. One study43 found no significant difference in time taken to perform colonoscopy between simulation-trained and patient-trained participants. Three studies31,35,39 reported no significant difference in time taken for flexible transnasal laryngoscopy31,35 and oesophagogastroduodenoscopy39 between participants with simulation training and those without. One study34 reported a significant difference in transurethral prostate resection performance time for the first procedure after baseline assessment, in favour of those without simulation training (Table 2).

Other procedures (2 studies)

Two non-randomized studies45,50 reported that simulator-trained participants completed phacoemulsification for cataract surgery in significantly less time than participants with no simulator training (Table 3).
Studies reporting success rate

Success rate was defined as either the percentage of participants who were able to complete the patient-based assessment as specified in the methods, the number of participants who were able to complete the case independently without assistance from the supervising surgeon, or the number of participants who were given a pass grade. Thirteen studies18,22,25,29,32,34,36,38–41,43,47 reported success rates.

Laparoscopic procedures (4 studies)

One RCT18 reported that all of the simulation-trained group passed the patient-based assessment procedure for laparoscopic bilateral tubal ligation, compared with only 30 per cent of the control group without simulation training. Another study29 reported that participants who received simulation-based training as part of a comprehensive curriculum were able to perform significantly more operative steps during a right colectomy than conventionally trained personnel. 'Autonomy' was included in most objective global assessment scores, but the result was reported separately by only two RCTs22,25, for laparoscopic cholecystectomy; no significant difference was found between simulation-trained participants and those with no simulation training (Table 1).

Endoscopic procedures (8 studies)

Two RCTs32,39 and one observational study40 reported that significantly more simulator-trained participants completed the endoscopic procedure/task, or needed less assistance, compared with those with no simulator training. In one of these studies32, simulator-trained participants had a better intubation rate, needing less assistance during the first ten oesophagogastroduodenoscopies; this difference was no longer detectable after 60 examinations. A further study36 found that only one participant from the simulator-trained group reached the caecum, whereas none from the control group without simulation training completed the task during colonoscopy. Two studies41,43 compared simulation-trained with patient-trained participants, and no difference in success rate was found. Haycock et al.43 reported that six participants in the simulator-trained group and four in the patient-trained group reached the caecum, which was not significantly different.
Ende et al.41 found no difference in the estimated portion of mucosa visualized between participants trained using the simulator and those who received only patient-based training or those who received both interventions. Two studies34,38 reported no overall difference between intervention and control groups (with no simulation training) regarding independence in performing oesophagogastroduodenoscopy38, and progress rate, successful resection, successful haemostasis and successful orientation during a TURP procedure34 (Table 2).




Other procedures (1 study)

One study47 reported that, as part of the global score, participants who received simulation training using the VR vascular simulator had higher scores in the items 'ability to complete the case' and 'attending takeover' during superficial femoral artery angioplasty (Table 3).

Studies reporting performance errors

Performance errors, described as movements or events outside the normal procedure, were reported by six RCTs17,20,26,31–33 and one non-randomized study48. Three studies27,45,50 described perioperative or postoperative complications and are also discussed below.

Laparoscopic procedures (4 studies)

Three RCTs17,20,26 reported that simulator-trained participants made significantly fewer intraoperative errors than those not trained on the simulator. One study27 found that simulator training was associated with lower intraoperative and postoperative complication rates for the first total extraperitoneal hernia repair performed after training (Table 1).

Endoscopic procedures (3 studies)

One RCT33 reported that simulator-trained participants made fewer mucosal injection errors during endoscopic sinus surgery compared with control participants. Conversely, one RCT31 reported no difference between these groups in the number of collisions with mucosa during transnasal flexible laryngoscopy, and another32 found no difference in the number of missed findings during oesophagogastroduodenoscopy between simulation-trained participants and controls (Table 2).

Other procedures (3 studies)

Three comparative studies45,48,50 examined the performance of ophthalmology residents in cataract surgery before and after simulation-based training. One study48 found a significant reduction in the number of errant capsulorrhexes after the introduction of simulation-based training, whereas the other two found no difference in intraoperative complications45, posterior capsule rupture45 or anterior vitrectomy50 between these groups (Table 3).

Studies reporting patient discomfort

Patient discomfort was described as the pain felt by the patient undergoing the assessment procedure. Four studies31,32,35,40 investigating endoscopic procedures reported patient pain or discomfort.

Two32,40 reported patient discomfort scores, for oesophagogastroduodenoscopy and colonoscopy respectively. In one study32, patient pain and discomfort did not differ significantly between procedures carried out by those who received simulation training and those who did not. In the other40, the simulator-trained group recorded a non-significantly lower patient discomfort score for abdominal pain, with a significant reduction in anal discomfort. Two RCTs31,35 investigated patient discomfort during transnasal laryngoscopy, and both reported no difference between the simulator-trained participants and those without training (Table 2). One study38 did not directly assess patient discomfort but evaluated participants' ability to recognize and respond adequately to patient discomfort; participants who did not have simulation training scored higher on this parameter than those with simulation training.

Discussion

A number of benefits might be derived from the successful acquisition of skills in a surgical simulation setting before intraoperative patient contact1. With high-level skills transfer there may be improved patient safety and procedure efficiency, associated with cost savings51. Since the original systematic review6, the evidence base related to simulation-based skills training has expanded greatly, with 27 new RCTs.

This review supports the hypothesis that simulation-based training has advantages over no training. Of the 28 studies17–27,31–40,44–50 that made this comparison, only five22,31,34,35,38 found that simulator-trained participants performed no better than their peers without simulation training in any of the parameters assessed. The evidence for the value of simulation training compared with patient-based training is weaker. Only four studies28,41–43, involving 92 participants, reported this comparison. No study found the simulator-trained group to have significantly poorer performance than a conventional patient-trained group in any of the parameters.

Two studies29,30 compared participants who received simulation training as part of a comprehensive curriculum with those undergoing a standard training programme. In both, participants who completed the comprehensive curriculum performed better, although it is not clear whether, and to what extent, the control group had simulation training during their conventional training programme. It is also not known what degree of improvement in the intervention groups might be attributed specifically to simulation, as the comprehensive curriculum also




contained a cognitive component29,30, a cadaver laboratory setting29 and OR participation30.

The studies reviewed encompass a wide range of training procedures, types of simulation, study designs and methods of objective skills assessment. This heterogeneity might be seen as a strength of this review, providing the opportunity to evaluate a generic concept ('skills transfer') across a number of settings using a range of complementary and validated tools. As the majority of studies were randomized, the influence of confounders is largely mitigated, and the range of procedures increases generalizability. On the other hand, inconsistencies between studies made most comparisons scientifically invalid. Many factors determine whether skills can be transferred successfully, including those that relate to simulator design and functionality, the way that simulators are used as a training tool, the extent of prelearning, inherent learning style, the nature and type of feedback, and opportunities for reinforcement of learning. Consequently, the evidence for skills transfer cannot be attributed to the use of simulators alone.

The present review did not focus on specific skills learnt via simulation-based training. Studies predominantly assessed technical skills, with less focus on non-technical skills, such as communication and decision-making, that are also required for the performance of a procedure in a patient-based setting5,52–54. None of the studies assessed participants' non-technical skills in the OR using a specific validated tool, such as Non-Technical Skills for Surgeons (NOTSS)54 or the revised NOTECHS scale55.

It is also important to note that no single measured parameter can by itself demonstrate that a trainee has acquired an expert level of proficiency or competence56. A good example is performance time, which was measured in 18 of the included studies17,20,23,24,26–28,31–35,39–41,43,45,50.
Although more rapid task completion is a recognized feature of expert performance, measurement of this variable alone gives no indication of the quality of the task performed, and caution should be taken when interpreting it without additional objective quality data.

It was evident that the reporting of methodological detail was often incomplete; this applied to the method and implementation of randomization (described in full in only three studies21,23,27) and power calculations (described in full in only 15 studies21,25,27–32,34,36,37,41,43,44,49). Sample sizes were small: only four RCTs21,27,31,37 had samples of more than 25 participants per group. Overall, the studies had the potential for increased type I error, as simple statistical analyses were generally used and there were many tests

of significance examined within each study, often over a number of procedures or assessments.

Another variation between studies was in the duration and intensity of simulation-based training. The endpoints of training were inconsistent, making it difficult to compare skill levels at the end of training. As evidence of an improvement in the quality of study design since the original systematic review of skills transfer6, 19 studies17,20–27,29,30,33,34,40,42,44,47–49 used a predefined measure of proficiency on the simulator to determine the skill level required/desired at the end of training. The use of predefined measures of proficiency is important because simulation, by itself, is no guarantee that adequate learning will occur57. Any successful use of training technology (including simulation) must begin with clearly defined educational objectives58.

Although there has been an increase in the evidence base in support of skills transfer from simulation-based training to the patient-based setting, a number of issues remain to be addressed. Methodological completeness and reporting should follow the Consolidated Standards of Reporting Trials (CONSORT) guidelines59. Consistency in training and assessment methods across studies would help provide further insight into the benefits and relative cost of surgical simulation-based training. Exactly how simulation-based training should be integrated into training curricula in a cost-effective manner for different specialties, and other important dimensions of skills transfer, including the nature, intensity, stage and duration of training and the type of simulator device required to deliver the greatest transfer effect, still need to be clarified.
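The two statistical concerns raised in this discussion (many uncorrected significance tests, and small per-group samples) can be quantified with standard formulas. The sketch below is purely illustrative: the significance level, number of tests, effect size and standard deviation are hypothetical values chosen for the example, not figures drawn from the reviewed trials, and the sample-size calculation uses the simple normal-approximation formula for a two-sided two-sample comparison of means.

```python
import math
from statistics import NormalDist


def familywise_error(alpha: float, m: int) -> float:
    """Probability of at least one false-positive result across m
    independent significance tests, each run at level alpha with no
    correction for multiple comparisons."""
    return 1 - (1 - alpha) ** m


def per_group_sample_size(delta: float, sd: float,
                          alpha: float = 0.05, power: float = 0.80) -> int:
    """Per-group n for a two-sided two-sample z-test (normal
    approximation) to detect a difference in means of `delta`
    given a common standard deviation `sd`."""
    z = NormalDist().inv_cdf
    n = 2 * ((z(1 - alpha / 2) + z(power)) * sd / delta) ** 2
    return math.ceil(n)


# Ten uncorrected tests at alpha = 0.05 carry roughly a 40 per cent
# chance of at least one spurious 'significant' finding.
print(round(familywise_error(0.05, 10), 2))  # 0.4

# Detecting a hypothetical moderate effect (difference 10, s.d. 15) at
# 80 per cent power needs about 36 participants per group, more than
# most of the included trials enrolled.
print(per_group_sample_size(10, 15))  # 36
```

This illustrates why studies running many unadjusted comparisons on groups of five to ten participants are prone both to false positives and to missing true effects.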
The effect of different levels of mentorship and the types of briefing/debriefing during the training period on skill transfer efficacy should also be evaluated, along with measurements of staff productivity and systems improvement resulting from skills transfer. Future studies should explore how training and assessment of both technical and non-technical dimensions of skills can best be performed concurrently. Despite these limitations, simulation-based training provides a safe, effective and ethical way to acquire skills.



Acknowledgements

This Systematic Review Update is based on a protocol established in the Australian Safety and Efficacy Register of New Interventional Procedures – Surgical (ASERNIP-S) Systematic Review Report number 61, which can be found on the Royal Australasian College of Surgeons website (http://www.surgeons.org/asernip-s). The ASERNIP-S project is funded by the Australian Commonwealth



Department of Health and Ageing and the South Australian Department of Health. The ASERNIP-S project receives no commercial sponsorship. J.A.W. is a Director of Simtic Ltd.

Disclosure: The authors declare no other conflict of interest.

References

1 Gaba DM. The future vision of simulation in health care. Qual Saf Health Care 2004; 13(Suppl 1): i2–i10.
2 Sturm LP, Windsor JA, Cosman PH, Cregan P, Hewett PJ, Maddern GJ. A systematic review of skills transfer after surgical simulation training. Ann Surg 2008; 248: 166–179.
3 Satava RM. Emerging trends that herald the future of surgical simulation. Surg Clin North Am 2010; 90: 623–633.
4 Gallagher AG, Ritter EM, Champion H, Higgins G, Fried MP, Moses G et al. Virtual reality simulation for the operating room: proficiency-based training as a paradigm shift in surgical skills training. Ann Surg 2005; 241: 364–372.
5 Pugh C, Plachta S, Auyang E, Pryor A, Hungness E. Outcome measures for surgical simulators: is the focus on technical skills the best approach? Surgery 2010; 147: 646–654.
6 Sturm L, Cosman P, Hewett P, Cregan P, Windsor J, Maddern G. Surgical Simulation for Training: Skills Transfer to the Operating Room. ASERNIP-S Report No. 61. ASERNIP-S, Adelaide, 2007.
7 Moher D, Tsertsvadze A. Systematic reviews: when is an update an update? Lancet 2006; 367: 881–883.
8 Aggarwal R, Darzi A. Technical-skills training in the 21st century. N Engl J Med 2006; 355: 2695–2696.
9 Reznick R, MacRae H. Teaching surgical skills – changes in the wind. N Engl J Med 2006; 355: 2664–2669.
10 Debas H, Bass B, Brennan M, Flynn T, Folse J, Freischlag J et al.; American Surgical Association Blue Ribbon Committee. American Surgical Association Blue Ribbon Committee Report on Surgical Education: 2004. Ann Surg 2005; 241: 1–8.
11 Scott DJ, Dunnington GL. The new ACS/APDS Skills Curriculum: moving the learning curve out of the operating room. J Gastrointest Surg 2008; 12: 213–221.
12 Vassiliou MC, Feldman LS, Andrew CG, Bergman S, Leffondré K, Stanbridge D et al. A global assessment tool for evaluation of intraoperative laparoscopic skills. Am J Surg 2005; 190: 107–113.
13 Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill via an innovative 'bench station' examination. Am J Surg 1997; 173: 226–230.
14 Cook DJ, Mulrow CD, Haynes RB. Systematic reviews: synthesis of best evidence for clinical decisions. Ann Intern Med 1997; 126: 376–380.
15 National Health and Medical Research Council (NHMRC). How to Use the Evidence: Assessment and Application of Scientific Evidence; 2000. http://www.nhmrc.gov.au [accessed 1 September 2012].

16 The Cochrane Collaboration. Cochrane Handbook for Systematic Reviews of Interventions Version 5.1.0 [updated March 2011]. http://www.cochrane.org/resources/handbook [accessed 1 September 2012].
17 Ahlberg G, Enochsson L, Gallagher AG, Hedman L, Hogman C, McClusky DA III et al. Proficiency-based virtual reality training significantly reduces the error rate for residents during their first 10 laparoscopic cholecystectomies. Am J Surg 2007; 193: 797–804.
18 Banks EH, Chudnoff S, Karmin I, Wang C, Pardanani S. Does a surgical simulator improve resident operative performance of laparoscopic tubal ligation? Am J Obstet Gynecol 2007; 197: 541.e1–5.
19 Beyer L, Troyer JD, Mancini J, Bladou F, Berdah SV, Karsenty G. Impact of laparoscopy simulator training on the technical skills of future surgeons in the operating room: a prospective study. Am J Surg 2011; 202: 265–272.
20 Cosman PH, Hugh TJ, Shearer CJ, Merrett ND, Biankin AV, Cartmill JA. Skills acquired on virtual reality laparoscopic simulators transfer into the operating room in a blinded, randomised, controlled trial. Stud Health Technol Inform 2007; 125: 76–81.
21 Gala R, Orejuela F, Gerten K, Lockrow E, Kilpatrick C, Chohan L et al. Effect of validated skills simulation on operating room performance in obstetrics and gynecology residents: a randomized controlled trial. Obstet Gynecol 2013; 121: 578–584.
22 Hogle NJ, Chang L, Strong VE, Welcome AO, Sinaan M, Bailey R et al. Validation of laparoscopic surgical skills training outside the operating room: a long road. Surg Endosc 2009; 23: 1476–1482.
23 Larsen CR, Soerensen JL, Grantcharov T, Dalsgaard T, Schouenborg L, Ottosen C et al. Effect of virtual reality training on laparoscopic surgery: randomised controlled trial. BMJ 2009; 338: b1802.
24 Orzech N, Palter VN, Reznick RK, Aggarwal R, Grantcharov TP. A comparison of 2 ex vivo training curricula for advanced laparoscopic skills: a randomized controlled trial. Ann Surg 2012; 255: 833–839.
25 Sroka G, Feldman LS, Vassiliou MC, Kaneva PA, Fayez R, Fried GM. Fundamentals of laparoscopic surgery simulator training to proficiency improves laparoscopic performance in the operating room – a randomized controlled trial. Am J Surg 2010; 199: 115–120.
26 Van Sickle KR, Ritter EM, Baghai M, Goldenberg AE, Huang IP, Gallagher AG et al. Prospective, randomized, double-blind trial of curriculum-based training for intracorporeal suturing and knot tying. J Am Coll Surg 2008; 207: 560–568.
27 Zendejas B, Cook DA, Bingener J, Huebner M, Dunn WF, Sarr MG et al. Simulation-based mastery learning improves patient outcomes in laparoscopic inguinal hernia repair: a randomized controlled trial. Ann Surg 2011; 254: 502–509.
28 Franzeck FM, Rosenthal R, Muller MK, Nocito A, Wittich F, Maurus C et al. Prospective randomized controlled trial of simulator-based versus traditional in-surgery laparoscopic camera navigation training. Surg Endosc 2012; 26: 235–241.
29 Palter VN, Grantcharov TP. Development and validation of a comprehensive curriculum to teach an advanced minimally invasive procedure: a randomized controlled trial. Ann Surg 2012; 256: 25–32.
30 Palter VN, Orzech N, Reznick RK, Grantcharov TP. Validation of a structured training and assessment curriculum for technical skill acquisition in minimally invasive surgery: a randomized controlled trial. Ann Surg 2013; 257: 224–230.
31 Deutschmann MW, Yunker WK, Cho JJ, Andreassen M, Beveridge S, Bosch JD. Use of a low-fidelity simulator to improve trans-nasal fibre-optic flexible laryngoscopy in the clinical setting: a randomized, single-blinded, prospective study. J Otolaryngol Head Neck Surg 2013; 42: 35.
32 Ferlitsch A, Schoefl R, Puespoek A, Miehsler W, Schoeniger-Hekele M, Hofer H et al. Effect of virtual endoscopy simulator training on performance of upper gastrointestinal endoscopy in patients: a randomized controlled trial. Endoscopy 2010; 42: 1049–1056.
33 Fried MP, Sadoughi B, Gibber MJ, Jacobs JB, Lebowitz RA, Ross DA et al. From virtual reality to the operating room: the endoscopic sinus surgery simulator experiment. Otolaryngol Head Neck Surg 2010; 142: 202–207.
34 Källström R, Hjertberg H, Svanvik J. Impact of virtual reality-simulated training on urology residents' performance of transurethral resection of the prostate. J Endourol 2010; 24: 1521–1528.
35 Ossowski KL, Rhee DC, Rubinstein EN, Ferguson BJ. Efficacy of sinonasal simulator in teaching endoscopic nasal skills. Laryngoscope 2008; 118: 1482–1485.
36 Park J, MacRae H, Musselman LJ, Rossos P, Hamstra SJ, Wolman S et al. Randomized controlled trial of virtual reality simulator training: transfer to live patients. Am J Surg 2007; 194: 205–211.
37 Schout BM, Ananias HJ, Bemelmans BL, d'Ancona FC, Muijtjens AM, Dolmans VE et al. Transfer of cysto-urethroscopy skills from a virtual-reality simulator to the operating room: a randomized controlled trial. BJU Int 2010; 106: 226–231.
38 Sedlack R. Validation of computer simulation training for esophagogastroduodenoscopy: pilot study. J Gastroenterol Hepatol 2007; 22: 1214–1219.
39 Shirai Y, Yoshida T, Shiraishi R, Okamoto T, Nakamura H, Harada T et al. Prospective randomized study on the use of a computer-based endoscopic simulator for training in esophagogastroduodenoscopy. J Gastroenterol Hepatol 2008; 23: 1046–1050.
40 Yi SY, Ryu KH, Na YJ, Woo HS, Ahn W, Kim WS et al. Improvement of colonoscopy skills through simulation-based training. Stud Health Technol Inform 2008; 132: 565–567.
41 Ende A, Zopf Y, Konturek P, Naegel A, Hahn EG, Matthes K et al. Strategies for training in diagnostic upper endoscopy: a prospective, randomized trial. Gastrointest Endosc 2012; 75: 254–260.



42 Fried MP, Kaye RJ, Gibber MJ, Jackman AH, Paskhover BP, Sadoughi B et al. Criterion-based (proficiency) training to improve surgical performance. Arch Otolaryngol Head Neck Surg 2012; 138: 1024–1029.
43 Haycock A, Koch AD, Familiari P, van Delft F, Dekker E, Petruzziello L et al. Training and transfer of colonoscopy skills: a multinational, randomized, blinded, controlled trial of simulator versus bedside training. Gastrointest Endosc 2010; 71: 298–307.
44 Bagai A, O'Brien S, Al Lawati H, Goyal P, Ball W, Grantcharov T et al. Mentored simulation training improves procedural skills in cardiac catheterization: a randomized, controlled pilot study. Circ Cardiovasc Interv 2012; 5: 672–679.
45 Belyea DA, Brown SE, Rajjoub LZ. Influence of surgery simulator training on ophthalmology resident phacoemulsification performance. J Cataract Refract Surg 2011; 37: 1756–1761.
46 Howells NR, Gill HS, Carr AJ, Price AJ, Rees JL. Transferring simulated arthroscopic skills to the operating theatre: a randomised blinded study. J Bone Joint Surg Br 2008; 90: 494–499.
47 Hseino H, Nugent E, Lee MJ, Hill AD, Neary P, Tierney S et al. Skills transfer after proficiency-based simulation training in superficial femoral artery angioplasty. Simul Healthc 2012; 7: 274–281.
48 McCannel CA, Reed DC, Goldman DR. Ophthalmic surgery simulator training improves resident performance of capsulorhexis in the operating room. Ophthalmology 2013; 120: 2456–2461.
49 Palter VN, Grantcharov T, Harvey A, Macrae HM. Ex vivo technical skills training transfers to the operating room and enhances cognitive learning: a randomized controlled trial. Ann Surg 2011; 253: 886–889.
50 Pokroy R, Du E, Alzaga A, Khodadadeh S, Steen D, Bachynski B et al. Impact of simulator training on resident cataract surgery. Graefes Arch Clin Exp Ophthalmol 2013; 251: 777–781.
51 Bridges M, Diamond DL. The financial impact of teaching surgical residents in the operating room. Am J Surg 1999; 177: 28–32.
52 Weldon S, Korkiakangas T, Bezemer J, Kneebone R. Communication in the operating theatre. Br J Surg 2013; 100: 1677–1688.
53 Hull L, Arora S, Aggarwal R, Darzi A, Vincent C, Sevdalis N. The impact of nontechnical skills on technical performance in surgery: a systematic review. J Am Coll Surg 2012; 214: 214–230.
54 Flin R, Yule S, Paterson-Brown S, Maran N, Rowley D, Youngson G. Teaching surgeons about non-technical skills. Surgeon 2007; 5: 1098–1104.
55 Sevdalis N, Davis R, Koutantji M, Undre S, Darzi A, Vincent CA. Reliability of a revised NOTECHS scale for use in surgical teams. Am J Surg 2008; 196: 184–190.
56 Ahlberg G, Hultcrantz R, Jaramillo E, Lindblom A, Arvidsson D. Virtual reality colonoscopy simulation: a compulsory practice for the future colonoscopist? Endoscopy 2005; 37: 1198–1204.
57 Chiniara G, Cole G, Brisbin K, Huffman D, Cragg B, Lamacchia M et al.; Canadian Network For Simulation In Healthcare, Guidelines Working Group. Simulation in healthcare: a taxonomy and a conceptual framework for instructional design and media selection. Med Teach 2013; 35: e1380–e1395.
58 Gagné R, Wagner W, Golas K, Keller J. Principles of Instructional Design (5th edn). Wadsworth: Belmont, 2004.
59 Schulz KF, Altman DG, Moher D; CONSORT Group. CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. BMJ 2010; 340: c332.

Supporting information

Additional supporting information may be found in the online version of this article: Table S1 Laparoscopic procedures (Word document) Table S2 Endoscopic procedures (Word document) Table S3 Other procedures (Word document)

If you wish to comment on this, or any other article published in the BJS, please visit the on-line correspondence section of the website (www.bjs.co.uk). Electronic communications will be reviewed by the Correspondence Editor and a selection will appear in the correspondence section of the Journal.



BJS 2014; 101: 1063–1076
