ORIGINAL REPORTS

Evaluation of Two Flexible Colonoscopy Simulators and Transfer of Skills into Clinical Practice

Pedro Pablo Gomez, MD, Ross E. Willis, PhD, and Kent Van Sickle, MD

Department of Surgery, University of Texas Health Science Center at San Antonio, San Antonio, Texas

INTRODUCTION: Surgical residents have learned flexible endoscopy by practicing on patients in hospital settings under the strict guidance of experienced surgeons. Simulation is often used to “pretrain” novices on endoscopic skills before real clinical practice; nonetheless, the optimal method of training remains unknown. The purpose of this study was to compare endoscopic virtual reality and physical model simulators and their respective roles in transferring skills to the clinical environment.

METHODS: At the beginning of a skills development rotation, 27 surgical postgraduate year 1 residents performed a baseline screening colonoscopy on a real patient under faculty supervision. Their performances were scored using the Global Assessment of Gastrointestinal Endoscopic Skills (GAGES). Subsequently, interns completed a 3-week flexible endoscopy curriculum developed at our institution. One-third of the residents were assigned to train with the GI Mentor simulator, one-third trained with the Kyoto simulator, and one-third trained using both simulators. At the end of their rotations, each postgraduate year 1 resident performed one posttest colonoscopy on a different patient and was again scored using GAGES by an experienced faculty member.

RESULTS: A statistically significant improvement in the GAGES total score (p < 0.001) and on each of its subcomponents (p = 0.001) was observed from pretest to posttest for all groups combined. Subgroup analysis indicated that trainees in the GI Mentor or both-simulators conditions showed significant improvement from pretest to posttest in GAGES total score (p = 0.017 and 0.024, respectively). This was not observed for those exclusively using the Kyoto platform (p = 0.072). Nonetheless, no single training condition was shown to be a better training modality than the others in terms of total GAGES score or in any of its subcomponents.

CONCLUSION: Colonoscopy simulator training with the GI Mentor platform, exclusively or in combination with a physical model simulator, improves skill performance in real colonoscopy cases when measured with the GAGES tool. (J Surg Educ 72:220-227. © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.)

KEY WORDS: flexible endoscopy, simulation, skills assessment, GAGES

COMPETENCIES: Practice-Based Learning and Improvement, Systems-Based Practice, Patient Care

Correspondence: Inquiries to Pedro Pablo Gomez, MD, University of Texas Health Science Center at San Antonio, Mail Code 7737, 7703 Floyd Curl Dr., San Antonio, TX 78229-3900; fax: (210) 567-2347; E-mail: [email protected], [email protected]

INTRODUCTION

Approximately 2.8 million flexible sigmoidoscopies and 14.2 million colonoscopies were estimated to have been performed in 2002 for colorectal cancer screening.1 Traditionally, general surgery residents have learned flexible endoscopy techniques by practicing on actual patients under the strict guidance of experienced surgeons. Recently, the required number of endoscopy cases established by the Accreditation Council for Graduate Medical Education Residency Review Committee for Surgery (RRC-S) and endorsed by the American Board of Surgery (ABS) was called into question by a position paper issued jointly by the American Society for Gastrointestinal Endoscopy and 3 other gastrointestinal (GI) societies.2 In response to their statement, the ABS expressed that hospital privileging for practicing endoscopists must go beyond an arbitrary number of procedures performed and should use objective criteria to evaluate technical and cognitive skills.3 New approaches to determining competency have been postulated, given the dissimilarity in the minimum number of cases suggested by different medical societies4 and the inadequacy of case numbers as a surrogate for measuring safety and proficiency during training or afterward.

Journal of Surgical Education © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved. 1931-7204/$30.00 http://dx.doi.org/10.1016/j.jsurg.2014.08.010

Wexner et al.,4 in the largest prospective study to investigate the effect of case numbers on competence in GI endoscopy, reported that general surgeons with experience of at least 50 lower GI endoscopy cases were able to reach the cecum more than 90% of the time, with procedure lengths of 30 minutes or less and with minimal morbidity and mortality. Currently, general surgery residents require a minimum of 35 upper GI endoscopies and 50 colonoscopies before graduation as their standard of training.5 However, previously published numbers for GI fellows were set at 130 and 140 upper and lower endoscopies, respectively, to assess adequacy of skills.6 Recently, the crucial question in defining training standards has focused on the level at which safety and basic competence are achieved. The Society of American Gastrointestinal and Endoscopic Surgeons has validated a global rating scale called the Global Assessment of Gastrointestinal Endoscopic Skills (GAGES), which has been shown to be a valid instrument for assessing endoscopic technical skills in both gastroenterologists and surgeons.7 The GAGES-colonoscopy (GAGES-C) tool is based on a 5-point Likert rating scale with 5 domains: scope navigation, strategies for scope advancement, clear field, instrumentation (when performed), and overall quality. Each of these domains is scored from 1 to 5, where 1 indicates inability to complete the task, 3 indicates that some assistance was required, and 5 indicates proficiency. A maximum score of 25 is possible when all 5 domains are included; if instrumentation is not performed, the maximum attainable score is 20. Vassiliou et al.8 showed the use of this proficiency-based tool to be superior to case numbers by comparing GAGES scores among novices who performed less than 50 and 140 lower endoscopy cases based on the RRC-S and American Society for Gastrointestinal Endoscopy criteria, respectively. No statistically significant differences were found in GAGES scores among novices (p < 0.001). The same result was found when comparing experienced endoscopists in both groups (p < 0.001) under the case-number criteria mentioned earlier. These findings suggest that a proficiency measure such as GAGES may prove superior to case numbers as currently established.

Exposure to flexible endoscopy among surgical residents remains inconsistent and uneven nationwide. Duty-hour limitations, lack of a formal endoscopy rotation, and the recent increase in case requirements by the RRC-S are some of the challenges program directors and educators face in this new era of surgical training. Strategies such as the development of a national cognitive and technical skills curriculum based on adult learning theory9 and simulator-based training protocols have been proposed by educators to fill this training gap.10-12 Simulation has become ubiquitous in surgical residency training programs and is often used as a method of “pretraining” residents on various surgical skills

before practicing on actual patients.13 Since the development of the first endoscopic simulator,14 significant technological advancements have allowed a transition from physical model simulators to more complex and advanced computer-based platforms.15-18 Despite these continuous developments, the effect of simulation training relies completely on its ability to transfer the learned skills into clinical practice. The purpose of this study was to compare 2 types of endoscopic simulators and their respective roles in transferring skills to the clinical environment.
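As a concrete illustration of the scoring arithmetic, the GAGES-C scheme described earlier (5 domains rated 1-5, with instrumentation scored only when performed) can be sketched as follows. This is a hedged sketch, not part of the study's methods; the domain ratings in the example are invented for illustration and are not study data.

```python
# Illustrative sketch of GAGES-C scoring as described in the text.
# The example ratings below are invented, not study data.

GAGES_C_DOMAINS = [
    "scope navigation",
    "strategies for scope advancement",
    "clear field",
    "instrumentation",   # optional: scored only when performed
    "overall quality",
]

def gages_c_total(ratings: dict) -> tuple:
    """Sum the 1-5 Likert ratings; return (total, maximum attainable).

    If "instrumentation" was not performed (absent from ratings),
    the maximum attainable score drops from 25 to 20.
    """
    for domain, score in ratings.items():
        if domain not in GAGES_C_DOMAINS:
            raise ValueError(f"unknown domain: {domain}")
        if not 1 <= score <= 5:
            raise ValueError("each domain is rated 1-5")
    return sum(ratings.values()), 5 * len(ratings)

# Example: instrumentation omitted, as in this study's assessments
total, best = gages_c_total({
    "scope navigation": 3,
    "strategies for scope advancement": 2,
    "clear field": 4,
    "overall quality": 3,
})
print(total, best)  # 12 out of a maximum of 20
```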

MATERIALS AND METHODS

In our general surgery program, postgraduate year (PGY)-1 residents complete an exclusive 1-month skills development rotation in our simulation laboratory. The participants in this institutional review board–approved study were 27 endoscopic novices (i.e., PGY-1 categorical and preliminary general surgery residents). At the beginning of their skills development rotations, each resident performed a baseline colonoscopy on a patient under the guidance of experienced faculty, as is the current standard learning practice. Their performance was scored using the GAGES-C7 tool. Given that this was their first exposure to colonoscopy, we did not allow them to perform any therapeutic techniques that required instrumentation (e.g., snaring a polyp). Thus, we eliminated the instrumentation category from their scoring, which allowed a maximum score of 20 rather than 25. When instrumentation needed to be performed, faculty took control of the case and returned the colonoscope to the PGY-1 resident once this portion of the procedure was completed. In the event that a PGY-1 resident was unable to make progress, the faculty took control of the case, advanced to a portion within the resident's ability, and returned the colonoscope to the resident. Trainees performed colonoscopies only on patients older than 18 years who were scheduled for an elective screening procedure and had no prior history of any major intestinal or abdominal operations (e.g., colectomy, colostomy, or hysterectomy).

After the baseline assessment, each trainee completed a 3-week flexible endoscopy curriculum developed at our institution. All trainees completed 3 online modules. Module 1 described the main characteristics of the flexible endoscopic equipment, familiarizing residents with the device. Module 2 described fundamental concepts of anatomy, pathology, and procedural techniques necessary to perform endoscopy safely. Module 3 described proper utilization of the 2 training platforms available in our simulation center. To assess understanding of these concepts, residents had to pass a brief quiz at the end of each module to advance to the following level. On completion of the online modules, each resident was randomly assigned to 1 of 3 training conditions based on equipment availability at our simulation center. One-third


of the residents were assigned to train exclusively using the GI Mentor II (virtual reality) platform (Simbionix Corp., Cleveland, OH), one-third used the Kyoto Kagaku colonoscopy (physical model) simulator (Kyoto Kagaku Co. Ltd., Kyoto, Japan), and the remaining third trained using both simulators. The GI Mentor II simulator employs a modified flexible endoscope, which is introduced into a simulated patientʼs “body” that contains sensors and mechanical devices designed to manipulate the endoscopeʼs progress through the colon, which provides haptic feedback such as the presence of a loop in the colon. Trainees view a computer-generated 3-dimensional representation of the colon on a computer monitor. The GI Mentor II includes 2 exercises designed to provide practice with controlling the endoscope (Endo-Bubble I and Endo-Bubble II) and 10 colonoscopy exercises that range in difficulty from simple to difficult, and several of the exercises include loops, polyps, and tumors. Residents using the GI Mentor II were asked to complete the Endo-Bubble I and Endo-Bubble II exercises until reaching proficiency based on prior published criteria.12 Additionally, residents were asked to complete at least 1 trial of the 10 virtual colonoscopy cases available in the simulator. Each intern was asked to record the total number of practice trials, time to complete the exercise, time to reach the cecum, and percentage of time with a clear view of the lumen displayed by the software after completion of each case. The Kyoto colonoscopy simulator is a physical model simulator that employs a flexible rubber colon situated in a plastic abdomen. The Kyoto simulator includes 6 cases of variable difficulty depending on the presence of loops. 
Usage of the Kyoto simulator involves removing the colon from the abdomen, affixing a case template with hook-and-loop fasteners, routing the colon along the template using rubber bands affixed to the base of the trainer with hook-and-loop fasteners, filling the colon with a mixture of lubricant and water, and applying lubricating gel to the endoscope. Trainees using the Kyoto simulator completed at least 1 trial of the 6 different modules available in this system and were asked to record the same metrics described previously, with the exception of percentage of time with a clear view of the lumen, given the difficulty in doing so without the aid of a computer-based program. At the end of the skills development rotation, each intern completed a posttest colonoscopy on an actual patient under the guidance of an experienced faculty member, based on the same inclusion criteria used during the pretest. Performance was scored using the GAGES-C tool described earlier. Additionally, during the pretest and posttest colonoscopies, an external observer quantified other metrics including total time to complete colonoscopy, time to reach the cecum, time with a clear view of the lumen, number of times faculty took full control of the colonoscope, and need for endoscopic instrumentation. Despite our attempts to select patients with similar characteristics from pretest to posttest, other unique and intrinsic factors such as the quality of

bowel preparation and case difficulty could not be determined until the procedure was completed. The experienced faculty member was asked to score these 2 features on 5-point Likert scales: for bowel preparation, 1 indicated a poor or inadequate preparation and 5 a clean colon; for case difficulty, 1 indicated a straightforward procedure and 5 a complex, difficult colonoscopy. It should be noted that the expert faculty member scoring both the GAGES-C performance and the colonoscopy conditions was blinded to the training condition but not to whether the case was a pretest or a posttest. On completion of the study, a survey was administered to gather demographic information and to evaluate the utility of the educational online modules, simulator preference, level of realism and difficulty, self-reported levels of anxiety, and self-reflected colonoscopy performance. SPSS version 19 (IBM Corp., Armonk, NY) was used for statistical analysis.
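The pretest-versus-posttest comparisons reported in this study rely on the Wilcoxon signed-rank test. As a hedged illustration of that procedure (the study itself ran it in SPSS, and the scores below are invented placeholders, not study data), a minimal pure-Python version using the normal approximation might look like:

```python
# Pure-Python sketch of the Wilcoxon signed-rank test used for the
# paired pretest vs posttest comparisons. Scores are invented examples.
import math

def wilcoxon_signed_rank(pre, post):
    """Return (W+, two-sided p) via the normal approximation."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]  # drop zeros
    n = len(diffs)
    # Rank |differences|, averaging ranks across tie groups
    ordered = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[ordered[j + 1]]) == abs(diffs[ordered[i]]):
            j += 1
        avg = (i + j) / 2 + 1          # average rank for the tie group
        for k in range(i, j + 1):
            ranks[ordered[k]] = avg
        i = j + 1
    w_plus = sum(r for r, d in zip(ranks, diffs) if d > 0)
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mean) / sd
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return w_plus, p

# Hypothetical paired GAGES-C totals for a group of 9 residents
pre  = [11, 9, 13, 10, 12, 8, 14, 10, 11]
post = [15, 13, 16, 12, 17, 11, 18, 13, 15]
w, p = wilcoxon_signed_rank(pre, post)
print(p < 0.05)  # True: every resident improved, so p is well below 0.05
```

Note that this sketch keeps the tie-averaged ranks but omits the exact-distribution and tie-variance refinements a statistics package would apply; it is meant only to make the reported test concrete.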

RESULTS

Demographics

PGY-1 residents were randomized equally into 3 groups. Median age, sex distribution, and hand dexterity were similar among training conditions. Additionally, interns trained for a similar number of days, irrespective of which simulator training condition they were assigned to (Table 1). Residents practiced each of the available exercises on the GI Mentor II, Kyoto, and both simulators an average of 3, 2.5, and 2.2 times, respectively. Nonetheless, trainees using the GI Mentor II either by itself or in combination with the Kyoto platform practiced at least twice as much as residents assigned exclusively to the Kyoto simulator in terms of the total number of attempted trials. Furthermore, interns practiced the Endo-Bubble I, Endo-Bubble II, and the 10 case exercises a similar number of times when they were assigned to the GI Mentor or the GI Mentor plus Kyoto training conditions (Table 2). A similar training ratio was obtained among conditions when total practice trials were divided by the total number of available exercises.

TABLE 1. Trainee Demographics

Training Condition        GI Mentor II   Kyoto   Both Simulators
Number of residents       9              9       9
Age (median), y           28             29      29
Male:female ratio         5:4            6:3     6:3
Right handed              9              9       9
Training days (median)    22             22      23

TABLE 2. Number of Cases Performed per Training Condition

Condition                                        GI Mentor II   Kyoto   Both Simulators
Number of trials EB-I (median)                   12             0       14
Number of trials EB-II (median)                  13             0       9
Number of trials GI Mentor cases 1-10 (median)   11             0       10
Number of trials with Kyoto (median)             0              15      7
Total practice trials (median)                   36             15      40
Total number of available exercises              12             6       18
Total practice trials/number of exercises        3              2.5     2.2

EB-I, Endo-Bubble I; EB-II, Endo-Bubble II.

Bowel Preparation and Case Difficulty

No statistically significant differences were seen from pretest to posttest in quality of bowel preparation or case difficulty (Kruskal-Wallis test, p = 0.608 and 0.791, respectively), showing similar characteristics and homogeneity of patient selection among assessment conditions.
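As an illustration of the Kruskal-Wallis comparison used for these group-level ratings (the study ran it in SPSS; the Likert ratings below are invented placeholders, not study data), a minimal pure-Python sketch of the tie-corrected H statistic might look like:

```python
# Pure-Python sketch of the Kruskal-Wallis H test used to compare
# Likert ratings across the 3 groups. Ratings are invented examples.
import math

def kruskal_wallis_h(*groups):
    """Return the tie-corrected H statistic for k independent samples."""
    pooled = sorted(v for g in groups for v in g)
    n_total = len(pooled)
    # Average rank for each distinct value (handles ties)
    rank_of = {}
    i = 0
    while i < n_total:
        j = i
        while j + 1 < n_total and pooled[j + 1] == pooled[i]:
            j += 1
        rank_of[pooled[i]] = (i + j) / 2 + 1
        i = j + 1
    rank_sums = [sum(rank_of[v] for v in g) for g in groups]
    h = (12 / (n_total * (n_total + 1))) * sum(
        rs * rs / len(g) for rs, g in zip(rank_sums, groups)
    ) - 3 * (n_total + 1)
    # Standard tie correction
    ties = sum(pooled.count(v) ** 3 - pooled.count(v) for v in set(pooled))
    return h / (1 - ties / (n_total ** 3 - n_total))

# Hypothetical bowel-preparation ratings (1-5 Likert) for 3 groups
h = kruskal_wallis_h([3, 4, 4, 3, 5, 4, 3, 4, 4],
                     [4, 3, 4, 4, 3, 5, 4, 3, 4],
                     [3, 4, 3, 4, 4, 4, 5, 3, 4])
print(h < 5.991)  # True: below the chi-square critical value (df = 2)
```

With near-identical rating distributions, H stays far below the chi-square critical value, mirroring the nonsignificant group differences reported here.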

GAGES Scores

A statistically significant improvement in the GAGES-C total score and in each of its subcomponents (navigation, strategies, clear lumen, and quality of examination) was observed from pretest to posttest when the analysis was performed for all groups combined (Wilcoxon signed-rank test, all p ≤ 0.001). When subgroup analysis was conducted, trainees using the GI Mentor II or both simulators showed significant improvement from pretest to posttest in GAGES-C total score (Wilcoxon signed-rank test, p = 0.017 and 0.024, respectively). Similarly, residents who trained using the Kyoto simulator exclusively demonstrated improved GAGES-C performance from pretest to posttest; however, this improvement was not significant (Wilcoxon signed-rank test, p = 0.072; Fig. 1). An analysis of the 4 subcomponents selected from GAGES-C showed that residents training with the GI Mentor II by itself or in combination with the Kyoto physical model simulator had significant improvement in their “scope navigation” skills. Additionally, significant improvement in the “quality of examination” was observed in trainees assigned to the GI Mentor II condition, while interns training with both simulators showed better “use of strategies” at posttest. Interestingly, the only GAGES-C subitem that showed improvement for residents assigned to the Kyoto simulator was the “ability to keep a clear endoscopic field” (Wilcoxon signed-rank test, p = 0.025; Table 3). Despite these findings, no single training condition was shown to be superior to the others in total GAGES-C score or in any of its subcomponents when Kruskal-Wallis analysis was performed.

Total Colonoscopy Time, Time to Reach the Cecum, and Percentage With a Clear Lumen

PGY-1 residents assigned to the GI Mentor II had faster total colonoscopy times and took less time to reach the cecum; however, they spent less time with a clear endoscopic lumen during the posttest, illustrating a speed-accuracy trade-off. The opposite pattern was observed in the other 2 conditions, with the exception that interns who trained with the Kyoto simulator spent more time with a clear lumen during the posttest; however, this finding did not reach statistical significance (analysis of variance, p = 0.084). This latter finding is consistent with the results obtained using the GAGES-C score in the same training condition (Table 4).

Faculty Intervention

Faculty intervention was calculated by subtracting the number of times faculty took control of the colonoscope to perform an endoscopic intervention from the number of times faculty took full control of the endoscopic procedure because the resident could not progress. No statistically significant difference was found among groups during the pretest or posttest (chi-square test, p = 0.84 and 0.36, respectively). These results show a similar degree of intervention among training conditions.

Survey Data

FIGURE 1. GAGES score: pretest vs posttest.

Trainees described the online modules as moderately to somewhat useful. Independent of training condition,


TABLE 3. GAGES Total Score and Subitem Analysis: Pretest vs Posttest

Simulator      GAGES Total   Navigation   Strategies   Clear Lumen   Quality Examination
GI Mentor II   p = 0.017     p = 0.014    NS           NS            p = 0.034
Kyoto          NS            NS           NS           p = 0.025     NS
Both           p = 0.024     p = 0.038    p = 0.014    NS            p = 0.053
All combined   p < 0.001     p = 0.001    p = 0.001    p = 0.001     p = 0.001

NS, not significant.

residents found the simulators to be a useful part of their training. Interestingly, the Kyoto platform was rated as more realistic than the GI Mentor; however, it was preferred less, given its higher level of difficulty and longer setup time in comparison with the virtual reality simulator (Fig. 2). A statistically significant decrease in self-reported anxiety from pretest to posttest was observed among PGY-1 residents assigned to the GI Mentor II or Kyoto groups (Wilcoxon signed-rank test, p = 0.014 and 0.038, respectively). Additionally, self-reported performance improvement was noted at posttest for the same training conditions (p = 0.005 and 0.008, respectively). However, these differences were not significant for trainees assigned to both simulators.

TABLE 4. Total Colonoscopy Time, Time to Reach the Cecum, and Time With a Clear Lumen (Minutes): Pretest vs Posttest

Condition                                   GI Mentor II   Kyoto   Both Simulators
Pretest total colonoscopy time (median)     27.7           26.8    22
Posttest total colonoscopy time (median)    23.9           28.2    23.7
Pretest time to reach the cecum (median)    16.8           11.3    10.1
Posttest time to reach the cecum (median)   14.8           13.7    19
Pretest time with a clear lumen (median)    16.6           13.7    19
Posttest time with a clear lumen (median)   15.9           15.2    17.3

DISCUSSION

The current study compared 3 different training modalities using 2 different platforms and evaluated the transfer of skills acquired in the simulation laboratory to clinical practice. A gain in skills from pretest to posttest, as measured by the GAGES-C total score, was noted for all 3 training groups; however, it was statistically significant only for PGY-1 residents assigned to the conditions in which the GI Mentor II was used exclusively or in combination with the Kyoto model. Residents assigned to train exclusively with the GI Mentor II had significant pretest-to-posttest improvements in 2 of the GAGES-C subcomponents (navigation and quality of examination). Residents assigned to train with the GI Mentor and Kyoto showed significant improvements in 3 of the GAGES-C subcomponents (navigation, strategies, and quality of examination). In contrast, residents assigned to train exclusively with the Kyoto simulator showed significant improvement in only 1 GAGES-C subcomponent (clear lumen).

Residents who trained with the Kyoto exclusively or in combination with the GI Mentor II required more time to reach the cecum on the posttest than on the pretest. This finding could be owing to the fact that residents assigned to these conditions performed fewer practice trials than residents assigned to train exclusively with the GI Mentor II, as illustrated in Table 2. This, in turn, could be because the Kyoto simulator requires significantly more preparation before usage. With the Kyoto simulator, one must rearrange templates and route the colon before starting a training exercise; in contrast, the GI Mentor II only requires the trainee to push the power button and select an exercise from an on-screen menu. Furthermore, the Kyoto simulator requires the trainee to fill the colon with a mixture of water and lubricant and to apply lubricating gel to the endoscope. This lubricant has a tendency to escape the simulated rectum and is quite messy; the GI Mentor II does not present this inconvenience. Finally, the Kyoto simulator cannot provide an instant performance summary, whereas the GI Mentor II provides a summary that includes time to cecum, total time, percentage of time with a clear view of the lumen, and instances of applying too much force. These limitations of the Kyoto simulator may explain why these residents practiced less with it.

No training condition was shown to be superior to the others in the total GAGES-C score or any of its subcomponents. Interestingly, self-reported levels of anxiety and posttest performance were significantly improved among trainees using the GI Mentor II by itself but not in combination with the Kyoto platform. A possible explanation for this finding is that trainees who used both simulators might have felt overwhelmed by the number of exercises they needed to complete before their posttesting session. By nature of the randomization, trainees assigned to both simulators had an overall higher number of exercises than


FIGURE 2. Survey results: usefulness, realism, and difficulty between simulators.

their counterparts. One might assume they would outperform participants in the Kyoto or GI Mentor training conditions and thus feel more confident about their skills, when in fact no statistical difference was found, suggesting that more trials are not necessarily better. Dawe et al.19 performed a systematic review of 10 studies investigating the effects of multiple endoscopic simulators (AccuTouch,20-23 GI Mentor,24,25 Gastro-Sim,26 KAIST-Ewha colonoscopy simulator II,27 Olympus ENDO TS-1 colonoscopy simulator,28 and a sigmoidoscopy simulator29) on transfer of skills. Improved overall performance,21-23 completion of colonoscopy cases in a significantly shorter time,19,25 and less patient discomfort20,22,23 were shown among trainees assigned to a simulation environment before actual clinical exposure. Nonetheless, all these studies used computer-based virtual reality platforms, and no comparisons were made regarding the use of less sophisticated and more economical physical model simulators. To date, there has been no empirical comparison of the effectiveness of endoscopic physical model and virtual reality simulators. Diesen et al.30 found that laparoscopic box trainers and virtual reality simulators were equally effective means of teaching laparoscopic skills (camera navigation, instrument handling, object positioning, dissection, ligation, suturing, and knot tying) to novice learners when exposed to a live porcine model at 3 different time points (0, 2, or 6 mo). This is a significant finding, especially for residency training programs uncertain of which platform can be successfully implemented to teach the Fundamentals of Laparoscopic Surgery program,31-33 which was developed by the Society of American Gastrointestinal and Endoscopic Surgeons and endorsed by the ABS.
Hill et al.34 performed a comparative analysis of the commercially available colonoscopy simulators using a 59-item instrument (Colonoscopy Simulator Realism Questionnaire) administered to 19 experts in the field to assess each device's level of fidelity. The GI Mentor II was found to be significantly less realistic than the Kyoto simulator but had a similar degree of difficulty as the other simulators. The first observation was shared among our trainees when

asked to score on a 5-point Likert scale the level of realism of their assigned simulators. However, PGY-1 residents in our study found the Kyoto platform to be more difficult than a real colonoscopy. Additionally, the GI Mentor and other virtual platforms were shown to be easier to set up and transport, and to provide immediate feedback, compared with their physical model counterparts. Again, our trainees reported similar findings regarding the use of the Kyoto. Specifically, they found it to be less practical, given the need to lubricate the system intermittently, change the different templates for the 6 available modules, and manually record their performance data instead of receiving immediate software-produced feedback as on the virtual platform. Despite these findings, Hill et al.34 concluded that their results could not suggest a clear “first-choice” simulator to training program directors faced with the dilemma of which device to purchase, given the significant difference in cost between them.

From our experience, the GI Mentor II should be considered the preferred platform for training surgical residents in endoscopic skills. In spite of the apparent cost difference, it is important to recognize that physical model platforms require additional equipment, including a real colonoscope, which adds extra expense and can almost match the price of the virtual reality counterpart. The GI Mentor II allows practicing not only lower but also upper GI endoscopy and can potentially include bronchoscopy modules with the purchase of an extension package. Additionally, this simulator was recently selected as the official platform for the Fundamentals of Endoscopic Surgery program technical skills assessment,35-37 which has recently become a mandatory certification, like Fundamentals of Laparoscopic Surgery, for upcoming generations of general surgeons. Furthermore, it is worth mentioning that from the trainees' perspective, there seems to be a preference for the virtual reality platform, as its user interface, immediate feedback on completion of each module, and versatility of cases are more appealing than those of the physical model counterpart.

LIMITATIONS

In our study, no training condition was shown to be superior in the total GAGES-C score or any of its subcomponents. Our trainees were similarly distributed among groups on the basis of sex, age, hand dexterity, and training days. Their patients' characteristics regarding case difficulty and degree of bowel preparation were also similar among groups. Given the nature of the conditions, trainees assigned to both simulators completed a higher number of total practice trials than residents under the physical simulator condition did (40 vs 15, respectively). Likewise, residents who were assigned to train exclusively with the GI Mentor II or in combination with the Kyoto model trained to proficiency in the Endo-Bubble 1 and


Endo-Bubble 2 exercises based on prior expert-derived proficiency metrics.12 These factors could have contributed to the improved pretest-to-posttest performance observed in our results. The lack of a control group that was not exposed to simulator training prevents us from showing whether the observed gain in skills resulted from simulation practice rather than from mere familiarization with the equipment and procedure during the pretest. Although attempts were made to include such a group in our analysis, scheduling issues and patient availability precluded us from integrating this fourth training condition into our study. GAGES has been shown to clearly differentiate levels of expertise in technical skills performance.7,8 Nonetheless, it is important to recognize that it has some intrinsic limitations. In our study, we used a single, nonblinded reviewer for the pretest and posttest assessments in real time. Although the reviewer was blinded to the training condition, they were aware of the trainee's preassessment vs postassessment status. Likewise, the pretraining and posttraining scores were based on single-trial data rather than averages across multiple trials. Logistical limitations (time in the endoscopy suite, availability of trainee or staff or both, and appropriateness of patient selection) were the reasons for this. In addition, patient-related factors (e.g., degree of case difficulty and quality of bowel preparation) can affect endoscopic performance independently of inherent endoscopic skill. Thus, an individual's final score might not reflect the subject's actual endoscopic acumen. In our analysis, we did not find a difference in case difficulty or colon cleanliness from pretest to posttest, which allowed us to attribute the gain in skills to the training condition to which residents were assigned, as opposed to varying patient conditions.

CONCLUSION

The skills acquired on endoscopic simulators do transfer to the real world. We demonstrated significant performance gains for residents who trained using the GI Mentor exclusively and for those who trained using the GI Mentor in conjunction with the Kyoto. However, our analyses did not show superiority of any single training condition. Despite these findings, it is important to take trainee feedback and simulator preference into consideration; otherwise, the simulation laboratory may acquire devices that ultimately are not used. We have directed our trainees to practice with the GI Mentor II virtual reality platform to the established proficiency benchmarks for Endo-Bubble I and Endo-Bubble II and to complete the 10 colonoscopy cases. Additionally, we recommend the use of the Kyoto physical model simulator for residents more advanced in their

endoscopic training, given its higher level of complexity and difficulty.

REFERENCES

1. Seeff LC, Richards TB, Shapiro JA, et al. How many endoscopies are performed for colorectal cancer screening? Results from CDCʼs survey of endoscopic capacity. Gastroenterology. 2004;127(6):1670-1677.

2. Eisen GM, Baron TH, Dominitz JA, et al. Methods of granting hospital privileges to perform gastrointestinal endoscopy. Gastrointest Endosc. 2002;55(7):780-783.

3. The American Board of Surgery. ABS Statement on GI Endoscopy. 2011. Available at: 〈http://www.absurgery.org/default.jsp?newsgiresponse#GI%20Societies/〉. Accessed 3.3.14.

4. Wexner SD, Garbus JE, Singh JJ; SAGES Colonoscopy Study Outcomes Group. A prospective analysis of 13,580 colonoscopies. Reevaluation of credentialing guidelines. Surg Endosc. 2001;15(3):251-261.

5. Accreditation Council for Graduate Medical Education. Memorandum: changes in minimum requirements for laparoscopy and endoscopy. 2006. Available at: 〈http://www.dconnect.acgme.org/acWebsite/RRC_440/440_policyArchive.asp〉.

6. Cass OW, Freeman ML, Peine CJ, Zera RT, Onstad GR. Objective evaluation of endoscopy skills during training. Ann Intern Med. 1993;118(1):40-44.

7. Vassiliou MC, Kaneva PA, Poulose BK, et al. Global Assessment of Gastrointestinal Endoscopic Skills (GAGES): a valid measurement tool for technical skills in flexible endoscopy. Surg Endosc. 2010;24(8):1834-1841.

8. Vassiliou MC, Kaneva PA, Poulose BK, et al. How should we establish the clinical case numbers required to achieve proficiency in flexible endoscopy? Am J Surg. 2010;199(1):121-125.

9. Ericsson KA, Krampe RT, Tesch-Romer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev. 1993;100(3):363-406.

10. Bittner JG 4th, Marks JM, Dunkin BJ, Richards WO, Onders RP, Mellinger JD. Resident training in flexible gastrointestinal endoscopy: a review of current issues and options. J Surg Educ. 2007;64(6):399-409.

11. Morales MP, Mancini GJ, Miedema BW, et al. Integrated flexible endoscopy training during surgical residency. Surg Endosc. 2008;22(9):2013-2017.

12. Van Sickle KR, Buck L, Willis R, et al. A multicenter, simulation-based skills training collaborative using shared GI Mentor II systems: results from the Texas Association of Surgical Skills Laboratories (TASSL) flexible endoscopy curriculum. Surg Endosc. 2011;25(9):2980-2986.

13. Van Sickle KR, Ritter EM, Smith CD. The pretrained novice: using simulation-based training to improve learning in the operating room. Surg Innov. 2006;13(3):198-204.

14. Markman HD. A new system for teaching proctosigmoidoscopic morphology. Am J Gastroenterol. 1969;52(1):65-69.

15. Dunkin BJ. Flexible endoscopy simulators. Semin Laparosc Surg. 2003;10(1):29-35.

16. Gerson LB, Van Dam J. Technology review: the use of simulators for training in GI endoscopy. Gastrointest Endosc. 2004;60(6):992-1001.

17. Desilets DJ, Banerjee S, Barth BA, et al. Endoscopic simulators. Gastrointest Endosc. 2011;73(5):861-867.

18. Triantafyllou K, Lazaridis LD, Dimitriadis GD. Virtual reality simulators for gastrointestinal endoscopy training. World J Gastrointest Endosc. 2014;6(1):6-12. http://dx.doi.org/10.4253/wjge.v6.i1.6 [January 16, 2014].

19. Dawe SR, Windsor JA, Broeders JA, Cregan PC, Hewett PJ, Maddern GJ. A systematic review of surgical skills transfer after simulation-based training: laparoscopic cholecystectomy and endoscopy. Ann Surg. 2014;259(2):236-248.

20. Ahlberg G, Hultcrantz R, Jaramillo E, Lindblom A, Arvidsson D. Virtual reality colonoscopy simulation: a compulsory practice for the future colonoscopist? Endoscopy. 2005;37(12):1198-1204.

21. Park J, MacRae H, Musselman LJ, et al. Randomized controlled trial of virtual reality simulator training: transfer to live patients. Am J Surg. 2007;194(2):205-211.

22. Sedlack RE, Kolars JC, Alexander JA. Computer simulation training enhances patient comfort during endoscopy. Clin Gastroenterol Hepatol. 2004;2(4):348-352.

23. Sedlack RE, Kolars JC. Computer simulator training enhances the competency of gastroenterology fellows at colonoscopy. Am J Gastroenterol. 2004;99(1):33-37.

24. Cohen J, Cohen SA, Vora KC, et al. Multicenter, randomized, controlled trial of virtual-reality simulator training in acquisition of competency in colonoscopy. Gastrointest Endosc. 2006;64(3):361-368.

25. Shirai Y, Yoshida T, Shiraishi R, et al. Prospective randomized study on the use of a computer-based endoscopic simulator for training in esophagogastroduodenoscopy. J Gastroenterol Hepatol. 2008;23(7 Pt 1):1046-1050.

26. Tuggy ML. Virtual reality flexible sigmoidoscopy simulator training: impact on resident performance. J Am Board Fam Pract. 1998;11(6):426-433.

27. Yi SY, Ryu KH, Na YJ, et al. Improvement of colonoscopy skills through simulation-based training. Stud Health Technol Inform. 2008;132:565-567.

28. Haycock A, Koch AD, Familiari P, et al. Training and transfer of colonoscopy skills: a multinational, randomized, blinded, controlled trial of simulator versus bedside training. Gastrointest Endosc. 2010;71(2):298-307.

29. Gerson LB, Van Dam J. A prospective randomized trial comparing a virtual reality simulator to bedside teaching for training in sigmoidoscopy. Endoscopy. 2003;35(7):569-575.

30. Diesen DL, Erhunmwunsee L, Bennett KM, et al. Effectiveness of laparoscopic computer simulator versus usage of box trainer for endoscopic surgery training of novices. J Surg Educ. 2011;68(4):282-289.

31. Fried GM, Feldman LS, Vassiliou MC, et al. Proving the value of simulation in laparoscopic surgery. Ann Surg. 2004;240(3):518-525.

32. Peters JH, Fried GM, Swanstrom LL, et al. Development and validation of a comprehensive program of education and assessment of the basic fundamentals of laparoscopic surgery. Surgery. 2004;135(1):21-27.

33. Vassiliou MC, Ghitulescu GA, Feldman LS, et al. The MISTELS program to measure technical skill in laparoscopic surgery: evidence for reliability. Surg Endosc. 2006;20(5):744-747.

34. Hill A, Horswill MS, Plooy AM, et al. Assessing the realism of colonoscopy simulation: the development of an instrument and systematic comparison of 4 simulators. Gastrointest Endosc. 2012;75(3):631-640.

35. Vassiliou MC, Dunkin BJ, Marks JM, Fried GM. FLS and FES: comprehensive models of training and assessment. Surg Clin North Am. 2010;90(3):535-558.

36. Vassiliou MC, Dunkin BJ, Fried GM, et al. Fundamentals of endoscopic surgery: creation and validation of the hands-on test. Surg Endosc. 2014;28(3):704-711.

37. Hazey JW, Marks JM, Mellinger JD, et al. Why fundamentals of endoscopic surgery (FES)? Surg Endosc. 2014;28(3):701-703.

Journal of Surgical Education  Volume 72/Number 2  March/April 2015
