Copyright © 2014 by The Journal of Bone and Joint Surgery, Incorporated

Improving Residency Training in Arthroscopic Knee Surgery with Use of a Virtual-Reality Simulator
A Randomized Blinded Study

W. Dilworth Cannon, MD, William E. Garrett Jr., MD, PhD, Robert E. Hunter, MD, Howard J. Sweeney, MD, Donald G. Eckhoff, MD, MS, Gregg T. Nicandri, MD, Mark R. Hutchinson, MD, Donald D. Johnson, MD, FRCS, Leslie J. Bisson, MD, Asheesh Bedi, MD, James A. Hill, MD, Jason L. Koh, MD, and Karl D. Reinig, PhD

Investigation performed at the University of California, San Francisco, San Francisco, California; Northwestern University, Chicago, Illinois; New York University Hospital for Joint Diseases, New York, NY; Harvard University, Cambridge, Massachusetts; Massachusetts General Hospital, Boston, Massachusetts; University of Minnesota, Minneapolis, Minnesota; University of Iowa, Iowa City, Iowa; and University of Washington, Seattle, Washington

Background: There is a paucity of articles in the surgical literature demonstrating transfer validity (transfer of training). The purpose of this study was to assess whether skills learned on the ArthroSim virtual-reality arthroscopic knee simulator transferred to greater skill levels in the operating room.

Methods: Postgraduate year-3 orthopaedic residents were randomized into simulator-trained and control groups at seven academic institutions. The experimental group trained on the simulator, performing a knee diagnostic arthroscopy procedure to a predetermined proficiency level based on the average proficiency of five community-based orthopaedic surgeons performing the same procedure on the simulator. The residents in the control group continued their institution-specific orthopaedic education and training. Both groups then performed a diagnostic knee arthroscopy procedure on a live patient. Video recordings of the arthroscopic surgery were analyzed by five pairs of expert arthroscopic surgeons blinded to the identity of the residents. A proprietary global rating scale and a procedural checklist, which included visualization and probing scales, were used for rating.

Results: Forty-eight (89%) of the fifty-four postgraduate year-3 residents from seven academic institutions completed the study. The simulator-trained group averaged eleven hours of training on the simulator to reach proficiency. The simulator-trained group performed significantly better than the control group when rated according to our procedural checklist (p = 0.031), including probing skills (p = 0.016) but not visualization skills (p = 0.34). The procedural checklist weighted probing skills at double the weight of visualization skills. The global rating scale failed to reach significance (p = 0.061) because of one extreme outlier. The difference in the duration of the procedure was not significant; this seemed to be related to the fact that residents in the control group were less thorough, which shortened their time to completion of the arthroscopic procedure.

Conclusions: We have demonstrated transfer validity (transfer of training): residents trained to proficiency on a high-fidelity, realistic virtual-reality arthroscopic knee simulator showed a greater skill level in the operating room compared with the control group.

Clinical Relevance: We believe that the results of our study will stimulate residency program directors to incorporate surgical simulation into the core curriculum of their residency programs.

Peer Review: This article was reviewed by the Editor-in-Chief and one Deputy Editor, and it underwent blinded review by two or more outside experts. The Deputy Editor reviewed each revision of the article, and it underwent a final review by the Editor-in-Chief prior to publication. Final corrections and clarifications occurred during one or more exchanges between the author(s) and copyeditors.

Disclosure: One or more of the authors received payments or services, either directly or indirectly (i.e., via his or her institution), from a third party in support of an aspect of this work. In addition, one or more of the authors, or his or her institution, has had a financial relationship, in the thirty-six months prior to submission of this work, with an entity in the biomedical arena that could be perceived to influence or have the potential to influence what is written in this work. No author has had any other relationships, or has engaged in any other activities, that could be perceived to influence or have the potential to influence what is written in this work. The complete Disclosures of Potential Conflicts of Interest submitted by authors are always provided with the online version of the article.

J Bone Joint Surg Am. 2014;96:1798-806. http://dx.doi.org/10.2106/JBJS.N.00058

Simulation of orthopaedic procedures is becoming an increasingly important training and educational modality. In 2013, the Accreditation Council for Graduate Medical Education (ACGME) added a requirement that residency programs include a surgical-skills training curriculum. Also, the public is becoming better educated and will demand better accountability for surgical interventions. Demonstrating surgical skills with use of simulation may become a requirement for certification and/or maintenance of certification for practicing surgeons. Training with simulation should result in residents making fewer surgical errors1,2 and should shorten the time needed to perform surgical procedures at academic institutions, where resident training slows the efficiency of the operating room3-5. Proficiency-based training is likely to become a standard in the near future6. Several published papers have documented construct validity, whereby a surgical simulator can differentiate between surgeons with various levels of surgical skill7-12. To our knowledge, there is only one orthopaedic paper13 that has shown transfer validity (transfer of training14), in which surgeons trained on a simulator demonstrated a higher skill level in the operating room than those not trained on a simulator. In general surgery, there are more papers showing transfer of training to surgical procedures2,14-20. Other authors have shown transfer validity from simulators to procedures on animal and cadaveric specimens21-23.

In this study, we tested a simulation-based educational model against traditional training in residency programs. We hypothesized that postgraduate year (PGY)-3 orthopaedic residents trained on a high-fidelity virtual-reality arthroscopic knee simulator to perform a diagnostic arthroscopic procedure in the knee would complete the same procedure on an actual live patient with greater thoroughness and accuracy and in less time compared with a control group of PGY-3 residents learning and practicing arthroscopic skills with only traditional methods of instruction at their respective institutions. The traditional training typically includes learning by observation and apprenticeship as well as practice on cadaveric specimens and knee models.

Materials and Methods

Simulator

The ArthroSim (Touch of Life Technologies, Aurora, Colorado) virtual-reality arthroscopic knee simulator (Fig. 1) was used for this study. The virtual-reality knee joint was based on segmentation of approximately two thousand 0.1-mm cryosections of a right knee from the cadaver of a twenty-eight-year-old. Both the simulated arthroscope and probe were provided with haptics (a tactile feedback technology that takes advantage of the sense of touch by applying forces, vibrations, or motions to the user), allowing six degrees of motion and three degrees of force (Geomagic, Morrisville, North Carolina). The surrogate right lower extremity had a range of motion from 0° to 90° of flexion and was sensitive to valgus and varus forces measured by the degrees of medial and lateral compartment opening24. The simulator was neither upgraded nor changed during the course of the study.

Study Design

The American Academy of Orthopaedic Surgeons (AAOS) sent out letters to all orthopaedic residency programs in the United States with at least six PGY-3 residents, inviting them to participate in this study. Seven institutions responded positively, demonstrating an interest in participation. The names of these institutions, the number of residents in their programs, and the number of residents who actually completed the study are shown in Table I. At each site, the lead author, simulator inventor, and AAOS facilitator introduced both the study and the simulator to the PGY-3 residents and orthopaedic staff. Also at each site, the PGY-3 residents were randomly assigned to either the simulator-trained (experimental) or non-simulator-trained (control) group. The assignment was done at a central site by pulling numbers out of a box. All participants completed a demographic questionnaire (see Appendix) and performed a hand-eye coordination test during orientation. These data were used to determine whether the randomization process resulted in equivalent groups in terms of previous-experience variables that might bias the results of the study.

Fig. 1. Photograph of the ArthroSim virtual-reality arthroscopic knee simulator.


TABLE I Academic Institutions Participating in the Validation Study

Academic Institution | No. of Available PGY-3 Residents Who Participated | Institutional Facilitators
University of California, San Francisco | 6 of 6 | W. Dilworth Cannon, MD; C. Benjamin Ma, MD
Northwestern University | 9 of 9 | James A. Hill, MD; Geoffrey White
New York University Hospital for Joint Diseases | 10 of 12 | Laith M. Jazrawi, MD; Brian Pahk
Harvard University, Massachusetts General Hospital | 8 of 12 | Dinesh Patel, MD; Sarah Haddad; Katherine Redford
University of Minnesota | 7 of 7 | Ann E. Van Heest, MD; Conrad Lindquist
University of Iowa | 6 of 6 | Brian R. Wolf, MD
University of Washington* | 2 of 2 | John R. Green, MD
Total | 48 of 54 |

*Data from six residents at the University of Washington were not included because of an irregularity in the institutional review board application.

Participants were then given a general orientation to the study and were provided with a handbook outlining the logistics to be followed by both the simulator-trained and non-simulator-trained groups. To ensure that both groups would have similar access to similar cognitive information related to knee arthroscopy, all participants were required to watch a fifteen-minute video depicting the diagnostic knee arthroscopy procedure on a live patient. The handbook included a paper describing the procedure and a list detailing the procedural tasks. Residents were able to access these educational materials throughout the study and were told to watch the video and read the written material multiple times. During the study, participants in both groups continued their institution-specific orthopaedic education and training. The simulator-trained group used the simulator to practice the techniques of knee arthroscopy. The simulator's mentor program, displayed on the left monitor screen, presented a curriculum for learning the diagnostic knee arthroscopy procedure defined by the authors of this paper. The mentor program provided video clips of live surgery explaining how the procedure should be done on the simulator for each task. Participants were required to complete each task of the visualization and probing procedures and score 100% before proceeding to the next task. Table II lists the visualization and probing tasks for this procedure. The mentor program included four rounds of training. During the first round, a hint screen (showing the correct arthroscope position and direction of view) and the flexion angle and valgus and varus opening were available to the participant, and time for completing the task was unlimited. The second round required completion of the tasks without the hint pictures, flexion angle, or valgus/varus opening, and time still was unlimited.

TABLE II Rating Checklist of Visualized and Probed Structures*

Visualized Structures
- Suprapatellar pouch: medial and lateral recesses
- Patella: medial and lateral facets
- Trochlear groove: medial and lateral sides
- Patellofemoral articulation
- Medial recess†
- Medial femoral condyle articular cartilage, including at 70° to 90° of knee flexion
- Medial tibial plateau articular cartilage
- Medial meniscus: posterior, middle, and anterior‡ thirds
- Posterior‡ and anterior cruciate ligaments
- Lateral femoral condyle articular cartilage, including at 70° to 90° of knee flexion
- Lateral tibial plateau articular cartilage
- Lateral meniscus: posterior, middle, and anterior‡ thirds
- Lateral gutter, including popliteus hiatus

Probed Structures
- Medial meniscus: posterior, middle, and anterior† thirds
- Medial femoral condyle articular cartilage, including at 70° to 90° of knee flexion
- Medial tibial plateau articular cartilage
- Posterior‡ and anterior cruciate ligaments
- Lateral meniscus: posterior, middle, and anterior‡ thirds
- Lateral femoral condyle articular cartilage, including at 70° to 90° of knee flexion
- Lateral tibial plateau articular cartilage
- Posteromedial compartment (visualized)‡ and posterior horn medial meniscus (probed)
- Posterolateral compartment (visualized)‡ and posterior horn lateral meniscus (seen)

*Visualized and probed structures were rated on a quantitative adjectival scale of 0 to 3, in which 0 = not seen or probed, 1 = poorly visualized or probed, 2 = adequately visualized or probed, and 3 = well visualized or probed. †The medial recess was eliminated from the final assessment because of confusion among the raters as to its anatomic limits. ‡Visualization and/or probing of the structures marked with a double dagger was included in the simulator training but was not evaluated during surgery on live patients.


TABLE III Global Ratings of Arthroscopic Performance

There was iteration of movement.
0: There was gross iteration of movement.
1: There was moderate iteration of movement.
2: There was mild iteration of movement.
3: There were a few iterations of movement during the examination, but otherwise it was effectively carried out.
4: There was no iteration of movement.

The surgeon was gentle with the use of the arthroscope and instruments.
0: The surgeon caused significant articular cartilage injury with the arthroscope or probe.
1: The surgeon caused moderate intra-articular injury with the arthroscope or probe.
2: The surgeon caused mild intra-articular injury with the arthroscope or probe.
3: The surgeon caused minor intra-articular injury with the arthroscope or probe.
4: The surgeon was gentle with the use of the arthroscope and probe.

The diagnostic examination of the knee was thorough and complete with and without the probe.
0: Less than 70% of examination was accomplished.
1: Between 70% and 80% of examination was accomplished.
2: Between 80% and 90% of examination was accomplished.
3: Between 90% and 95% of examination was accomplished.
4: Between 95% and 100% of examination was accomplished.

The skill at triangulation between arthroscope and probe.
0: The skill at triangulation between arthroscope and probe was poor, with the surgeon occasionally spending more than one minute locating the introduced instrument.
1: The skill at triangulation between arthroscope and probe was fair, with the surgeon occasionally spending more than twenty seconds locating the introduced instrument.
2: The skill at triangulation between arthroscope and probe was good, but occasionally the surgeon spent more than fifteen seconds locating the introduced instrument.
3: The skill at triangulation between arthroscope and probe was good.
4: The skill at triangulation between arthroscope and probe was excellent.

The 30° arthroscope was aligned properly in order to maximally visualize the structure that the surgeon was examining.
0: The 30° arthroscope was misaligned, and visualization was poor.
1: The 30° arthroscope was moderately misaligned, and visualization was not adequate.
2: The 30° arthroscope was misaligned frequently, and visualization of the anatomic structures was incomplete and only fair.
3: The 30° arthroscope was occasionally aligned poorly, but visualization of anatomic structures was adequate.
4: The 30° arthroscope was aligned properly in order to maximally visualize anatomic structures.

The orientation of the arthroscopic video camera was correct.
0: The orientation of the arthroscopic video camera was poor, and visualization was poor. The camera was consistently out of plane, with resulting loss of any effective visualization.
1: The orientation of the arthroscopic video camera was moderately out of a vertical orientation, and visualization was not adequate.
2: The arthroscopic video camera was frequently not in a vertical orientation, and visualization of the anatomic structures was incomplete and only fair.
3: The arthroscopic video camera was occasionally not vertical, but visualization of anatomic structures was adequate.
4: The arthroscopic video camera was aligned vertically in order to maximally visualize anatomic structures.

The tip of the arthroscope was positioned correctly throughout the examination.
0: The tip of the arthroscope was grossly malpositioned during the examination.
1: The tip of the arthroscope was moderately malpositioned during the examination, and anatomic structures were moderately incompletely visualized.
2: The tip of the arthroscope was mildly malpositioned during the examination, and anatomic structures were incompletely visualized.
3: The tip of the arthroscope was not quite positioned correctly, but anatomic structures were completely visualized.
4: The tip of the arthroscope was positioned correctly throughout the examination.


The third round required completion of the tasks within a set time that was 150% of the average time taken by a group of five community-based orthopaedic surgeons to complete them. To be selected, the community-based orthopaedic surgeons had to average fifty to 125 knee arthroscopy procedures annually. The final round entailed going through all of the tasks of the visualization part of the procedure with the timer running and required a proficiency score of at least 83% of the average proficiency score of the five community-based orthopaedic surgeons. Once this was completed, the same sequence was followed for the probing tasks. Once a resident in the simulator-trained group reached the predetermined level of proficiency on the simulator, fourteen days were allotted for that resident to perform a diagnostic knee arthroscopy procedure on a live patient under the supervision of an attending surgeon. Since it is known that there will be some degradation of a newly acquired skill with time, especially in novices25,26, failure to complete the surgery within fourteen days meant that the resident had to return to the simulator and retrain to proficiency. The attending surgeons were blinded to the residents' group assignments. After the first resident in the simulator group at a given site completed the training, residents in the control group were allowed to perform the surgery on a live patient. Residents were given twenty-five minutes to complete the surgery. If a resident exceeded that time, the attending surgeon took over the surgery and the resident was given scores of zero for the unfinished tasks. All arthroscopic surgery was recorded on DVDs, which were sent to the AAOS for later evaluation and scoring. All seven institutions had institutional review board approval for this study, and all subjects gave informed consent to participate.
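For illustration, the time and proficiency cutoffs described above can be expressed as a simple calculation over the reference surgeons' simulator results. The following Python sketch is not part of the study software or the simulator; the data values, variable names, and functions are hypothetical and only demonstrate the 150%-of-average-time and 83%-of-average-proficiency criteria.

```python
# Minimal sketch (hypothetical data): deriving the round-3 time limit and the
# round-4 proficiency cutoff from the five community-based surgeons' results.
from statistics import mean

# Hypothetical reference data from the five community-based orthopaedic surgeons
reference_times_min = [9.5, 11.0, 8.75, 10.25, 12.0]   # minutes to complete the simulator tasks
reference_proficiency = [92.0, 88.5, 95.0, 90.0, 86.5]  # simulator proficiency scores (%)

time_limit_min = 1.50 * mean(reference_times_min)        # round 3: within 150% of the average time
proficiency_cutoff = 0.83 * mean(reference_proficiency)  # round 4: at least 83% of the average score

def passes_round_3(task_time_min: float) -> bool:
    """Round 3: tasks completed within the derived time limit."""
    return task_time_min <= time_limit_min

def passes_round_4(proficiency_score: float) -> bool:
    """Round 4: timed run meeting the derived proficiency cutoff."""
    return proficiency_score >= proficiency_cutoff

print(f"Time limit: {time_limit_min:.1f} min; proficiency cutoff: {proficiency_cutoff:.1f}%")
```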

Treatment Description

Evaluation Instruments

Arthroscopy Checklist

The Content Development Group (five members drawn from the AAOS project team on virtual-reality surgical simulation) created an evaluation checklist for residents' knee arthroscopy performance on a live patient (Table II). The checklist had two sections: one for visualization tasks and one for probing tasks. Seven experts were added to the Content Development Group, bringing the total to twelve, to pilot-test the form by blindly evaluating five recorded live-patient procedures performed by two PGY-4 residents, two fellows, and an attending surgeon. With use of videoconferencing, the Content Development Group evaluated each checklist item for reliability and accuracy in rating video recordings of patient procedures. Checklist modification followed each pilot-test experience. A complete item analysis was performed on the study's data to ensure that all items operated in an appropriate manner. In its final form, the procedural checklist consisted of twenty-one items: twelve related to the skill of visualization and nine related to the skill of probing. Several items were eliminated for a variety of reasons, as explained in the Discussion. All procedural items were rated on a quantitative adjectival scale of 0 to 3, as shown in Table II. The five pilot-test DVDs were then re-scored with use of this final form, which successfully distinguished between the performances of attending surgeons, fellows, and PGY-4 residents (p = 0.017), thereby providing evidence of construct validity. For the study, the two sections of the checklist (visualization and probing) were combined into a composite procedural score in a 1-to-2 weighted ratio, as the probing skill was judged to have greater importance and difficulty7.
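For illustration, the 1-to-2 weighting of visualization and probing can be expressed as follows. This Python sketch is not the study's scoring software; the item ratings and function names are hypothetical, and it assumes each section's 0-to-3 item ratings are first converted to a percentage of that section's maximum before the weighting is applied.

```python
# Minimal sketch (assumed normalization): composite procedural score that
# weights the probing section twice as heavily as the visualization section.
from typing import Sequence

def section_percent(item_ratings: Sequence[int]) -> float:
    """Convert a section's 0-3 item ratings to a percentage of its maximum."""
    if not all(0 <= r <= 3 for r in item_ratings):
        raise ValueError("each checklist item must be rated 0, 1, 2, or 3")
    return 100.0 * sum(item_ratings) / (3 * len(item_ratings))

def composite_procedural_score(visualization: Sequence[int], probing: Sequence[int]) -> float:
    """Combine the two sections in a 1-to-2 (visualization:probing) weighted ratio."""
    return (1 * section_percent(visualization) + 2 * section_percent(probing)) / 3

# Example with twelve visualization items and nine probing items (hypothetical ratings)
vis = [3, 2, 3, 3, 2, 3, 3, 2, 3, 3, 2, 3]
probe = [2, 3, 2, 2, 3, 3, 2, 2, 3]
print(f"Composite procedural score: {composite_procedural_score(vis, probe):.1f}")
```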


TABLE IV Patient Inclusion and Exclusion Criteria

Exclusion criteria
- >25% joint space narrowing on standing radiograph
- Flexion contractures >10° or flexion
