ORIGINAL REPORTS

Evaluation of Sensory and Motor Skills in Neurosurgery Applicants Using a Virtual Reality Neurosurgical Simulator: The Sensory-Motor Quotient

Ben Z. Roitberg, MD, Patrick Kania, MS, Cristian Luciano, PhD, Naga Dharmavaram, and Pat Banerjee, PhD

Section of Neurosurgery, The University of Chicago, Chicago, Illinois

OBJECTIVE: Manual skill is an important attribute for any surgeon. Current methods to evaluate sensory-motor skills in neurosurgical residency applicants are limited. We aim to develop an objective, multifaceted measure of sensory-motor skills using a virtual reality surgical simulator.

DESIGN: A set of 3 tests of sensory-motor function was performed using a 3-dimensional surgical simulator with head and arm tracking, collocalization, and haptic feedback. (1) Trajectory planning: virtual reality drilling of a pedicle. Entry point, target point, and trajectory were scored, evaluating spatial memory and orientation. (2) Motor planning (sequence, timing, and precision): hemostasis in a postresection cavity in the brain. (3) Haptic perception: touching virtual spheres to determine which is the softest of the group, with progressive difficulty. Results were analyzed individually and as a combined score across all the tasks.

SETTING: The University of Chicago Hospital, a tertiary care academic center.

PARTICIPANTS: A total of 95 consecutive applicants interviewed at a neurosurgery residency program over 2 years were offered anonymous participation in the study; in 2 cohorts, 36 participants in year 1 and 27 participants in year 2 (validation cohort) agreed and completed all the tasks. We also tested 10 first-year medical students and 4 first- and second-year neurosurgery residents.

RESULTS: A cumulative score was generated from the 3 tests. The mean score was 14.47 (standard deviation = 4.37), the median score was 13.42, the best score was 8.41, and the worst score was 30.26. Separate analysis of applicants from each of the 2 years yielded nearly identical results. Residents tended to cluster on the better performance side, and first-year students were not different from applicants.

CONCLUSIONS: (1) Our cumulative score measures sensory-motor skills in an objective and reproducible way. (2) Better performance by residents hints at validity for neurosurgery. (3) We were able to demonstrate good psychometric qualities and generate a proposed sensory-motor quotient distribution in our tested population. (J Surg Educ ]:]]]-]]]. © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.)

KEY WORDS: virtual reality, simulation, residency applicants, skill testing

COMPETENCIES: Patient Care

Correspondence: Inquiries to Ben Z. Roitberg, MD, Section of Neurosurgery, Department of Surgery, The University of Chicago, 5841 S. Maryland Ave, Chicago, IL 60637; e-mail: [email protected]

INTRODUCTION

Neurosurgery residency applicants undergo a selection process that includes written applications and interviews. However, there is little opportunity to evaluate the basic sensory and motor skills of an applicant, and medical students have limited opportunity for self-assessment using an objective, independent evaluation. Neurosurgery is a specialty in which sensory-motor skills appear important, yet we continue to rely chiefly on self-selection of applicants. Recent evidence suggests that interest in a surgical specialty compared with a medical specialty does not correlate with actual surgical skills.1,2 Successful identification of applicants who are likely to succeed in surgical fields remains elusive.3 We are not aware of literature specific to neurosurgery, but in general surgery, technical skill deficiency was reported in 8% of residents, accounting for 35% of all performance problems in a program.4 To the best of our knowledge, there has been no published attempt to devise a psychometric system to evaluate sensory-motor skills in applicants to a neurosurgery residency.


In our previous work, we presented a pilot study of our novel method to assess sensory and motor skills relevant to neurosurgery in a reproducible way.5 A total of 3 game-based tasks, which represent core neurosurgical skills, were designed on a virtual reality, 3-dimensional (3D), haptic-enabled simulator. In the pilot study, we demonstrated that the scores on each task, as well as a calculated combined score, had an approximately normal distribution, suggesting that our tests were able to distinguish among participants' abilities and had favorable psychometric properties. This suggested that we had found a set of tasks that are neither too easy nor too difficult and that have a gradual progression of difficulty. However, the number of participants was small, and the variability was relatively large. To provide evidence of reproducibility and expand the number of participants, we validated our approach on a separate group of applicants 1 year later. We also added new pilot data from neurosurgical residents and from more junior medical students.

METHODS

A total of 95 consecutive applicants to the University of Chicago neurosurgery residency program over 2 years were offered anonymous participation in the study, as described in our pilot study.5 Briefly, each participant was assigned a number, and no personal information was recorded anywhere, except an indication of whether the participant was a senior medical student, a first-year medical student, or a resident. Participation was voluntary, and whether an applicant to the program participated was unknown to anybody in the program. Participants were assessed based on raw scores and on how they performed relative to their peers. A total of 3 sensory-motor games were designed. For a detailed description of the tasks, please refer to our pilot study.5 We selected the tasks based on their perceived relevance to neurosurgery; they address 3 aspects of sensory-motor skill that are encountered in neurosurgery. All the tests were performed on a virtual reality surgical simulator with high-resolution 3D image presentation, haptic feedback, and collocalization between the image and the robotic tool.5

Spatial Memory

The goal of this experiment was to measure the trainee's ability to mentally recognize, memorize, and reach the 3D location of an ideal target in the vertebral body relative to the position of the ideal entry point. This was assessed by asking the trainees to understand the spatial relationship of the ideal target to the vertebra and to define an appropriate drilling trajectory from the ideal entry point to the ideal target. The test focuses on the ability to plan and execute a trajectory in 3D space from memory, although hand tremor, knowledge of anatomy, and the overall dexterity of the individual affect precision and score on this test.

At the beginning of the session, participants were shown a 3D animation of a single semitransparent vertebra, including the ideal target, rotating 2 complete 360° turns. After that, the vertebra became opaque, occluding the target point. The Euclidean distance from the drill tip to the target point was computed by the simulator as the measure of performance (lower is better).

Motor Planning

Motor planning (sequence, timing, and precision) was tested through hemostasis in a postresection cavity in the brain. As soon as the simulation started, the virtual bleeding continuously expanded until the corresponding vessels were cauterized by the trainees. The participants were instructed to touch the bleeding points for 2 seconds with the tip of the virtual bipolar coagulator, collocated with the haptic device stylus. The period between successful cauterizations was computed and recorded by the system as the measure of the trainees' performance. The simulation was terminated when the bleeding extended beyond the skull or all vessels had been cauterized. The duration between successful cauterizations was used to compute each participant's performance; a lower score is better on this test.

Haptic Perception

The users were instructed to touch 3 spheres and to determine which sphere was the softest. Once the softest sphere was found, the users pressed a button while touching that sphere. If the response was correct, the test progressed to the next level. If the response was incorrect, all 3 spheres were randomly shuffled, and the users were instructed to try again. A correct response within 3 attempts at each level was required to progress to the next level. If the users selected the correct sphere on their first attempt, 10 points were awarded; if the correct sphere was chosen on the second attempt, 5 points were awarded; and no points were awarded if the correct answer was determined on the third attempt. If the users were unable to arrive at a correct response after the third attempt, the test was terminated. A higher score is better on this test.

Each individual's performance was recorded in Microsoft Excel. In each of the individual tests, performance levels were divided into deciles. The visual appearance of the tests on the screen is shown in Figure 1.
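
As an illustration of the haptic perception scoring rules just described, the following is a minimal sketch in Python. It is not the simulator's code; the function names and the representation of attempts are assumptions made for illustration only.

    # Minimal sketch of the haptic perception scoring rules (illustrative only).
    def score_level(attempts_used):
        """Points awarded at one difficulty level: 1st-attempt success = 10, 2nd = 5, 3rd = 0."""
        return {1: 10, 2: 5, 3: 0}[attempts_used]

    def run_haptic_test(responses_per_level):
        """responses_per_level: for each level, a list of booleans, one per attempt
        (True = participant pressed the button on the softest sphere).
        Returns the cumulative score; three misses at any level terminate the test."""
        total = 0
        for attempts in responses_per_level:
            solved = False
            for i, correct in enumerate(attempts[:3], start=1):
                if correct:
                    total += score_level(i)
                    solved = True
                    break
            if not solved:  # three failed attempts end the test
                break
        return total

    # Example: correct on the 1st try, then on the 2nd try, then fails level 3
    print(run_haptic_test([[True], [False, True], [False, False, False]]))  # -> 15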

The result of this calculation gave us a value for each participant. We then averaged all these values to a mean normalized statistic for the whole population. This normalized statistic, we found, would be better represented in the form of a whole number, such as 100. To make this statistic easier to visualize and understand, we found the constant that we would have to multiply the mean normalized cumulative statistic with to make it equal 100. Then, we multiplied the normalized values across the whole data set to give us an adjusted value for each individual. The adjusted value allowed us to stratify the data into groups by calculating the standard deviation (SD). Each group was organized as greater than or less than 1, 2, and 3 SDs from the mean. To better and easily illustrate the data, the x-axis was renamed to increments of 10, as is illustrated in Figure 5. These increments of 10 illustrated 1 SD away from the mean of 100. SMQ was derived in this manner, because making the presentation of SMQ similar to intelligence quotient provides for a more intuitive interpretation.
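
The normalization, inversion of the haptic score, and equal weighting described above can be sketched as follows (Python). This is one plausible reading of the description, not the authors' implementation: the exact form of the "inverse" and of the final combination is not specified in the text, and the min-max ranges shown are taken from the whole-population results reported later in this article.

    # Sketch of the score combination described above (not the authors' code).
    def minmax(x, lo, hi):
        """Min-max normalization to 0-1: (x - minimum) / (maximum - minimum)."""
        return (x - lo) / (hi - lo)

    def combined_score(spatial_mm, motor_s, haptic_pts, ranges):
        """Equally weighted combination of the three normalized test scores.
        Lower is better for spatial memory (mm) and motor planning (s);
        higher is better for haptic perception, so its normalized value is
        inverted here as 1 - value (an assumption; the paper only states
        that the inverse was used)."""
        s = minmax(spatial_mm, *ranges["spatial"])
        m = minmax(motor_s, *ranges["motor"])
        h = 1.0 - minmax(haptic_pts, *ranges["haptic"])
        return (s + m + h) / 3.0  # lower combined value = better performance

    # Ranges taken from the whole-population minima/maxima reported in the Results
    ranges = {"spatial": (3.6, 70.34), "motor": (5.13, 22.5), "haptic": (35, 130)}
    print(combined_score(spatial_mm=14.0, motor_s=11.0, haptic_pts=80.0, ranges=ranges))

With this reading, a lower combined value indicates better overall performance, consistent with the reported combined raw scores, for which lower is better.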

RESULTS

A total of 63 applicants, 10 first-year medical students, and 4 residents completed all tasks, and their data are included. All the participants were able to see the images in 3D. Our analysis focuses on the 2 cohorts of neurosurgery residency applicants: the year 1 (pilot, n = 33) and the year 2 (validation, n = 30) cohorts.

Separate Evaluation of the Cohorts of Applicants

Haptic perception (points): The year 1 cohort's mean score was 79.85, median score was 80, maximum score was 130, minimum score was 35, and SD was 22.27. The year 2 (validation) cohort's mean score was 81.33, median score was 80, maximum score was 130, minimum score was 50, and SD was 20.42. No significant difference was found between the groups (p = 0.784).

Motor planning (seconds): The year 1 cohort's mean score was 11.02, maximum score was 18.43, minimum score was 5.13, and SD was 3.53. The year 2 (validation) cohort's mean score was 11.96, median score was 11.7, maximum score was 19.2, minimum score was 5.67, and SD was 3.26. No significant difference was found between the groups (p = 0.275).

Spatial memory (mm): The year 1 cohort's mean score was 14.6, median score was 11.72, maximum score was 48.15, and minimum score was 3.61. The year 2 (validation) cohort's mean score was 18.9, median score was 17.28, minimum score was 7.23, and maximum score was 70.34. No significant difference was found between the groups (p = 0.13).
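
The article does not state which statistical test produced these p values. Purely as an illustrative sketch, the following assumes an independent two-sample (Welch) t test on the two cohorts' raw scores; the arrays are placeholders, not study data.

    # Illustrative cohort comparison; the test actually used in the study is not stated.
    import numpy as np
    from scipy import stats

    year1 = np.array([78.0, 85.0, 60.0, 95.0, 80.0])  # placeholder haptic scores, year 1
    year2 = np.array([82.0, 75.0, 90.0, 70.0, 88.0])  # placeholder haptic scores, year 2

    t_stat, p_value = stats.ttest_ind(year1, year2, equal_var=False)  # Welch's t test
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")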



FIGURE 1. Trajectory planning (top), motor planning (middle), and haptic perception (bottom).


FIGURE 2. Spatial memory test results for the entire tested population: distance from the target (x-axis) vs number of participants (y-axis).

Combined results/calculated SMQ: The year 1 cohort's mean score was 94.17, median score was 84.13, and SD was 39.78. The year 2 (validation) cohort's mean score was 103.99, median score was 94.84, and SD was 37.15. No significant difference was found between the groups (p = 0.32).

Results for the Entire Tested Population

Overall, the performance of the applicants was similar in both years; therefore, we combined the results from both years into a single group of all applicants. To those we added data from 10 first-year medical students and 4 residents, to see where the performance of these individuals would fall when compared with the

applicants. The results of the spatial memory test are shown in Figure 2. The distances are shown along the x-axis, and the number of participants is shown along the y-axis. The best score was 3.6, the worst score was 70.34, the mean score was 16.12, the median score was 13.97, and the SD was 10.61. Motor planning test results are shown in Figure 3. The duration is divided into 10 intervals ranging from 0 to 25 seconds, with each interval covering 2.5 seconds. The score indicates the ability to successfully stop a bleeding point and move on to the next one in a timely fashion. The best score was 5.13, the worst score was 22.5, the mean score was 11.7, the median score was 11.11, and the SD was 3.58. Haptic perception test results are shown in Figure 4. An increasing score indicates better performance.

FIGURE 3. Motor planning test results for the entire tested population: time between successful cauterizations (s) vs number of participants.


FIGURE 4. Haptic perception test results for the entire tested population: score distribution (higher is better).

The minimum score was 35, the maximum score was 130, the mean score was 79.87, the median score was 80, and the SD was 20.63. This test measures the ability to differentiate the relative hardness of a surface by palpating it with a virtual reality tool. The combined raw score (Fig. 5) represented a combination of all 3 tests. The variability in the trajectory planning test contributed to a less even distribution of scores. The mean score was 14.47, the best score was 8.41, the worst score was 30.26, the median score was 13.42, and the SD was 4.37.

Sensory-Motor Quotient

We took the data contained in the combined score and recalculated them so that median performance was given a score of 100 and each SD was assigned 10 points. The resulting graph presents the distribution of performance on our combined test within the population in a way that resembles that of intelligence quotient test results. We called this a graph of the "sensory-motor quotient" (Fig. 6).
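
A minimal sketch of this IQ-style rescaling follows (Python). The centering on the median and the sign convention (better performance mapping above 100) are assumptions made for illustration; the Methods section describes anchoring the mean at 100, so the exact anchor used by the authors is ambiguous.

    # IQ-style rescaling sketch: center the combined scores at 100, 10 points per SD.
    # Not the authors' code; centering on the median follows the Results description.
    import numpy as np

    def smq(combined_scores):
        """Map combined raw scores to a sensory-motor quotient-like scale."""
        x = np.asarray(combined_scores, dtype=float)
        center = np.median(x)
        sd = x.std(ddof=1)
        # Lower combined raw score = better performance, so better scores map above 100
        # (the direction of the scale is an assumption).
        return 100.0 - 10.0 * (x - center) / sd

    # Placeholder input built from the reported best/median/mean/worst combined scores
    print(smq([8.41, 13.42, 14.47, 30.26]))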

DISCUSSION

Psychometric evaluation of applicants for jobs and studies is well established. Despite the intuitive importance of evaluating relevant sensory-motor skills in applicants for jobs and training in surgical specialties, we are not aware of any literature on systematic fine motor skill testing of prospective surgeons. However, psychometric motor diagnostics have been attempted for athlete selection.6,7 As in the motor skill tests performed in the context of athlete selection, we saw that sensory-motor skills have a normal distribution in the tested population.

FIGURE 5. Distribution of the combined score in the tested population.


FIGURE 6. Distribution of the sensory-motor quotient (SMQ) in the tested population.

We demonstrated that our score continues to generate a normal distribution in a new cohort, as well as in the combined population, evidence that the SMQ is a reproducible measure of sensory-motor performance in a population and has sound psychometric qualities. Each test and the combined score produced a wide distribution of performance, with mean and median scores quite similar, supporting a normal distribution of the traits measured by our tests. Our data support the face validity of our testing approach. Given the consistency of our results over 2 years of testing, we felt justified in presenting a hypothetical SMQ score that represents a particular subject's performance compared with the tested population. We expect the SMQ to be further refined as we add data from more years and additional institutions. The addition of first-year medical students not applying to neurosurgery demonstrated that they tend to perform similarly to the neurosurgery applicants. We added data from a few residents, who tended to perform better than the applicants did; although the numbers are not sufficient for analysis, we can use them to generate a hypothesis. We expect the SMQ to be better in residents than in applicants if the SMQ includes skills that improve with neurosurgical training. One such test could be "spatial memory." It proved to be a difficult test for most applicants, with the greatest variability in performance. We suspect that better familiarity with anatomy among the neurosurgery residents resulted in better performance on that test. The lack of difference between first-year students and neurosurgery applicants is consistent with the hypothesis that self-selected applicants did not actually have better sensory-motor skills than other students. However, it is also possible that students who responded to a request to volunteer to test their performance are themselves self-selected and do not represent a true control group. We attempted to design tests that are relevant to the procedural skills required of a neurosurgical resident and that also appear relevant and engaging to the applicant. Indeed, applicants liked the simulator: all participants gave it a grade of 4 or 5 (1 = worst and 5 = best) on our

anonymous survey. The tasks do not require prior experience, only brief instruction on using the simulator. As a study among anonymous volunteers, it has limitations: we tested a wide sample of neurosurgery residency applicants, representing most of the applicants to our program, but it is not a representative sample. The applicant pool may differ from year to year if the popularity of the program changes or if the criteria for selecting applicants for interview change. Self-selection to volunteer for the tests may also have occurred, as not all applicants completed the test. Self-selection to participate and to complete all the tasks may correlate with perceived surgical skill or with the degree of comfort with computers and video games. Video games may improve the motor skills of participants, especially when tested in a game-like environment on a computer.8-10 Our research helps establish a baseline sensory-motor performance curve for the neurosurgery applicant pool. Our set of sensory-motor tests can also be used by medical students for self-assessment and as a way to engage with neurosurgery and increase awareness of the sensory-motor skills required for this specialty. We speculate that an important initial use of our test may be the identification of outliers, that is, persons performing 3 SDs from the mean, especially below the mean. We propose that the SMQ measures aspects of sensory-motor aptitude that are relevant to neurosurgical trainees. To prove construct validity, we need a large study of neurosurgery resident performance on our test. We hypothesize that residents will perform better than the applicants did on the spatial memory task but not on the haptic perception task. Such validation was beyond the scope of the current study, but based on it, we are developing the next phase to be conducted in the coming years.


CONCLUSIONS

Our current work demonstrated that each component of our sensory-motor testing, as well as the combined score (SMQ), reproducibly generated a normal distribution of results in the tested population; this indicates good psychometric-type properties of our testing. First-year medical students did not perform differently from applicants. Residents tended to perform better than the applicants did, but the numbers are too small to form a conclusion. The next logical step would be to further expand our subject population and include a large sample of neurosurgery residents, to demonstrate construct validity.

ACKNOWLEDGMENTS

This work was supported in part by NIH NINDS Grant 2R44NS066557, USA.

REFERENCES

1. Panait L, Larios JM, Brenes RA, et al. Surgical skills assessment of applicants to general surgery residency. J Surg Res. 2011;170(2):189-194. http://dx.doi.org/10.1016/j.jss.2011.04.006.

2. Lee JY, Kerbl DC, McDougall EM, Mucksavage P. Medical students pursuing surgical fields have no greater innate motor dexterity than those pursuing nonsurgical fields. J Surg Educ. 2012;69(3):360-363. http://dx.doi.org/10.1016/j.jsurg.2011.11.005.

3. Goldberg AE, Neifeld JP, Wolfe LG, Goldberg SR. Correlation of manual dexterity with USMLE scores and medical student class rank. J Surg Res. 2008;147(2):212-215. http://dx.doi.org/10.1016/j.jss.2008.02.050.

4. Williams R, Roberts N, Schwind C, Dunnington G. Recognizing residents with a deficiency in operative performance as a step closer to effective remediation. Surgery. 2009;145:651-658.

5. Roitberg B, Banerjee P, Luciano C, et al. Sensory and motor skill testing in neurosurgery applicants: a pilot study using a virtual reality haptic neurosurgical simulator. Neurosurgery. 2013;73(suppl 1):116-121.

6. Höner O, Votteler A, Schmid M, Schultz F, Roth K. Psychometric properties of the motor diagnostics in the German football talent identification and development programme. J Sports Sci. 2014;20:1-15 [Epub ahead of print].

7. Russell M, Benton D, Kingsley M. Reliability and construct validity of soccer skills tests that measure passing, shooting, and dribbling. J Sports Sci. 2010;28(13):1399-1408. http://dx.doi.org/10.1080/02640414.2010.511247.

8. Ou Y, McGlone ER, Camm CF, Khan OA. Does playing video games improve laparoscopic skills? Int J Surg. 2013. http://dx.doi.org/10.1016/j.ijsu.2013.02.020 [Epub ahead of print].

9. Adams BJ, Margaron F, Kaplan BJ. Comparing video games and laparoscopic simulators in the development of laparoscopic skills in surgical residents. J Surg Educ. 2012;69(6):714-717.

10. Borecki L, Tolstych K, Pokorski M. Computer games and fine motor skills. Adv Exp Med Biol. 2013;755:343-348.
