Authors: James Engel, BS; Ajit B. Pai, MD; William C. Walker, MD
Education
Affiliations: From the Virginia Commonwealth School of Medicine, Richmond, Virginia (JE); Hunter Holmes McGuire Veterans Administration Medical Center, Richmond, Virginia (ABP, WCW); and Virginia Commonwealth University Medical Center, Richmond, Virginia (WCW).
Correspondence: All correspondence and requests for reprints should be addressed to: James Engel, BS, Virginia Commonwealth School of Medicine, 1825 E Marshall St, Apt 105, Richmond, VA 23223.
ORIGINAL RESEARCH ARTICLE
Can American Board of Physical Medicine and Rehabilitation Part 2 Board Examination Scores Be Predicted From Rotation Evaluations or Mock Oral Examinations?
Disclosures: Financial disclosure statements have been obtained, and no conflicts of interest have been reported by the authors or by any individuals in control of the content of this article.
ABSTRACT
0894-9115/14/9312-1051 American Journal of Physical Medicine & Rehabilitation Copyright © 2014 by Lippincott Williams & Wilkins
Objective: The aim of this study was to determine whether Physical Medicine and Rehabilitation residents' performance on core competency evaluations or on practice mock oral examinations correlates with performance on the future American Board of Physical Medicine and Rehabilitation Part 2 board-certifying examination.
DOI: 10.1097/PHM.0000000000000126
Engel J, Pai AB, Walker WC: Can American Board of Physical Medicine and Rehabilitation Part 2 board examination scores be predicted from rotation evaluations or mock oral examinations? Am J Phys Med Rehabil 2014;93:1051–1056.
Design: This is a retrospective cohort study of residents who took Part 2 of the American Board of Physical Medicine and Rehabilitation certification examination between 1995 and 2011 (N = 31, 38, or 67, depending on the analysis).
Results: The postgraduate year 4 mock oral examination average achieved significance in correlation analysis (Spearman ρ = 0.391; P = 0.030). Patient care and a composite average of the other core competency evaluations were also significantly correlated with performance on Part 2 of the board-certifying examination (Spearman ρ = 0.329; P = 0.044). The only uniquely predictive independent variable was the postgraduate year 4 mock oral examination (χ² = 7.09; P = 0.029). More specifically, when controlling for rotation performances, residents with higher mock oral examination scores were 9.6 times (Exp(B) = 9.6; 95% confidence interval, 1.2–80; P = 0.036) more likely than those one grade lower to achieve the upper half on the oral board examination vs. either of the lower two quartiles.
Conclusions: The postgraduate year 4 mock oral examinations and the core competency evaluations composite are each predictive of performance on the American Board of Physical Medicine and Rehabilitation Part 2 examination. Further research into this area, with a larger sample size and with multiple institutions, would be helpful to allow for a better measurement of these evaluation tools' effectiveness.
Key Words: Mock Oral, PM&R, Board Examination Part 2, Core Competency
First-time pass rates on board-certifying examinations are one of the standards by which a residency program is evaluated. Because of this, an important goal of medical residency programs is to maximize the board examination performance and pass rates of their residents. To accomplish this, it is important to identify salient predictors of board examination scores so that preparation strategies can be tailored to individual needs. Previous research has shown that performance on the American Board of Physical Medicine and Rehabilitation (ABPM&R) Part 1 board examination (written board examination) is significantly correlated with previous standardized written test scores, including both United States Medical Licensing Examination and Physical Medicine and Rehabilitation (PM&R) Residency Self-Assessment Examination scores.1 However, in that study, performance on the ABPM&R Part 2 board examination (oral board examination) was not significantly related to these standardized written examinations. Other means of evaluating ability and preparedness for the board examination include mock oral examinations and performance on the Accreditation Council for Graduate Medical Education core competencies. Although both have been criticized as highly subjective, high interrater reliability (0.76–0.98) has been demonstrated for mock oral examinations and core competency evaluations among anesthesiology and internal medicine residents, respectively.2,3 Construct validity has also been demonstrated for the overall competency evaluation; a study in a PM&R residency training program showed excellent internal consistency (Cronbach α = 0.98) for 5 of 6 general competency areas.4 Furthermore, the predictive validity of core competency grades has been demonstrated in non-PM&R residency programs.
Among internal medicine residents, intraclass correlation coefficients were greater than 0.80 with respect to American Board of Internal Medicine (ABIM) certifying examination results.3 Mock oral examinations and core competency evaluations have been shown to correlate with performance on board-certifying examinations in other specialties as well.5–7 Because ABPM&R oral board examination pass rates are critical to both residents and PM&R residency programs, it would be advantageous to understand what, if any, residency evaluation strategies have predictive validity for the ABPM&R oral board examination. If there is a correlation, mock oral examination results and/or core competency evaluations can be valuable determinants of
a resident's preparedness to pass the board examination. They could identify weaknesses that need improvement, both serving as motivation for further preparation and guiding the formulation of a specific performance improvement plan. To the authors' knowledge, no previous studies have analyzed the correlation between ABPM&R oral board examinations and PM&R residency rotation evaluations. In the Thanjan et al. study,1 mock oral examination bivariate grade (pass/fail) did not predict bivariate grade on the ABPM&R oral board examination. In an effort to determine predictors of performance on the ABPM&R oral board examination, the objective of this study was to determine whether PM&R residency performance on either core competency evaluations or practice mock oral examinations is related to performance on the future ABPM&R oral board examination.
METHODS

After approval by the Virginia Commonwealth University institutional review board, the sample and data were drawn from graduates of a single PM&R residency program between the years of 1995 and 2011. The current mandated core competency domains for evaluating PM&R residents were established in the early 2000s. In the sample, performance on these competencies was available for residents starting with the graduating class of 2003 and for all graduates of 2005–2011. First, correlations of faculty grades given on PM&R residency core competency evaluations and on mock oral examinations with the ABPM&R Part 2 board-certifying examination were assessed. The evaluation form used was specific to the authors' institution. Rotation grades for each of the six core competencies (patient care, medical knowledge, practice-based learning, interpersonal skills/communication, professionalism, systems-based practice) were averaged across each postgraduate year (PGY) level as well as over the entire residency (PGY 2–4). A Likert scale was used for both competency grades (1–9) and mock oral examinations (1–5, with 1 being definite fail and 5 being definite pass) (Fig. 1). For the ABPM&R Part 2 board-certifying examination, quartile ranking was used because full percentile scores were available only beginning in 2009. Because the data were ordered in rank and had skewed distributions, nonparametric methods were used for the correlation analyses. Second, for those independent variables that were significantly correlated with the outcome, the authors undertook multivariate regression modeling to determine their relative predictive value.
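The nonparametric correlation step described above (Spearman rank correlation on tied, skewed ordinal data) can be sketched in a few lines. The following is a minimal, dependency-free illustration; the score vectors are hypothetical, not the study's data:

```python
def ranks(xs):
    """Average 1-based ranks, assigning tied values their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # Extend j over the run of tied values starting at position i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman's rho: the Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical 1-5 mock oral grades vs. 1-4 board examination quartiles
mock_oral = [3, 4, 5, 4, 2, 5, 3, 4, 5, 3]
quartile = [2, 3, 4, 3, 1, 4, 2, 2, 4, 3]
print(round(spearman_rho(mock_oral, quartile), 3))  # → 0.882
```

In practice a statistics package (e.g., `scipy.stats.spearmanr`) would also supply the P value; the point here is only the rank-then-correlate logic.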
Am J Phys Med Rehabil • Vol. 93, No. 12, December 2014
RESULTS

Core competency evaluations were available for 38 residents, and PGY3 and PGY4 mock oral examinations were available for 67 and 57 residents, respectively. A summary of indicators of overall residency performance in the department is shown by PGY level in Table 1. These include an annualized average of the average competency grade for each clinical rotation, mock oral examination scores, and national self-assessment examination scores. Median and range values are presented because some of the variables were not normally distributed. Correlation coefficients relating core competency evaluations and mock oral examination scores to the ABPM&R Part 2 board-certifying examination are displayed in Table 2. The total (PGY 2–4) core competency area of patient care and the PGY4 mock oral examinations were each correlated (P < 0.05) with the ABPM&R Part 2 board examination (r = 0.35 and r = 0.40, respectively). The PGY3 mock oral examinations showed no correlation (P = 0.672). When the six competency evaluation grades were averaged for the PGY4 level, there were no significant correlations with Part 2 (all P > 0.05). The remaining total core competencies
FIGURE 1 The form used to evaluate the residents during their mock oral examination.
TABLE 1 Overall residency performance indicators

                               N    Median   Range
PGY2 composite competency      40   7.3      2.56
PGY3 composite competency      41   7.5      3.2
PGY4 composite competency      36   7.7      3.3
PGY3 mock oral                 72   4.0      3.0
PGY4 mock oral                 60   4.0      3.0
PGY2 SAE                       82   63.0     98.0
PGY3 SAE                       82   60.0     95.0
PGY4 SAE                       83   57.0     98.0

SAE, self-assessment examination.
taken individually also each failed to correlate. However, because there was a trend (P < 0.15) toward correlation for four of five total competency grades, a composite total average of all non-patient-care competencies was also assessed for correlation. This composite had "moderate" correlation with Part 2 of the ABPM&R board examination (r = 0.33). The sample available for regression analysis comprised the 31 residents with ABPM&R Part 2 board examination scores who had complete data on the independent variables (rotation grades and PGY4 mock oral examination grades). The most common reason for exclusion was that a resident did not take the PGY4 mock oral examination because of nonavailability on examination day. In this reduced subset, only the PGY4 mock oral examination average achieved significance in correlation analysis (r = 0.39). Type II error is suspected for the other independent variables given the small sample size and the fact that the other two variables trended toward significance (P < 0.15). Therefore, multivariate model analysis using the ordinal outcome was not indicated, and instead, multivariate nominal regression analysis was attempted. To do so, it was necessary to collapse the first and second quartiles of the ABPM&R Part 2 board examination because of the small cell size in the first quartile (n = 2). The multivariate regression model using the patient care average, the composite average of other rotation evaluations, and the PGY4 mock oral examination was significant (χ² = 19.2, P = 0.004, Nagelkerke pseudo R² = 0.522). The only independent variable that was uniquely predictive (over and above the other independent variables) was the PGY4 mock oral examination (χ² = 7.09, P = 0.029).
When controlling for rotation performances, residents with higher mock oral examination scores were 9.6 times (Exp(B) = 9.6; 95% confidence interval, 1.2–80; P = 0.036) more likely than those one grade lower to achieve the upper half on the ABPM&R Part 2 board examination vs. the lower two quartiles. Lastly, because resident performance spanned all quartiles and included two failures, binary logistic regression was performed on the same sample
(n = 31) to assess for predictors of failure on the ABPM&R part 2 board-certifying examination. Individual competency area total averages as well as PGY3 and PGY4 mock oral examinations were entered, and the model failed to reach significance (P = 0.367). Thus, the two cases of failure could not be differentiated from the 29 cases of passing scores using these variables.
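The odds-ratio reading reported above (Exp(B) = 9.6 per one-grade increase) follows directly from exponentiating a logistic regression coefficient. A quick arithmetic sketch using only the reported value, as an illustration rather than a re-analysis:

```python
import math

# Exp(B) = 9.6 as reported; the underlying logit coefficient is its log.
exp_b = 9.6
B = math.log(exp_b)

# Each one-grade increase in the PGY4 mock oral score multiplies the odds
# of finishing in the upper half (vs. the lower two quartiles) by exp(B).
print(round(math.exp(B), 1))      # one grade higher: 9.6x the odds

# On the multiplicative odds scale, two grades higher implies exp(2B),
# i.e., 9.6 squared (assuming the linear-in-logit model holds that far).
print(round(math.exp(2 * B), 1))
```

The wide confidence interval reported (1.2–80) reflects the same exponentiation applied to the interval bounds of B, and the small sample behind it.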
TABLE 2 Correlations between independent variables (predictors) and dependent variable (outcome; ABPM&R oral board examination quartile ranking)

                          Sample Size   Spearman ρ   Sig
Patient care              38            0.349        0.032*
Medical knowledge         38            0.300        0.067
Practice-based learning   38            0.279        0.09
Interpersonal skills      38            0.245        0.139
Professionalism           38            0.166        0.320
Systems-based practice    38            0.286        0.082
PGY3 mock orals           67            0.053        0.672
PGY4 mock orals           57            0.399        0.002*

Sig, significance. *Significant (P < 0.05).

DISCUSSION

Successfully passing the board-certifying examination is a common goal for all residents in training and all programs training them. This is the first reported study to show the ability of mock oral examination scores or core competency evaluations to predict a PM&R resident's success on the ABPM&R Part 2 board-certifying examination. Findings revealed that PGY4 mock oral examination scores, patient care core competency scores, and a composite average of the other core competencies excluding patient care were all predictors of ABPM&R Part 2 board-certifying examination scores. The PGY4 mock oral examination provides PM&R residents with the opportunity to practice case vignettes with an expert, offering a training performance improvement tool in addition to an evaluation role. This is of potential high value
because there is a difference in the aspects of clinical competence required for the oral vs. written ABPM&R board-certifying examinations.8 The ABPM&R Part 2 examination tests patient care, including data acquisition, problem solving, and patient management, as well as interpersonal/systems management skills, which include systems-based practice and interpersonal and communication skills. The PGY4 mock oral examination also tests these areas but requires assimilation of all skills and is assessed in a single compressed time setting rather than through observation over a 1-mo period or longer. In this study, further regression analysis confirmed the importance of the PGY4 mock oral board examination as an indicator of performance on the ABPM&R Part 2 board-certifying examination. These findings are in agreement with a number of studies in non-PM&R residency programs showing that mock oral examinations are viewed by residents as helpful evaluation tools and can be accurate evaluators of future success. A survey of radiology residents showed that residents believed that a mock oral board examination helped to determine areas of deficiency in their knowledge base.6 A study using surgical residents showed a positive predictive value of 96.9% in relation to the American Board of Surgery Certifying Examination (ABSCE), with a sensitivity of 83.8%.9 Another study using surgery residents saw a 12% increase in pass rates on the ABSCE after implementing mock oral examinations.10 A third study found that scores on mock oral examinations significantly correlated to first-time pass rates for surgical residents on the ABSCE.5 Therefore, the mock oral examination seems to be an indicator of whether an individual is ready to sit for the ABPM&R Part 2 board-certifying examination and may be able to help determine in which areas they are weakest.
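For reference, the predictive-value and sensitivity figures cited above reduce to simple confusion-matrix arithmetic. The counts below are hypothetical, chosen only to illustrate the definitions:

```python
# Hypothetical counts comparing a mock-oral "pass" prediction with the
# actual board examination outcome (illustrative, not from any cited study).
tp, fp, fn, tn = 90, 10, 15, 25  # true pos, false pos, false neg, true neg

ppv = tp / (tp + fp)          # of predicted passes, the fraction who passed
sensitivity = tp / (tp + fn)  # of actual passes, the fraction predicted

print(f"PPV = {ppv:.1%}, sensitivity = {sensitivity:.1%}")
# prints: PPV = 90.0%, sensitivity = 85.7%
```

A high PPV means a mock-oral pass is a reassuring signal, while sensitivity measures how many eventual passers the mock oral would have flagged as ready.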
This is especially important to a residency program's reputation because the board certification rate of a residency program is viewed as an indicator of the caliber of the program. With the increase in pass rates that mock oral examinations have been shown to provide, as well as their ability to indicate when an individual is struggling, these examinations seem to be a valuable training and evaluation tool for any residency program. Similarly, findings from this study on the association of core competency evaluations with the ABPM&R Part 2 board-certifying examination corroborate a number of studies in other medical specialties indicating that these evaluations can predict future success on standardized testing. One study on internal medicine residents showed that
their core competency evaluations were able to accurately detect general differences among residents in clinical competence when compared with the residents' scores on the American Board of Internal Medicine certification examination. However, the evaluation form failed to differentiate accurately among the nine evaluated dimensions of clinical care.11 Another study on surgery residents found that core competency evaluations correlated with performance on Part 1 of the American Board of Orthopedic Surgery Examination and also provided supplemental information regarding professionalism, communication, and patient care skills.12 A third study on radiology residents found a positive correlation of second-, third-, and fourth-year rotation evaluations and overall scores with performance on the American Board of Radiology examination.13 The results of these studies and those of the current study suggest that core competency evaluations can serve as indicators of future performance on the ABPM&R oral board examination. As an example specific to this study's findings of how this type of information can be used, residents with low patient care evaluations or low overall composite averages can be identified, and action can be taken to improve their training and understanding in areas in which they are underperforming, with the goal that early intervention will increase their chances of performing well. The primary limitation of this study was the inclusion of only one training program, so the results may not generalize to other PM&R training programs. The sample size was limited, increasing the odds of type II error, or false-negative associations. For future analysis, studies should be done with multiple institutions and a larger pool of residents. A second limitation is that the mock oral examination scores from the institution studied are not as standardized as they could be.
They are required in May of the third and fourth years of residency, and the grading form is standard. However, the examiner can pose from one to three cases on the basis of PM&R core topics, and the grading is subjective. In addition, because the mock oral examination forms are specific to the institution, the results may not generalize to other programs. It is also important to note that this study does not demonstrate that mock oral examinations and core competency evaluations are the sole determinants of the ABPM&R Part 2 board-certifying examination pass rate. The results of this study found that the PGY4 mock oral examinations explained only 16% of the variance (R²) in oral board quartile scores. One study using surgery residents showed no significant
differences in the pass rates between those who prepared with formal mock oral examinations, those with informal mock oral examinations, those with a commercial course, those with combinations of the three, and those with no specialized preparation.14 A second study showed that no subjective evaluations of performance predicted success, including faculty evaluations and mock oral scores.15 These findings stress the importance of implementing these testing and evaluation methods correctly and in conjunction with other aspects of training. It would be prudent for future research to evaluate a span of residency programs and a variety of evaluation strategies to determine the best method to improve ABPM&R Part 2 board-certifying examination pass rates and to predict which residents may be at risk for failing.
CONCLUSIONS

These results show that PGY4 mock oral examinations and the core competency evaluations in patient care are each predictive of performance on the ABPM&R Part 2 board-certifying examination. Although individually the core competencies other than patient care did not correlate, when taken as a composite average, they also correlate with the ABPM&R Part 2 board-certifying examination. This information can be useful in evaluating a resident's ability to perform on the board examination as well as in noting where there may be gaps in the resident's knowledge or skill set. Further research into this area, with a larger sample size and with multiple institutions, would be helpful to allow for better measurement of these evaluation tools' effectiveness.

REFERENCES

1. Thanjan L, Pai AB, Walker WC: Correlation of resident USMLE score, PM&R Self-Assessment Examination, and mock oral performance with ABPM&R board examination scores. Am J Phys Med Rehab
2. Schubert A, Tetzlaff JE, Tan M, et al: Consistency, inter-rater reliability, and validity of 441 consecutive mock oral examinations in anesthesiology: Implications for use as a tool for assessment of residents. Anesthesiology 1999;91:288–98
3. Durning SJ, Cation LJ, Jackson JL: The reliability and validity of the American Board of Internal Medicine
Monthly Evaluation Form. Acad Med 2003;78:1175–82
4. Musick DW, Bockenek WL, Massagli TL, et al: Reliability of the physical medicine and rehabilitation resident observation and competency assessment tool: A multi-institution study. Am J Phys Med Rehabil 2010;89:235–44
5. Maker VK, Zahedi MM, Villines D, et al: Can we predict which residents are going to pass/fail the oral boards? J Surg Educ 2012;69:705–13
6. Canon CL, Mulligan S, Koehler RE: Mock radiology oral examination. Acad Radiol 2005;12:368–72
7. Thompson WG, Lipkin M, Gilbert DA, et al: Evaluating evaluation: Assessment of the American Board of Internal Medicine Resident Evaluation Form. J Gen Intern Med 1990;5:214–7
8. Turk MA: 2005 Maintenance of Certification Standard-Setting Study. ABPM&R Diplomate News 2005;12:1
9. Falcone JL, Gagne DJ, Lee KK, et al: Validity and interrater reliability of a regional mock oral board examination. J Surg Educ 2013;70:402–7
10. Aboulian A, Schwartz S, Kaji AH, et al: The public mock oral: A useful tool for examinees and the audience in preparation for the American Board of Surgery Certifying Examination. J Surg Educ 2010;67:33–6
11. Haber RJ, Avins AL: Do ratings on the American Board of Internal Medicine Resident Evaluation Form detect differences in clinical competence? J Gen Intern Med 1994;9:140–5
12. Crawford CH, Nyland J, Roberts CS, et al: Relationship among United States Medical Licensing Step I, orthopedic in-training, subjective clinical performance evaluations, and American Board of Orthopedic Surgery examination scores: A 12-year review of an orthopedic surgery residency program. J Surg Educ 2010;67:71–8
13. Adusumilli S, Cohan RH, Korobkin M, et al: Correlation between radiology resident rotation performance and examination scores. Acad Radiol 2000;7:920–6
14. Sako EY, Petrusa ER, Paukert JL: Factors influencing outcome of the American Board of Surgery certifying examination: An observational study. J Surg Res 2002;105:75–80
15.
Shellito JL, Osland JS, Helmer SD, et al: American Board of Surgery examinations: Can we identify surgery residency applicants and residents who will pass the examinations on the first attempt? Am J Surg 2010;199:216–22