European Journal of Dental Education ISSN 1396-5883

Dental students’ peer assessment: a prospective pilot study

J. Tricio1,2, M. Woolford1, M. Thomas1, H. Lewis-Greene1, L. Georghiou1, M. Andiappan1 and M. Escudier1

1 King’s College London Dental Institute, London, UK; 2 Faculty of Dentistry, University of los Andes, Santiago, Chile

Keywords: peer assessment; peer feedback; Direct Observation of Procedural Skills.

Correspondence: Jorge Tricio, King’s College London Dental Institute, Guy’s Hospital, Central Office, 18th Floor, Tower Wing, London SE1 9RT, UK. Tel: +44 (0)20 7188 1162; Fax: +44 (0)20 7188 1159; e-mail: [email protected]

Abstract

Introduction: Peer assessment is increasingly used in health education. The aims of this study were to evaluate the reliability, accuracy, educational impact and students’ perceptions of a structured, prospective peer assessment and peer feedback protocol for undergraduate pre-clinical and clinical dental students.

Materials and methods: Two Direct Observation of Procedural Skills (DOPS) forms were modified for use in pre-clinical and clinical peer assessment. Ten year 2 dental students working in a phantom-head skills laboratory and 16 year 5 dental students attending a comprehensive care clinic piloted the two peer DOPS forms. After training, pairs of students observed, assessed and provided immediate feedback to each other, using their respective peer DOPS forms as frameworks. At the end of the 3-month study period, students anonymously provided their perceptions of the protocol.

Accepted: 8 July 2014 doi: 10.1111/eje.12114

Results: Year 2 and year 5 students completed 57 and 104 peer DOPS forms, respectively. The generalizability coefficient was 0.62 for year 2 (six encounters) and 0.67 for year 5 (seven encounters). Both groups were able to differentiate amongst peer-assessed domains and so detect improvement in peers’ performance over time. Peer DOPS scores of both groups showed a positive correlation with their mean end-of-year examination marks (r ≥ 0.505, P ≥ 0.051), although this was not statistically significant. There was no difference (P ≥ 0.094) between the end-of-year examination marks of the participating students and the rest of their respective classes. The vast majority of both groups expressed positive perceptions of the piloted protocol.

Discussion: There are no data in the literature on the prospective use of peer assessment in the dental undergraduate setting. In the current study, both pre-clinical and clinical students demonstrated the ability to identify those domains where peers performed better, as well as those which needed improvement. Despite no observable educational impact, most students reported positive perceptions of the peer DOPS protocol.

Conclusions: The results of this pilot study support the need for and the potential benefit of a larger- and longer-term follow-up study utilising the protocol.

Introduction

The demand to develop dentists who are self-directed, life-long learners and reflective practitioners (1, 2) has stimulated the development and use of alternative assessment forms. These include peer assessment and peer feedback, which aim to develop these skills and support students’ learning (3–6). In this context, peer assessment is an arrangement that involves observation by students who have attained the same general level of training or expertise and status, to judge structured tasks or provide global impressions of the amount, level, value, worth, quality or success of their peers’ work (7–9).

Students’ formative peer assessment can successfully focus on the provision of objective and immediate feedback (10, 11) and can enhance the students’ learning process in several ways. It can cultivate high levels of student responsibility (4), encourage diplomatic criticism (12) and promote peer integration (13). It can also facilitate greater student involvement in their learning development (3) and increase their familiarity with evaluation criteria (14), whilst at the same time helping them to overcome unrealistic expectations (13). Importantly, it also encourages reflection and lifelong learning (15), as well as critical skills (5, 16), and by so doing can improve students’ overall performance (17).

However, there remain a number of limitations to peer assessment. These include ‘friendship’ or collusive marking (18), students not accepting peer assessment and peer feedback as accurate and helpful (19), and a reluctance to accept any responsibility for assessing or criticising their friends (20). Nevertheless, most students enjoy (21), see the benefits of (22) and are in favour of peer assessment (23) as a fair and valid assessment (24), provided it has an open and clear rationale (13) with pre-known guidelines and criteria (10).

The implementation of and criteria used in peer assessment of clinical performance using standardised forms (questionnaires) are well documented for medical undergraduate students (25), foundation and postgraduate medical trainees (26), revalidation of medical career grade (27), dental postgraduate trainees (18) and dental undergraduate students (22, 28, 29). However, no study has been published on the prospective use of peer assessment in dental undergraduate students, despite the potential for dental peers to contribute to each other’s learning process. The basis for this benefit is their frequent exposure to, and hence detailed knowledge of, each other’s work in a variety of contexts, which is not always available to faculty members (3). Students also have the advantage of observing each other performing the complete task or procedure under real conditions (30). They are therefore uniquely placed to formally assess each other fairly and accurately (4), with the added benefit of a stress-free environment (18).

As part of an ongoing line of research on peer assessment at King’s College London Dental Institute (KCLDI), and before implementing a larger- and longer-term follow-up investigation, this study reports the development and piloting of a structured protocol of formative, prospective peer assessment of pre-clinical and clinical dental students’ skills, used as a framework for the provision of immediate peer feedback. The aims were:
• To evaluate the reliability and educational impact of undergraduate pre-clinical and clinical dental students’ structured and prospective peer assessment.
• To investigate students’ perceptions of the suitability of the assessed domains, feasibility for future use, identification of learning needs, and acceptability and fairness of the prospective peer assessment and peer feedback protocol.

Materials and methods

Ethical approval

The study received full ethical approval from the King’s College London Biomedical Sciences, Dentistry, Medicine and Natural & Mathematical Sciences Ethical Committee (number BDM/11/12-21).

Developing the instrument

Two standard Direct Observation of Procedural Skills (DOPS) forms (31, 32) were used as templates to develop pre-clinical and clinical peer assessment tools (peer DOPS). These formed a framework for a structured protocol of prospective peer assessment and peer feedback of undergraduate dental students’ pre-clinical competence and clinical performance. Changes to the original templates included a new general layout and replacement of the traditional norm-referenced assessment scale (Below expectation, Borderline, Meets expectations and Above expectations) (33), which might make it difficult for junior students to judge the quality of their peers’ performance (8), with a criterion-referenced scale containing four written descriptions based on the frequency of clarification needed (Frequent, Some, Very Little and No Clarification, Warning and/or Assistance) (32). An ‘unable to comment’ option was also included for when a given behaviour was not observed.

The assessment domains differed between pre-clinical and clinical peer assessment. Based on blueprinting principles (34), the pre-clinical peer DOPS form contained 10 non-compounded items (35), representing the main learning outcomes of the KCLDI year 2 coursebooks, and was designed for peer assessment of any training procedure performed in the simulation skills laboratory. Similarly, the clinical peer DOPS form, representing the main learning outcomes of the year 5 coursebooks, was intended for peer assessment of whichever clinical procedure students performed on their patients. Both forms also incorporated an assessment of ‘Students’ insight into their performance’ (36) and a 6-point Likert scale for students to rate the utility of giving/receiving feedback as a technique to improve their future performance. Written instructions on how to complete the forms and a wider explanation of the grading scale were also included.

Both peer DOPS drafts were reviewed by five internal pre-clinical and clinical teachers (each of whom had at least 7 years of teaching experience) to ensure they sampled all the relevant domains (37). Following this, Bachelor of Dental Surgery (BDS) year 2 and year 5 students were asked to review the wording and content of the forms and then use them once before feeding back. This process identified two areas of student concern. The first related to the new criterion-referenced scale and the need to grade the frequency of peer ‘clarification’ whilst working. Students felt this would negatively affect peer collaboration, as they would refrain from asking questions in order to obtain a better assessment. As this was not the intention of the exercise, the criterion-referenced scale was changed to a six-option educationally referenced one containing a graphical and written anchor of the desired ‘increasing ability over time’ (Fig. 1), starting at the beginning of their respective training year, to facilitate understanding and hence use by junior students (38, 39). For example, when a student first performs a practical task, they would be peer-rated as ‘starting to develop’ the ability for that task. Subsequently, he or she would ideally progress to ‘initial capability’, followed by ‘constant acceptable’, ‘constant clear’, ‘constant good’ and finally ‘constant extremely good’ ability. Consequently, the new anchor did not require students to make any judgements about the quality of performance or frequency of clarification (Fig. 1). The second concern was the use of the word ‘assessor’ in relation to the ‘observing’ student, as students were unhappy to be ‘assessing’ their peers. In view of this, all references to ‘assessor’ were replaced by ‘observer’ (Fig. 2).

Fig. 1. Six-point educationally referenced scale used in both the pre-clinical Bachelor of Dental Surgery (BDS) year 2 and the clinical BDS year 5 peer-Direct Observation of Procedural Skills instruments, which asks the ‘observing’ student to judge their peer’s ability over time. The stages of progress/achievement run from the beginning to the endpoint of BDS 2–5 training (‘increasing ability over time’): Starting to develop; Show initial capability; Show constant acceptable ability; Show constant clear ability; Show constant good ability; Show constant extremely good ability. An ‘Unable to comment’ option is also provided.
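Purely as an illustration (not part of the study protocol), a completed peer DOPS response on this scale could be coded along the following lines when forms are digitised; the 1–6 coding follows the mapping described later in the Statistical analysis section, and the class and function names are hypothetical.

```python
# Illustrative sketch only: the six-point educationally referenced scale from Fig. 1,
# coded 1-6 as described in the Statistical analysis section. Names are hypothetical.
from enum import IntEnum
from typing import Optional


class AbilityStage(IntEnum):
    """Stages of progress from the beginning to the endpoint of BDS 2-5 training."""
    STARTING_TO_DEVELOP = 1
    SHOW_INITIAL_CAPABILITY = 2
    SHOW_CONSTANT_ACCEPTABLE_ABILITY = 3
    SHOW_CONSTANT_CLEAR_ABILITY = 4
    SHOW_CONSTANT_GOOD_ABILITY = 5
    SHOW_CONSTANT_EXTREMELY_GOOD_ABILITY = 6


def code_response(ticked_option: str) -> Optional[int]:
    """Map a ticked scale option to its numeric score; 'unable to comment' yields None
    so it can be treated as missing rather than as a score."""
    labels = {
        "Starting to develop": AbilityStage.STARTING_TO_DEVELOP,
        "Show initial capability": AbilityStage.SHOW_INITIAL_CAPABILITY,
        "Show constant acceptable ability": AbilityStage.SHOW_CONSTANT_ACCEPTABLE_ABILITY,
        "Show constant clear ability": AbilityStage.SHOW_CONSTANT_CLEAR_ABILITY,
        "Show constant good ability": AbilityStage.SHOW_CONSTANT_GOOD_ABILITY,
        "Show constant extremely good ability": AbilityStage.SHOW_CONSTANT_EXTREMELY_GOOD_ABILITY,
    }
    if ticked_option.strip().lower() == "unable to comment":
        return None
    return int(labels[ticked_option])
```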

Administration and data collection

Participants

In January 2012, 26 invited students (18 females and 8 males, aged 18–40 years, mean = 24.3, SD = 5.9), comprising two groups, consented to participate in this peer assessment and peer feedback pilot study. The first group consisted of 10 pre-clinical BDS 2 Conservative Dentistry students who were under a single clinical supervisor and working at neighbouring phantom heads. The second group comprised 16 clinical BDS 5 Primary Dental Care (PDC) students who worked on the same day of the week as clinical partners (alternating dentist/assistant roles).

Peer assessment training

At the start of the study, each group of students received a 45-min peer assessment training and familiarisation session, delivered by the same researcher (JT), relating to observation, peer assessment, peer feedback, action planning and completion of the instrument. Using written/video examples and role-playing, they learnt and practised how to give and receive confidential, brief, constructive, task-focused and immediate dialogic feedback (9), using their peer DOPS domains as a framework (40). BDS 2 students (organised in fixed pairs) working at neighbouring phantom heads and BDS 5 clinical partners (randomly allocated each session) acted as ‘observer’ and ‘trainee’, respectively, during the first half of the day and then switched roles during the second half of the day.

Peer assessment piloting

On six occasions, BDS 2 students performed their own procedures as normal whilst ‘observing’ their peers’ pre-clinical work every 15 min to avoid interfering with their own work. On seven occasions, BDS 5 students performed their usual clinical activities in pairs so that the assistant student ‘observed’ the dentist student whilst they treated the patient together. The observed procedure was then used to score each of the respective pre-clinical or clinical peer DOPS domains, selecting and ticking one of the six options of the educationally referenced scale (Figs 1 and 2) for every domain. If a given behaviour was not observed, students ticked the ‘unable to comment’ option. These scores provided a grounded framework for informed feedback. Subsequently, the pair agreed an appropriate action plan to address any developmental needs (41). Finally, after signing the forms, students self-reflected on the feedback and noted their thoughts in a private reflection diary.

Students’ perceptions

To investigate students’ perceptions of the prospective peer assessment and peer feedback protocol, during the final session of peer assessment both groups anonymously answered the following four questions using a 5-point Likert scale (Strongly agree, Agree, Neutral, Disagree and Strongly disagree). To what extent do you agree that the peer assessment and peer feedback protocol used in this study:
(i) Assessed you in areas that correspond to your activity in the pre-clinic/clinic?
(ii) Could be introduced in the future to all students at KCLDI as part of their pre-clinical/clinical education?
(iii) Helped you to identify learning needs and to improve your performance?
(iv) Was acceptable and fair?
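For illustration only, the anonymised 5-point Likert responses to these four questions could be summarised with simple descriptive statistics such as the percentage of students selecting each option; the column names and example data below are hypothetical and do not come from the study.

```python
# Illustrative sketch: summarising the four 5-point Likert perception questions.
# Column names and data are hypothetical placeholders.
import pandas as pd

LIKERT_ORDER = ["Strongly agree", "Agree", "Neutral", "Disagree", "Strongly disagree"]

# Hypothetical anonymised responses: one row per student, one column per question (i)-(iv).
responses = pd.DataFrame({
    "q1_relevant_domains": ["Agree", "Strongly agree", "Agree"],
    "q2_introduce_at_kcldi": ["Agree", "Agree", "Neutral"],
    "q3_identified_learning_needs": ["Strongly agree", "Agree", "Agree"],
    "q4_acceptable_and_fair": ["Agree", "Agree", "Strongly agree"],
})

# Percentage of students choosing each Likert option, per question.
summary = (
    responses.apply(lambda col: col.value_counts(normalize=True)
                    .reindex(LIKERT_ORDER, fill_value=0))
    .mul(100)
    .round(1)
)
print(summary)
```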

Fig. 2. Modified clinical peer-Direct Observation of Procedural Skills form used for Bachelor of Dental Surgery (BDS) year 5 peer assessment. Students had to complete all XVII items of the instrument at each encounter.

Statistical analysis

All peer DOPS form data were manually digitised by the same researcher (JT) into a spreadsheet. To analyse students’ peer assessment scores, each of the six levels of ‘increasing ability over time’ of the educationally referenced scale was assigned a numerical value from 1 to 6. Thus, the ‘starting to develop’ initial stage of ability was given a score of 1, ‘show initial capability’ a score of 2, and so on up to the highest level, ‘show constant extremely good ability’, which was given a score of 6. Scores were then checked for normality assumptions using a histogram and box plot before carrying out any parametric analysis.

The reliability of the scores from each peer DOPS tool was assessed independently using the generalizability coefficient. A crossed three-facet [10 students (s) × 6 occasions (o) × 11 items (i)] random-effects design was used for BDS 2 (fixed pairs of students throughout the study) and a nested three-facet (16 students × 7 occasions × 13 items) random-effects design for BDS 5 (random pairs of students).

Descriptive statistics were used to summarise peer assessment scores, peer-observation time, peer feedback time and the utility of giving/receiving feedback. The same method was used to describe students’ perceptions of the prospective peer assessment and peer feedback protocol. When comparing the various measures observed for the BDS 2 and BDS 5 groups, an independent-samples t-test was used.

To compare the scores students gave to each other with their high-stakes marks, a Pearson correlation analysis between BDS 2 and BDS 5 students’ peer DOPS scores and their respective official end-of-year mean examination marks was performed. Further, to investigate a possible effect of the peer DOPS exercise on participating students’ academic performance, an independent-samples t-test was used to compare the end-of-year mean examination marks of the 26 BDS 2 and BDS 5 students who used the peer assessment protocol with those of the rest of their respective classes.

One-way analysis of variance (ANOVA) was carried out to compare the mean scores of the 11 items (domains) from the BDS 2 peer assessment and the 13 items from the BDS 5 peer assessment, separately. Where the ANOVA showed significant results, a post hoc analysis was carried out using Tukey’s test. The total peer DOPS scores observed at the various time points were compared using repeated-measures ANOVA for the BDS 2 and BDS 5 groups, separately. All analyses were carried out using SPSS version 19 (SPSS Inc., IBM, Chicago, IL, USA), except for the generalizability coefficient, which was calculated using the software EduG 6.1e (Neuchâtel, Switzerland).
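The analyses above were run in SPSS 19 and EduG 6.1e. As a rough, non-authoritative sketch of the same kinds of calculations in Python, the block below shows a relative generalizability coefficient computed from variance-component estimates for a crossed students × occasions × items design, an independent-samples t-test, a Pearson correlation and a one-way ANOVA with Tukey’s post hoc test. The input files, column names and variance-component values are hypothetical, and this simplified formula is not expected to reproduce the exact coefficients reported in the Results.

```python
# Illustrative re-implementation sketch; the study itself used SPSS 19 and EduG 6.1e.
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd


def relative_g_coefficient(var_s, var_so, var_si, var_residual, n_occasions, n_items):
    """Relative G coefficient for a fully crossed students x occasions x items design:
    universe-score (students) variance divided by itself plus the relative error terms,
    each averaged over the numbers of occasions and items in the decision study."""
    error = (var_so / n_occasions
             + var_si / n_items
             + var_residual / (n_occasions * n_items))
    return var_s / (var_s + error)


# Hypothetical variance components (NOT the study's estimates).
print(relative_g_coefficient(var_s=0.20, var_so=0.06, var_si=0.16,
                             var_residual=0.37, n_occasions=6, n_items=11))

# 'peer_dops_scores.csv' is a hypothetical long-format file:
# one row per student x occasion x item, with columns group, student, occasion, item, score.
scores = pd.read_csv("peer_dops_scores.csv")

# Independent-samples t-test comparing scores between the BDS 2 and BDS 5 groups.
bds2 = scores.loc[scores["group"] == "BDS2", "score"].dropna()
bds5 = scores.loc[scores["group"] == "BDS5", "score"].dropna()
print(stats.ttest_ind(bds2, bds5))

# Pearson correlation between each student's mean peer DOPS score and their
# official end-of-year mean examination mark (hypothetical file and columns).
exams = pd.read_csv("exam_marks.csv")  # columns: student, exam_mark
mean_scores = (scores.groupby("student")["score"].mean()
               .rename("peer_dops_mean").reset_index())
merged = exams.merge(mean_scores, on="student")
print(stats.pearsonr(merged["peer_dops_mean"], merged["exam_mark"]))

# One-way ANOVA across items (domains), followed by Tukey's post hoc test.
clean = scores.dropna(subset=["score"])
by_item = [grp["score"] for _, grp in clean.groupby("item")]
print(stats.f_oneway(*by_item))
print(pairwise_tukeyhsd(endog=clean["score"], groups=clean["item"]))
```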

Results

In line with current best practice (8), one of the researchers (JT) carefully organised, delivered and monitored the whole peer assessment piloting process. Thus, starting in February 2012, over six fortnightly occasions for BDS 2 and seven for BDS 5, students observed, assessed and provided feedback to one another, completing 57 and 104 peer DOPS forms, respectively (BDS 2 mean = 5.7 per student; BDS 5 mean = 6.5 per student). BDS 2 students were peer-assessed across seven different pre-clinical procedures, ranging from composite, amalgam and temporary restorations to root canal treatments and direct veneers. BDS 5 students were assessed on 19 clinical procedures, including oral health instruction; impression, bite and face-bow registration; composite, amalgam and temporary restorations; crown, bridge and veneer preparation and cementation; root canal treatments; wax try-in; and root surface debridement.

Considering all single peer assessment scores from both groups on all occasions, they ranged from 2 (show initial capability) to 6 (show constant extremely good ability) (mean = 5.0, SD = 0.7, mode = 5) and were normally distributed. The generalizability coefficient for BDS 2 was 0.62 for six encounters, whereas for BDS 5 it was 0.67 for seven encounters. The variance analysis for the BDS 2 peer DOPS tool showed a maximum of 36.9% of the variance attributed to undifferentiated error, followed by the students component (20.7%), the students-by-items interaction (16.5%), occasions (8.7%), the occasions-by-items interaction (6.9%), the students-by-occasions interaction (6.2%) and items (4.1%). Similarly, for the BDS 5 peer DOPS tool, the largest percentage of variance was the undifferentiated error, which accounted for 60.3%, followed by occasions-by-items (15.8%), students-by-occasions (6.5%), students-by-items (5.7%), occasions (5.0%), students (3.6%) and items (3.1%).

The overall mean peer assessment scores for the BDS 2 and BDS 5 groups, along with their respective peer-observation times, peer feedback times and the utility of giving/receiving feedback, are presented and compared in Table 1. The mean peer assessment scores for each of the 11 BDS 2 peer DOPS domains (Table 2) showed significant differences between these 11 items (F = 3.94, P < 0.0001), with the post hoc analysis using Tukey’s test revealing that the better-performed ‘Observing aseptic technique…’ (item 6) was significantly (P = 0.04) different from all other items. Similarly, the BDS 5 mean peer assessment scores for the 13 peer DOPS items (Table 3) also differed significantly (F = 6.55, P < 0.0001). Peer scores for ‘Consideration of patients/professionalism’ (item 11) were significantly higher than all other items (P = 0.02). The prospective peer DOPS marks for every fortnightly assessment occasion for pre-clinical (BDS 2) and clinical (BDS 5) students are shown in Fig. 3. The repeated-measures ANOVA

TABLE 1. Mean and standard deviation (SD) of the peer-observation and peer feedback times (minutes), the overall peer assessment score (scale 1–6) and students’ perception of the utility of giving and receiving feedback to improve future performance (scale 1–6), for each of the studied groups. The statistical significance of the difference (paired t-test) between pre-clinical Bachelor of Dental Surgery (BDS) year 2 and clinical BDS year 5 students is also presented. Values are mean (SD).

Variables                        BDS 2          BDS 5          P value of difference
Observation time (min)           153.2 (28.2)   100.2 (21.9)
Feedback time (min)              6.7 (2.9)      4.8 (1.9)
Overall peer assessment score    4.8 (0.8)      5.6 (0.8)
Utility of giving feedback       4.8 (1.0)      5.0 (0.6)
Utility of receiving feedback    5.1 (0.8)      5.3 (0.5)
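As a purely illustrative sketch of how the group comparison in Table 1 can be run from summary statistics alone, the snippet below applies SciPy’s t-test-from-statistics helper to the reported observation-time means and SDs. The group sizes of 10 (BDS 2) and 16 (BDS 5) students are an assumption taken from the Participants section, an unpaired Welch test is used rather than the paired test mentioned in the caption, and the resulting P value is not taken from the paper.

```python
# Illustrative only: Welch's independent-samples t-test from the summary statistics
# reported in Table 1 (observation time, minutes). Group sizes of 10 (BDS 2) and
# 16 (BDS 5) students are an assumption; the paper's own P values are not shown here.
from scipy import stats

result = stats.ttest_ind_from_stats(
    mean1=153.2, std1=28.2, nobs1=10,   # BDS 2 observation time, mean (SD)
    mean2=100.2, std2=21.9, nobs2=16,   # BDS 5 observation time, mean (SD)
    equal_var=False,                    # Welch's correction for unequal variances
)
print(result)
```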
