JEADV
DOI: 10.1111/jdv.13159
SHORT REPORT
Evaluation of the educational climate for specialty trainees in dermatology

J.M.R. Goulding,1,* V. Passi2

1 Heart of England NHS Foundation Trust, Birmingham, UK
2 Warwick University Medical School, Coventry, UK

*Correspondence: J. Goulding. E-mail: [email protected]
Abstract

Background Dermatology specialty trainees (STs) in the United Kingdom (UK) are few in number and will join a thinly spread national consultant body. It is of paramount importance to deliver training programmes of the highest quality for these doctors, central to which is the establishment and maintenance of an educational climate conducive to learning.

Objective To conduct a pilot study to evaluate the educational climate for dermatology STs in one UK deanery (West Midlands).

Methods Secondary analysis of published data was performed, from the UK's General Medical Council (GMC) national training survey and the Job Evaluation Survey Tool (JEST) administered by the West Midlands deanery. A modified online version of the Postgraduate Hospital Educational Environment Measure (PHEEM) was circulated among dermatology STs.

Results The GMC's survey data show that UK dermatology STs rated their training highly in comparison with undifferentiated UK postgraduate trainees. West Midlands dermatology STs (n = 22) scored very similarly to UK dermatology STs. The JEST gave broadly encouraging results, with 21/22 (95%) happy to recommend their posts to colleagues. The modified PHEEM yielded a global mean score of 96.5/152, attracting the descriptor 'more positive than negative but room for improvement'.

Conclusion Despite inherent methodological limitations, the GMC, JEST and modified PHEEM surveys have revealed useful comparative triangulated data, which allow the conclusion that West Midlands dermatology STs seem to be training in a favourable educational climate. This represents an important facet of the quality assurance process for medical education, and allows insight into areas which may require improvement.

Received: 16 December 2014; Accepted: 25 March 2015
Conflicts of interest Nil to declare.
Funding Nil to declare.
Introduction The overarching aim of any form of medical education has to be that the process ultimately yields high-quality patient care. In the United Kingdom (UK), dermatology specialty trainees (STs) are few in number, and set to join a thinly spread national consultant body. Central to the business of producing fully fledged consultants is the establishment and maintenance of an educational climate conducive to learning. It has been stated that ‘. . .establishing. . .this climate is almost certainly the most important single task of the medical teacher’.1 A series of studies has suggested that educational climate may be a powerful determinant of learners’ attitudes and behaviours, and their future success and satisfaction.1–3
JEADV 2016, 30, 951–955
Definitions and synonyms abound to describe what constitutes the educational climate, which refers to the often intangible qualities of a training environment.4 Over the years, several attempts have been made to design and administer instruments to measure the climate of educational environments: a recent systematic review found that 31 separate indices have been developed for use with various professional groups in different settings around the world.5 Most commonly such tools are employed in comparative evaluation exercises, either within or between institutions. This can be valuable for internal formative development, as well as benchmarking for external scrutiny. There is very limited relevant dermatology-specific literature in this field. Generic surveys of favoured elements of dermatology
residents’ existing programmes have been conducted in the United States,6–8 Canada9 and France.10 Satisfaction with training in individual components of the curriculum, such as surgical11 and cosmetic12 dermatology, has also been assessed. A Dutch study reviewed dermatology residents’ opinions on what would constitute the ideal theoretical educational programme, and, interestingly, statements pertaining to climate featured in four of the top five work-related issues.13 Educational climate is also referenced obliquely in a questionnaire survey of British dermatology STs’ views on the most important and least desirable attributes of effective trainers.14 To the best of our knowledge there have been no reported studies evaluating the educational climate for dermatology STs.
Materials and methods

The aim of this pilot study was to evaluate the educational climate for dermatology STs in one UK deanery (West Midlands).

Secondary analysis of existing data

The UK’s General Medical Council (GMC) has conducted and published an annual national survey of all postgraduate medical trainees since 2005, to enhance training at the level of the local education provider. In addition, the West Midlands deanery circulates its own annual trainee survey, the Job Evaluation Survey Tool (JEST). We reviewed data from the 2012 GMC and JEST surveys.

Table 1  2012 GMC survey data: UK dermatology STs vs. all UK trainees

Indicator                          UK dermatology STs   N     All UK trainees   N
Access to educational resources    73.55                189   67.47             50 230
Adequate experience                84.35                191   80.26             51 090
Clinical supervision               90.87                191   87.99             50 966
Educational supervision            87.74                191   86.86             51 029
Feedback                           78.12                179   75.66             43 038
Handover                           40.08                 63   64.88             41 498
Induction                          85.45                190   83.18             50 937
Local teaching                     63.79                191   62.54             51 090
Overall satisfaction               84.29                191   80.37             51 090
Regional teaching                  74.37                167   70.42             32 739
Study leave                        70.06                190   66.40             39 650
Undermining                        95.31                177   93.91             49 010
Work load                          55.18                191   46.39             51 090

All scores shown reflect mean values; maximum possible score for each indicator = 100; N = number.

Table 2  2012 GMC survey data: West Midlands deanery dermatology STs vs. all UK dermatology STs

Indicator                          West Midlands dermatology STs   N    UK dermatology STs   N
Access to educational resources    74.24                           19   73.55                189
Adequate experience                83.16                           19   84.35                191
Clinical supervision               93.47                           19   90.87                191
Educational supervision            85.53                           19   87.74                191
Feedback                           75.44                           19   78.12                179
Handover                           45.00                            5   40.08                 63
Induction                          84.21                           19   85.45                190
Local teaching                     60.58                           19   63.79                191
Overall satisfaction               85.26                           19   84.29                191
Regional teaching                  83.46                           19   74.37                167
Study leave                        59.91                           18   70.06                190
Undermining                        94.03                           19   95.31                177
Work load                          61.51                           19   55.18                191

All scores shown reflect mean values; maximum possible score for each indicator = 100; N = number.

Table 3  2012 JEST questionnaire data for West Midlands deanery dermatology STs

Question   Theme                               Mean score   N
1          Patient safety                      4.05         22
2          Programme director’s planning       3.68         22
3          Induction                           3.41         22
4          Appraisal and assessment            4.18         22
5          Feedback                            3.64         22
6          Protected teaching                  3.91         22
7          Service-based teaching              3.77         22
8          Senior doctor cover                 3.86         22
9          Clinical workload                   3.86         22
10         Evidence-based medicine and audit   3.73         22
11         Inappropriate tasks                 3.68         22
12         Rota compliance                     3.77         22
13         Accommodation and catering          3.55         22
14         Leave                               3.77         22
15         Junior doctors’ forum               3.18         22

Recommend post? Yes = 21, No = 1
Score range: 5 = excellent, 4 = good, 3 = acceptable, 2 = poor, 1 = unsatisfactory; N = number.

Questionnaire survey

To acquire novel data, we used the Postgraduate Hospital Educational Environment Measure (PHEEM).15 All demographic questions were removed to preserve anonymity. Questions 7 and 13 ask about exposure to racism and sex discrimination, respectively. These were excluded on ethical grounds, namely the inability to match concerning responses to individual units in the absence of demographic data. The maximum possible global score for the PHEEM (38 remaining items, each scored 0–4, giving 152 rather than 160), and its subsequent interpretation, is therefore adjusted. An email was sent to all regional dermatology STs in February 2013 with an online link to access the survey via
SurveyMonkey®. A comprehensive Participant Information Leaflet was attached, and consent assumed for those submitting a completed questionnaire. Ethical approval for all aspects of the study was confirmed by the University of Warwick’s Biomedical and Scientific Research Ethics Committee.

Results

Secondary analysis of GMC data

Table 1 shows how UK dermatology STs rated their training according to the 13 key indicators set by the GMC, in comparison with undifferentiated UK postgraduate trainees. UK dermatology STs recorded higher mean scores for all but one of the indicators. The glaring discrepancy concerns handover, with an apparently significant reduction in mean score for dermatology STs. This may be explained by the outpatient-based nature of dermatology, where handover within and between shifts is less of an issue. The markedly low mean scores for workload in both groups also stand out. This implies a perception of excessive workload among trainees, which is something of a surprise for the dermatology cohort, where hours are usually well regulated.

Table 2 compares data from dermatology STs in the West Midlands deanery with all UK dermatology STs. Mean scores for the majority of indicators appear very similar between these two groups, though apparently significant divergence is noted for regional teaching (rated better by West Midlands STs) and study leave (rated worse).

Secondary analysis of JEST data

All 22 West Midlands dermatology STs completed the online JEST questionnaire. Table 3 shows mean scores for the 15 questions asked, the majority of which are encouraging, especially in view of the overwhelming consensus (21/22 = 95%) recommending their posts to colleagues. Perhaps the most striking feature of these results is their ordinariness: nothing is rated significantly better than ‘good’. There is some variation, with induction, accommodation and catering, and the junior doctors’ forum scoring the lowest, suggesting room for improvement in these areas.

Modified PHEEM survey results

19/22 (86%) regional dermatology STs completed and submitted the modified PHEEM survey. Table 4 shows mean scores for individual question items, calculated according to the scoring system described by Roff et al.15 The adjusted global mean score is 96.5/152, interpreted as ‘more positive than negative but room for improvement’. A score ≥115/152 would have been required for an ‘excellent’ rating.

Roff et al.15 describe three sub-scales within the overall PHEEM, comprising groups of relevant question items. The ‘perceptions of role autonomy’ sub-scale scored 35.8/56, reflecting ‘a more positive perception of one’s job’ (≥43/56 = ‘excellent’). The ‘perceptions of teaching’ sub-scale scored 39.4/60, interpreted as ‘moving in the right direction’ (≥46/60 = ‘model teachers’). Finally, the adjusted ‘perceptions of social support’ sub-scale scored 21.3/36, yielding a rating of ‘more pros than cons’ (≥28/36 = ‘a good supportive environment’).

Overall these results are encouraging, with several individual item responses and sub-scale ratings worthy of praise. However, the scoring overall appears less enthusiastic in comparison with the GMC and JEST data. Induction and accommodation/facilities are again rated poorly, alongside STs’ perceptions of counselling opportunities and feedback from senior staff.

Table 4  Modified PHEEM survey data for West Midlands deanery dermatology STs

Question   Theme                                            Mean score   N
1          Contract of employment                           2.89         19
2          Clinical teachers set clear expectations         2.05         19
3          I have protected educational time                2.63         18
4          Informative induction programme                  1.89         19
5          Appropriate level of responsibility              3.21         19
6          Good clinical supervision at all times           2.58         19
7*         I have to perform inappropriate tasks            2.53         19
8          Informative registrar handbook available         2.37         19
9          Teachers have good communication skills          3.00         19
10*        I am bleeped/phoned inappropriately              2.37         19
11         Able to participate in educational events        3.11         19
12         Clear clinical protocols                         2.26         19
13         Clinical teachers are enthusiastic               2.89         19
14         Good collaboration with other doctors            2.89         19
15         My hours conform to the EWTD                     3.11         19
16         Opportunity to provide continuity of care        2.21         19
17         Suitable access to careers advice                2.00         19
18         Good quality facilities/accommodation            1.84         19
19         Access to an educational programme               2.63         19
20         Regular feedback from seniors                    2.16         19
21         Clinical teachers are well organized             2.83         18
22         I feel physically safe                           3.00         19
23         There is a no-blame culture                      2.32         19
24         Adequate catering facilities                     2.58         19
25         Enough clinical learning opportunities           2.84         19
26         Clinical teachers have good teaching skills      2.95         19
27         I feel part of a team working here               2.74         19
28         Opportunities to perform practical procedures    2.47         19
29         My clinical teachers are accessible              2.63         19
30         My workload in this job is fine                  2.89         19
31         Senior staff utilize learning opportunities      2.37         19
32         I feel ready to be a consultant                  2.26         19
33         Clinical teachers have good mentoring skills     2.26         19
34         I get a lot of enjoyment out of my present job   2.84         19
35         Encouraged to be an independent learner          2.95         19
36         Good counselling opportunities if fail           1.58         19
37         Good feedback on strengths and weaknesses        1.74         19
38         Atmosphere of mutual respect                     2.63         19

Score range: 4 = strongly agree, 3 = agree, 2 = uncertain, 1 = disagree, 0 = strongly disagree. *Items 7 and 10 are negative statements, hence scoring reversed: 4 = strongly disagree, 3 = disagree, 2 = uncertain, 1 = agree, 0 = strongly agree; EWTD = European Working Time Directive; N = number.
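As a purely illustrative sketch (not part of the original analysis), the arithmetic behind the global score can be reproduced from the item means in Table 4: summing the 38 item means gives the 96.5/152 figure reported above. The ≥115 ‘excellent’ cutoff is stated in the text; the lower band boundaries in `interpret` below are assumptions, scaled proportionally from Roff et al.’s 40-item version.

```python
# Sketch of the modified PHEEM global scoring used in this study:
# 38 items, each scored 0-4, so the maximum is 38 * 4 = 152.
# Item means are transcribed from Table 4 in question order.
item_means = [
    2.89, 2.05, 2.63, 1.89, 3.21, 2.58, 2.53, 2.37, 3.00, 2.37,
    3.11, 2.26, 2.89, 2.89, 3.11, 2.21, 2.00, 1.84, 2.63, 2.16,
    2.83, 3.00, 2.32, 2.58, 2.84, 2.95, 2.74, 2.47, 2.63, 2.89,
    2.37, 2.26, 2.26, 2.84, 2.95, 1.58, 1.74, 2.63,
]

MAX_SCORE = 4 * len(item_means)  # 152 for the 38-item modified PHEEM


def interpret(global_score: float) -> str:
    """Map a global score to a descriptor band.

    Only the >=115 'excellent' cutoff is stated in the text; the
    lower boundaries are assumed, scaled from the 40-item PHEEM.
    """
    if global_score >= 115:
        return "excellent"
    if global_score >= 77:  # assumed scaling of the 40-item 81/160 boundary
        return "more positive than negative but room for improvement"
    if global_score >= 39:
        return "plenty of problems"
    return "very poor"


global_score = round(sum(item_means), 1)
print(global_score, "/", MAX_SCORE)  # 96.5 / 152
print(interpret(global_score))  # 'more positive than negative but room for improvement'
```

The sum of the tabulated means reproduces the published global score exactly, which also serves as a cross-check on the table transcription.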
Discussion

The wealth of published studies seeking to generate and validate reproducible and relevant indices shows that medical educational climate is an important and meaningful concept, both for trainees and for educators. There is, however, limited evidence and a lack of consensus as to the concrete global benefits of measuring educational climate.

Secondary analysis of both GMC and JEST survey results has revealed useful comparative data and highlights several areas of good educational practice within the specialty. A number of issues worthy of further investigation were also unearthed. The extent to which these surveys truly inform an evaluation of educational climate has to be questioned, since they were not explicitly designed with this in mind. Nonetheless, taken alongside the PHEEM results, an attempt has been made to triangulate and cross-reference data.

According to the literature, the PHEEM seems the most appropriate and widely tested instrument to evaluate educational climate in the postgraduate training environment.5 Content and construct validity have been affirmed by some,16 but disputed by others.17 Several studies have confirmed very high levels of reliability, though it must be pointed out that the sample size in the current study falls short of the minimum
recommended number of participants (at least three respondents from at least 10 departments).17

Respondents to both GMC and JEST surveys were required to divulge their current workplace and/or their educational supervisor. Allied to this is a barely disguised element of coercion, with non-completion threatening to impede annual training progression. These factors lead one to question the veracity and completeness of data derived from these surveys, and suggest that the truly voluntary modified PHEEM offers ST views that are more representative of reality. A problem common to all three surveys is the small regional sample size, which is inevitable in a sparsely populated specialty such as dermatology. Although the response rates are exceptional, there are potential concerns regarding the reliability and generalizability of results.

Rather than relying on a single snapshot assessment of educational climate, several authors have conducted cohort or longitudinal studies, using various measures.18,19 Common to each of these is the additional perspective gained through the passage of time, and the opportunity to monitor and improve training incrementally. Adopting a purely qualitative approach allows greater depth of analysis, though a mixed-methods design may allow time and resource constraints to be mitigated. An element missing from the overwhelming majority of studies is the view from faculty, which is likely to bring balance and different perspectives to bear. The recent reintroduction, in pilot form, of a national GMC survey of UK trainers is pertinent in this regard.

No matter what methodology is used, the ultimate aim of any evaluation of educational climate must be to effect meaningful change where necessary. It has been demonstrated that repeated climate measurements over time can draw attention to existing and emerging deficiencies in training. Subsequent targeted intervention may then reverse decline, particularly if survey results are harnessed to garner support for specific initiatives.18

In summary, despite inherent methodological limitations, the data presented allow the conclusion that dermatology STs in our region seem to be training in a favourable educational climate. There is clearly scope for improvement, and a range of issues is raised which may benefit from further investigation and future re-audit. Studies of this type constitute an important facet of the medical education quality assurance process, from both specialty-specific and national regulatory standpoints.20
References
1 Genn JM, Harden RM. What is medical education here really like? Suggestions for action research studies of climates of medical education environments. Med Teach 1986; 8: 111–124.
2 Genn JM. AMEE Medical Education Guide No. 23 (Part 2): curriculum, environment, climate, quality and change in medical education – a unifying perspective. Med Teach 2001; 23: 445–454.
3 Lizzio A, Wilson K, Simons R. University students’ perceptions of the learning environment and academic outcomes: implications for theory and practice. Stud High Educ 2002; 27: 27–52.
4 Roff S, McAleer S. What is educational climate? Med Teach 2001; 23: 333–334.
5 Soemantri D, Herrera C, Riquelme A. Measuring the educational environment in health professions studies: a systematic review. Med Teach 2010; 32: 947–952.
6 Webb JM, Rye B, Fox L et al. State of dermatology training: the residents’ perspective. J Am Acad Dermatol 1996; 34: 1067–1071.
7 Freeman SR, Greene RE, Kimball AB et al. US dermatology residents’ satisfaction with training and mentoring: survey results from the 2005 and 2006 Las Vegas Dermatology Seminars. Arch Dermatol 2008; 144: 896–900.
8 Vashi NA, Latkowski J-A. The current state of dermatology training: a national survey of graduating dermatology residents. J Am Acad Dermatol 2012; 67: 1384–1386.
9 Freiman A, Barzilai DA, Barankin B et al. National appraisal of dermatology residency training: a Canadian study. Arch Dermatol 2005; 141: 1100–1104.
10 Plee J, Barbe C, Richard MA et al. Survey of post-graduate training for dermatology and venereology residents in France (2005-2010). Ann Dermatol Venereol 2013; 140: 259–265.
11 Lee EH, Nehal KS, Dusza SW et al. Procedural dermatology training during dermatology residency: a survey of third-year dermatology residents. J Am Acad Dermatol 2011; 64: 475–483.
12 Group A, Philips R, Kelly E. Cosmetic dermatology training in residency: results of a survey from the residents’ perspective. Dermatol Surg 2012; 38: 1975–1980.
13 Vissers WHPM, van Meurs T, Goldschmidt WFM et al. Residents’ perspectives on dermatology training in Dutch university medical centres in 2006. Br J Dermatol 2008; 159: 736–738.
14 Farrant P, Cohen SN, Burge SM. Attributes of an effective trainer: implications of the views of U.K. dermatology trainees. Br J Dermatol 2008; 158: 544–548.
15 Roff S, McAleer S, Skinner A. Development and validation of an instrument to measure the postgraduate clinical learning and teaching educational environment for hospital-based junior doctors in the UK. Med Teach 2005; 27: 326–331.
16 Riquelme A, Herrera C, Aranis C et al. Psychometric analyses and internal consistency of the PHEEM questionnaire to measure the clinical learning environment in the clerkship of a Medical School in Chile. Med Teach 2009; 31: e221–e225.
17 Boor K, Scheele F, van der Vleuten C et al. Psychometric properties of an instrument to measure the clinical learning environment. Med Educ 2007; 41: 92–99.
18 Wilson M, Deane A. Using educational research to support change: a study of the learning climate within dental hygiene and dentistry undergraduate student clinics and laboratories. Probe 2003; 37: 254–260.
19 Miles S, Leinster SJ. Medical students’ perceptions of their educational environment: expected versus actual perceptions. Med Educ 2007; 41: 265–272.
20 Day I, Lin A. Quality assurance in postgraduate medical education: implications for dermatology residency training programs. J Cutan Med Surg 2012; 16: 5–10.
© 2015 European Academy of Dermatology and Venereology