Research Paper

International Journal of Pharmacy Practice 2016, 24, pp. 60–71

Participation in online continuing education

Barbara Farrell1, Natalie Ward2, Brad Jennings3, Caitlin Jones4, Derek Jorgenson5, Ashley Gubbels-Smith6, Lisa Dolovich7 and Natalie Kennie8

1Department of Family Medicine and 2Department of Sociology and Anthropology, University of Ottawa, Ottawa; 3Centre for Extended Learning, University of Waterloo, Waterloo; 4Department of Pharmacy, Kingston General Hospital, Kingston; 6Clark’s Pharmasave, Paris; 7Department of Family Medicine, McMaster University, Hamilton; 8Department of Family and Community Medicine, University of Toronto, Mississauga, ON; and 5College of Pharmacy and Nutrition, University of Saskatchewan, Saskatoon, SK, Canada

Keywords: continuous professional development; pharmacy education; online learning; participation

Correspondence: Dr Barbara Farrell, Department of Family Medicine, University of Ottawa, Bruyère Research Institute, 43 Bruyère St., Ottawa, ON, Canada K1N 5C8. E-mail: [email protected]

Received February 8, 2013; accepted May 26, 2015

doi: 10.1111/ijpp.12202

Abstract

Objectives The ADAPT (ADapting pharmacists’ skills and Approaches to maximize Patients’ drug Therapy effectiveness) e-learning programme requires weekly participation in module activities and facilitated discussion to support skill uptake. In this study, we sought to describe the extent and pattern of, satisfaction with and factors affecting participation in the initial programme offering, and reasons for withdrawal.

Methods Mixed methods – convergent parallel approach. Participation was examined in qualitative data from discussion boards, assignments and action plans. Learner estimations of time commitment and action plan submission rates were calculated. Surveys (Likert-scale and open-ended questions) included mid-point, final, exit and participation surveys.

Key findings Eleven of 86 learners withdrew, most due to time constraints (eight completed an exit survey; seven said they would take ADAPT again). Thirty-five of the 75 remaining learners completed a participation survey. Although 50–60% of the remaining 75 learners actively continued participating, only 15/35 respondents felt satisfied with their own participation. Learners spent 3–5 h/week (average) on module activities. Factors challenging participation included difficulty with technology, managing time and group work. Factors facilitating participation included willingness to learn (content of high interest) and a supportive work environment. Being informed of programme time scheduling in advance was identified as a way to enhance participation.

Conclusions This study determined the extent of learner participation in an online pharmacist continuing education programme and identified factors influencing participation. Interactions between learners and the online interface, with content and with other learners are important considerations for designing online education programmes. Recommendations for programme changes were incorporated following this evaluation to facilitate participation.

Introduction

Online learning programmes for continuing pharmacist education are evolving. Many continue to employ a transmission model of learning[1] in which pharmacists passively assimilate knowledge by reading or viewing material delivered asynchronously or synchronously,[2] then demonstrate knowledge acquisition through the submission of test answers. Interactive components, such as question and answer chats, are sometimes included to stimulate social presence and discourse.[3] A recent systematic review of e-learning programmes in pharmacy suggests that such learning is acceptable and initially increases knowledge but that studies are required to demonstrate improvement in skills, long-term knowledge retention and practice change.[4] Recognizing that incorporating new skills required for patient-centred, interprofessional care into practice requires active forms of learning and application, an online learning programme using a cognitive apprenticeship model was developed for pharmacists.[5] Such a model seeks to make

complex expert thinking visible to learners through the use of key concepts such as modeling, coaching, scaffolding, reflection, community building and situated learning contexts.[6,7] It is ‘designed, among other things, to bring these tacit processes into the open, where students can observe, enact, and practice them with help from the teacher’.[8] In an online context, this can be enabled through frequent dialogue (with oneself and others), achieved through the use of reflective, authentic activities and discussion boards.[7] The programme was designed with features of high-quality online learning such as (1) social interaction and collaboration with peers, (2) connecting new knowledge to past experience, (3) immediacy in application, and (4) a climate of self-reflection.[9] The intention was to promote an atmosphere of deep learning – one that is highly collaborative, integrative (synthesizing ideas and facts), self-reflective and application centred.

The ‘ADapting pharmacists’ skills and Approaches to maximize Patients’ drug Therapy effectiveness’ (ADAPT) programme consisted of seven modules using subject matter experts to model aspects of the patient care process. Learners reflected upon these and engaged in guided practice activities using authentic tools and resources. Modules were designed to encourage learners to discuss activities and to exchange formative feedback with peers (in small groups) and moderators. Finally, learners reflected on their learning outcomes for each module and developed action plans for implementing new skills in practice. A full description of the development and implementation of the programme, module components and evaluation of learner satisfaction, attitude changes, perceived learning and initial practice changes is published elsewhere.[10,11] Long-term follow-up over a 1-year period following participation in the initial offering of the ADAPT programme demonstrated successful transfer-to-practice of the patient care and collaboration skills taught in the programme.[12]

The importance of learner participation has been emphasized in traditional education: Bloom theorized that learner participation is essential for active and engaged learning,[13,14] and Wenger argued that participation is an intrinsic part of learning.[15] Research in online learning has shown that participation, often measured as interaction with course content, peers and teachers, has a positive effect on learning[16,17] and influences learner satisfaction.[18] Hrastinski proposes that online participation actually drives online learning.[19]

The pharmacy literature is beginning to describe factors that affect participation in online pharmacist continuing education. Buxton demonstrated that pharmacists perceived value, quality and relevance for online webinars but that attendance was low, often due to technical difficulties, scheduling conflicts or, as the authors hypothesized, the remote, transactional nature of distance learning.[20] Dalton’s description of an Australian online national pharmacy preceptor education programme illustrated that participants were challenged both by technology and lack of time.[21] At the time of this programme’s design, we conducted an environmental scan and literature review and could not find published studies discussing learner participation in pharmacy continuing education programmes using a cognitive apprenticeship framework.

In this study, we sought to determine the extent and pattern of learner participation in the initial offering of the ADAPT programme, and to understand factors influencing participation, satisfaction with participation and reasons for withdrawal from the programme. Understanding that participation in online learning is complex, that educators do not necessarily agree on a single definition and that it cannot be measured simply by quantifying messages posted or time logged into a learning system,[19] we used a mixed methods approach to understand participation.

Methods

The ADAPT programme and participants

ADAPT was a 15-week online education programme consisting of seven modules (Table 1), each conducted over 1–3 weeks. In the initial offering, five cohorts of 18 pharmacists and one moderator participated in activities according to a set schedule. More detailed descriptions of module development and goals are found in separate publications.[10,11]

The ADAPT initial offering was open to all Canadian pharmacists. Those interested responded to advertisements circulated via email through the Canadian Pharmacists Association and the Canadian Society of Hospital Pharmacists and by LISTSERV postings on the Canadian Primary Care Pharmacists Specialty Network. To manage learner expectations and select those willing and able to devote time, all applicants completed an open-access self-assessment module that advised of approximately 4–5 h per week of ‘participation’ time, the need for high-speed Internet access and for private counselling space for patient interaction, and that asked them to assess their comfort level with online learning.

Table 1 ADAPT modules

Module # | Title
1 | Orientation
2 | Medication assessment
3 | Collaboration
4 | Interviewing patients
5 | Making decisions
6 | Documentation
7 | Putting it together


All stated they were willing to put in extra time to improve skills. Although many indicated comfort with the online learning environment, experience varied and the level of understanding of the type of online learning planned was unclear. A detailed description of the programme participant selection process, learner demographics and motivations for participating is published elsewhere.[22] All fees were waived.

Throughout the programme, learners participated in individual and group activities and discussions. They began by introducing themselves in an orientation module, a task that enabled them to practice skills needed to participate within the Learning Management System, the University of Waterloo’s UW-ACE (Angel) system. In subsequent modules, they started by watching narrated lectures and videos, working through case vignettes, reading assigned literature or reviewing recommended tools and resources. They then engaged in authentic learning by completing assignments and simulated activities at their practice site. They posted answers to assigned questions and reflections on learning on two to three asynchronous discussion boards per module, reviewed posts made by others in their group and provided feedback. Polls, discussion board prompts and action plan completion (at the end of each module) prompted learner reflection to identify current practice, engage with new skills and approaches and incorporate these into their practices.

Design

Qualitative and quantitative methods in a convergent parallel approach[23] were used to assess participation. Detailed methods are published elsewhere.[11] Participation was assessed in the context of both individual and group activities. Data sources used to evaluate extent and pattern of participation, factors influencing participation, satisfaction with participation and reasons for withdrawal from the programme are outlined below. Analysis of time-tracking data, discussion boards, assignments, action plans, and embedded and exit survey data was conducted in parallel, with the participation survey conducted thereafter. The study was approved by the Bruyère Continuing Care Research Ethics Board.


Sample

All 86 participants in the September 2010 to January 2011 offering of the ADAPT programme were invited and consented to participate in the evaluation.

Data collection

Time-tracking (extent and pattern of participation)

Following each module, learners estimated time spent completing it using an embedded survey with drop-down options.

Discussion boards, assignments and action plans (extent and pattern of participation, factors influencing participation)

All discussion board, assignment and action plan content was collected following the closing of the module. These data were sought to inform the extent to which learners were participating and the perceived quality of their participation. Action plan submission rates were calculated for each module.

Mid-point and final surveys (extent and pattern of participation, factors influencing participation)

Embedded surveys using open-ended and Likert-scale questions were administered to all participants at mid-point and within 1 month following programme completion. Most questions related to satisfaction with learning, perceived learning and practice change, but responses were included in the participation analysis if relevant.[11] Three additional questions related to previous experience with online learning were included in the final survey (Table 2).

Table 2 Final survey questions related to understanding and opinions about online learning

1. During the orientation module, you had an opportunity to discuss your previous experiences with online learning. Now that you are about to complete the course, how different or similar do you think ADAPT is with your previous online learning experience? (5-point Likert scale ranging from very different to exactly the same).
2. Please indicate your opinion about whether the ADAPT online learning approach has been more or less effective than previous online continuing education programmes you have taken in helping you incorporate new skills into practice (5-point Likert scale ranging from much less effective to much more effective).
3. Please indicate your opinion about whether the ADAPT online learning approach has been more or less effective than other continuing education programmes you have taken in helping you incorporate new skills into practice (5-point Likert scale ranging from much less effective to much more effective).



Table 3 Participation survey questions

1. What factors affected your ability to participate actively in the ADAPT online learning modules?
2. What is the one thing you wished you had known about ADAPT before you started that would have positively affected your ability to participate?
3a) Participation varied in the different modules. Please indicate in which modules you feel you participated actively, in which modules you participated less and in which modules you feel you did not really participate (for each of the orientation module and Modules 2–7: participated actively / participated less / did not really participate).
3b) We are interested to know why participation varied in the different modules so we can understand how to improve the programme. For each module (orientation module and Modules 2–7), please explain what precipitated your decision to participate actively, to participate less or even to not participate at all.
4a) Did you ever feel like quitting the ADAPT programme? Yes / No
4b) Why or why not?
5a) How satisfied do you feel with your own participation in the pilot ADAPT programme? (5-point Likert scale from not satisfied to very satisfied)
5b) Comments
6a) Did the effort you put into completing the modules result in the level of knowledge and skill attainment that you were striving for? Yes / No / Partially
6b) Comments
7. Overall, what do you think is the single, most important thing you learned by participating in ADAPT?

Participation survey (extent and pattern of participation, factors influencing participation, satisfaction with participation)

A participation survey (Table 3) was developed following the analysis of the qualitative data and circulated to participants who completed the programme. A combination of open-ended and Likert-scale questions was used. Those who completed the survey could enter a draw for a $100 bookstore gift certificate.

Exit survey (reasons for programme withdrawal)

An exit survey (Table 4) was sent to participants who had officially withdrawn. Survey response rates are summarized in Table 5.

Data analysis

The research team conducted data analysis concurrently with data collection,[24] allowing them to remain immersed in the data.[24,25] This pattern of immersion/crystallization[25] provided the opportunity to better understand the programme and the experience of the participants. This technique was used throughout data analysis.

Table 4 Exit survey questions

1. Please indicate your reasons for not completing the ADAPT programme. Choose as many as apply.
∘ Personal time constraints (e.g. not enough time to complete modules and/or assignments)
∘ High module workload (e.g. too many assignments, activities and/or surveys, modules are too long)
∘ Problems with accessing or navigating the e-learning programme
∘ Difficulties applying the course content into current practice
∘ Course content not relevant to my practice
∘ Personal reasons
∘ Other ____________________________________________
2. Would you be interested in taking the ADAPT programme again in the future?
∘ Yes
∘ No
∘ Maybe
Comments? ________________________________________


Table 5 Survey response rates

 | Mid-point survey | Final survey | Participation survey | Exit survey
Number of respondents | 35 | 41 | 35 | 8
Number of participants | 77 | 75 | 75 | 11
Response rate | 45% | 55% | 47% | 73%

The analysis was facilitated by weekly meetings between a subgroup of researchers and monthly meetings with the entire evaluation team. A standardized template was created, and particular attention was paid to data regarding participation and those factors identified as inhibiting or facilitating participation. A section at the end of the template allowed researchers to add further thoughts about the data. Each module activity was analyzed by two team members prior to discussion with the entire team, reducing the risk of bias. Completed templates were circulated to the entire team prior to monthly teleconferences. During these monthly meetings, each team member described their findings verbally, verifying and/or disputing the analysis, and the team identified and discussed emerging themes. All teleconferences were audio-recorded, transcribed and used as further data.

Descriptive statistics were used to analyze time-tracking data, action plan submission rate data and Likert-scale survey data. A deductive approach to the qualitative data was employed to understand factors affecting learner participation.[26] Written survey responses were grouped, synthesized and presented in relation to the questions they were associated with. Survey data were circulated with the above data for further analysis and discussion at monthly meetings.

A small group of analysis team members considered the qualitative and quantitative data together using a parallel convergence model.[23] During several small team meetings, the data were reduced further through the creation of module-specific matrices[27] and general themes. These were then discussed and confirmed at a large full-team data analysis meeting, along with additional results from the participation survey. The analysis process is illustrated in Figure 1.
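As an illustrative sketch only (the study’s analysis was not conducted or published as code), the descriptive survey response and action plan submission rates reported in Tables 5 and 6 can be reproduced from the raw counts; the dictionaries below merely restate counts taken from those tables.

    # Minimal sketch: descriptive rates of the kind reported in Tables 5 and 6.
    # Counts are restated from the tables; this is not the study's analysis code.

    surveys = {
        # survey: (respondents, eligible participants)
        "mid-point": (35, 77),
        "final": (41, 75),
        "participation": (35, 75),
        "exit": (8, 11),
    }

    action_plans = {
        # module: (action plans submitted, learners enrolled at module start)
        "module 2": (64, 86),
        "module 3": (48, 84),
        "module 4": (45, 77),
        "module 5": (38, 76),
        "module 6": (41, 75),
        "module 7": (41, 75),
    }

    def rate(numerator, denominator):
        """Return a whole-number percentage, e.g. 64/86 -> '74%'."""
        return f"{round(100 * numerator / denominator)}%"

    for name, (n, total) in surveys.items():
        print(f"{name} survey response rate: {rate(n, total)}")  # e.g. exit -> 73%

    for module, (submitted, enrolled) in action_plans.items():
        print(f"{module} action plan submission rate: {rate(submitted, enrolled)}")  # e.g. module 2 -> 74%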

Results

Below we describe our findings as they relate to learner participation.

Extent and pattern of participation as modules progressed

Eleven of the 86 participants officially withdrew. Among the remaining participants, self-reported perception of active participation (from the participation survey) began high, varied across the modules, waned over time, dipped to its lowest point during module 5 (making decisions, which focused on evidence-based medicine) and remained relatively consistent at just under 60% for the remaining modules (Table 6). Despite waning participation, the action plan submission rate remained fairly constant (55%) throughout the latter modules, suggesting that those learners who felt they were actively participating persisted through the completion of module activities. Discussion board participation varied and appeared to decline over time, though we were unable to assess the impact of group or individual participation on such variations. Learners reported spending an average of 3–5 h/week on individual modules, similar to the original estimate of 4–5 h/week. The range of time spent was broad, however, with some reporting having spent up to twice that amount of time.

Factors influencing participation in each module

Factors that affected active participation in each module (from participation survey questions 3a and 3b, Table 3) are outlined in Figure 2. The relevance of useful content was highlighted as facilitating participation across all modules. As the modules progressed, participation was hindered by lack of motivation, by difficulty with the content of the evidence-based medicine (Making Decisions) module and by challenges in finding time to participate.

Factors affecting participation throughout

Three themes illustrating key factors that hindered participation emerged from the qualitative analysis of surveys, discussion board, assignment and action plan data: technology, time and challenges with group work.

Technology and interface characteristics

Site navigation was perceived as difficult and frustrating, especially in the beginning: ‘I wasted a lot of time around technology’. Some participants experienced issues with Internet access or slowly loading videos (bandwidth), whereas others encountered issues trying to use an accompanying electronic medical record (EMR) system. Many found that if they spent too long on a particular activity, they had to log in again to continue. Some found it difficult to navigate the site and to keep track of the different discussion boards used in each module, or indeed which activities they had completed: ‘it would be good if there was a way to show in the schedule which activities you have completed’. Some suggested that a mechanism to print the overall schedule be incorporated.

Figure 1 Overview of data collection and analysis steps.

Table 6 Learner-estimated time spent and action plan submission rates

Module | Length | # enrolled | # of time submissions | Range* | Median | Mean | # of action plans submitted | Submission rate
Orientation module | 1 week | 86 | 44 | 3–5 h | 3 h, 30 min | 3 h, 12 min | N/a | N/a
Module 2 – medication assessment | 2 weeks | 86 | 58 | 3–16 h | 6 h, 30 min | 7 h, 19 min | 64 | 74%
Module 3 – collaboration | 2 weeks | 84 | 34 | 5–16 h | 9 h | 9 h, 58 min | 48 | 57%
Module 4 – interviewing patients | 2 weeks | 77 | 40 | 4–16 h | 10 h | 10 h, 21 min | 45 | 58%
Module 5 – making decisions | 3 weeks | 76 | 22 | 4–16 h | 16 h | 14 h, 24 min | 38 | 50%
Module 6 – documentation | 2 weeks | 75 | 24 | 3–16 h | 9 h | 9 h, 2 min | 41 | 55%
Module 7 – putting it together | 1–4 weeks | 75 | 13 | 5–16 h | 10 h, 30 min | 10 h, 20 min | 41 | 55%

*16 h was highest option; many who chose ‘16’ indicated in open-ended comments that they had spent more than this time.

Time

Nearly all participants stated they would have liked to have known more about the nature of the time commitment and activity schedule beforehand. Though 25/41 who responded to the final embedded survey felt that the ADAPT online learning approach was more effective than other forms of online learning they had experienced, 85% (31/41) felt that the approach using real-world activities and frequent discussion board activities was quite different from their previous online learning experience. Many thought that they could ‘do it all on the weekend’ and found that having activities and discussion board postings due multiple times throughout the week was challenging to manage. Those who took extended vacations had difficulty catching up. The fast pace, together with short timelines between activities, was difficult for working professionals to manage. Personal commitments to work and family, as well as illness and unforeseen events, also contributed to the time challenge.

Figure 2 Participants’ perception of ‘active participation’ and factors facilitating (white) and hindering (grey) participation in each module.

Group work

Challenges with group work were identified as additional reasons for decreasing participation: ‘I was waiting for the partner that never came . . .’ Delayed or late postings hindered discussion, as one participant noted: ‘I found it hard to complete the assigned discussions because others were not keeping up with the assigned schedule and found it frustrating to have completed an activity but then having to keep checking back to reply to others’ posts once other participants had completed the activity’. Sometimes, the online discussion failed to engage participants. As time went on, with fewer pharmacists participating actively in discussion boards, some noted difficulty in getting adequate feedback from others.

Two themes emerged illustrating key factors that supported participation: willingness to learn and work environment.

Willingness to learn

The content was of high interest to participants. In some instances, learners struggled with material, but their recognition of the importance and value of the skills and information encouraged them to press on. This was particularly noticeable during the evidence-based medicine module.

Work environment

Participants stated that a supportive work environment and being given time for ADAPT activities enabled them to participate more easily. One person stated that their non-supportive work environment limited their ability to participate.

Satisfaction with participation

When questioned about satisfaction in the participation survey, only 15/35 respondents indicated that they were satisfied to very satisfied with their own participation. Even some who were satisfied said they wished they could have done more: ‘I feel bad that I didn’t participate fully in all modules because I knew that posting assignments and getting/giving feedback was a key component. In this way, I likely detracted from the learning experience of others’.[11] Overall, participants were more satisfied with their learning (80% or more with most modules as measured in the embedded surveys)[11] than with their own participation. Self-expressed concerns about their own participation centred on the inability to spend enough time to gain the most benefit and to contribute to others’ learning. Sixty-nine percent of those completing the participation survey said that they had thought about quitting, citing lack of time and feeling frustrated, overwhelmed and discouraged at times due to the deadlines: ‘I could not devote the time I had promised. I was concerned about letting my teammates down and appearing to have little regard for the course developers. I stayed because of how gracious the leaders and classmates were’.


Reasons for programme withdrawal

Those who completed the exit survey (8/11 of those who officially withdrew) stated their reasons for withdrawal were related to personal time constraints (5/8), problems accessing or navigating the site (3/8), personal reasons (2/8) and heavy module content (1/8): ‘Once I got behind, I could not catch up’. Most withdrawals occurred following the third module, about 5–6 weeks into the programme. Seven of the eight respondents indicated that they would take the programme again. No one indicated they had difficulty applying the course content to practice and all found the material that they completed to be beneficial: ‘I am sincerely sorry that I had to drop out of the ADAPT programme. . . . I have truly enjoyed the programme from day one. The programme has taught me a lot in improving my communication skills with other health care providers and the value of collaboration . . ., the course is of great value to all the participants and please keep up the good work’.

Discussion

Participation in the ADAPT online learning programme began strongly but waned over time, with some participants leaving the programme and about half of those remaining actively participating through completion of module activities. Time spent on module activities varied widely but averaged out as anticipated. Participation was positively influenced by high motivation and content relevance and negatively affected by more difficult content. Overall, constraints included challenges with technology, lack of time and issues with group work, while individuals’ willingness to learn and supportive work environments supported participation.

The strengths of this study are the trustworthiness and credibility demonstrated through the iterative analysis approach, which sought positive and negative examples of participation and confirmation through triangulation of qualitative and quantitative results from a variety of collected data. The literature on participation in pharmacy e-learning programmes has primarily relied on interviews and surveys to gather participant opinions.[20,21] This study incorporated additional textual materials to support self-report. Multiple researchers analyzed the data, both from individual modules and across the programme; agreement was sought and the interpretation of themes revisited during matrix development and repeated discussion. These elements of rigour support a reasonably accurate interpretation of the phenomenon of participation within the ADAPT programme.

Conversely, the primary limitation of this study is the challenging nature of reporting and measuring ‘participation’ in online learning; others who study the concept agree that participation is a complex phenomenon.[15] The broad range of estimated time spent working on activities may be a result of different approaches that learners used to report their participation. Longer estimations may indicate time spent watching/reading/writing, as well as time spent reflecting on others’ discussion board posts or imagining how connections are made to their own practice.[28] However, the data do not provide a solid understanding of the differences between time spent doing practice activities and the internal dialogues that may have occurred. Those who expressed the need for printable schedules and the ability to check off completed activities may have included attempts at organizing their workload as part of their participation. Moreover, the small number of official drop-outs does not reflect the larger number of learners who stopped submitting action plans or participating in discussion boards over time; we have no understanding of whether these learners continued to participate in other ways. The initial 4–5 h/week time commitment estimate, intended as direct interaction with programme activities, did not consider the ongoing thinking and reflection that happens between activities, or the time required of participants to organize their workload. This limitation speaks to the importance of outlining clear expectations and concepts of what is meant by ‘participation’ at the outset and of incorporating strategies to facilitate self-regulation (such as organization and time management skills) in programme design and implementation. With regard to the data collection strategies employed, the reliance on self-reported data, the low content validity of the participation survey (developed with content gleaned from qualitative analysis only) and little input from survey non-responders add to the limitations of the findings.

ADAPT participation findings regarding constraints and supports are consistent with other studies illustrating associations with positive and negative attitudes of pharmacists toward continuing professional development in general.[29] Studies of online faculty development programmes have similarly identified factors affecting participation, including perceived need or usefulness of the content, technical challenges and assistance, clear expectations and time to complete activities; however, definitions of participation and mechanisms for measurement varied.[30] The literature on participation in pharmacy e-learning programmes has also identified difficulty with technology and lack of time as key challenges, though not quality of materials or participant satisfaction.[21,31] Xie (2011) demonstrated that the relationship between motivation and participation becomes stronger with time, as students start to perceive discussion activities as enjoyable and valuable.[32] The support of family and colleagues has also been shown to be important in online learning for professionals.[33] Our findings are consistent with these and add depth to the discussion of factors affecting participation, particularly in the context of the learners’ interactions with course components.


Table 7 Recommendations for changes to ADAPT to improve interaction and participation

Recommendations for changes to ADAPT to improve learner–interface interaction
  Prior to registration: add a component about discussion board participation to the self-assessment module.
  Orientation module: lengthen to 2 weeks; add a presentation about the learning approach; add additional orientation to the Learning Management System, including practice activities (e.g. using a dropbox, participating in a discussion board).
  Moderators: expand the moderator role and training so that moderators can encourage online participation successfully.
  Overall: provide additional technical support early on; reorganize materials to make site navigation easier; remove the embedded EMR.

Recommendations for changes to ADAPT to improve learner–content interaction
  Prior to registration: make the overall course schedule available (not done for the first iteration of the programme because content was being developed with a 2- to 3-week lead time to starting each module); advise learners to plan participation during times when they are not expecting to take vacation.
  Scheduling: change the pace and timing of activities to weekly due dates rather than every other day.
  Discussion boards: reduce the number of discussion boards (using the once-weekly due date).
  Activities: reword specific activity instructions; change the order and type of activities for the evidence-based medicine module, along with adding length.
  Moderators: ensure availability to provide responses to participant questions.

Recommendations for changes to ADAPT to improve learner–learner interaction
  Moderators: add a moderator training component regarding how to improve discussion board participation.
  Discussion boards: develop a discussion board ‘self-assessment’ tool to help educate learners about their social roles within group activities and the appropriate, effective use of discussion boards, as well as to set parameters for varying levels of engagement.

Distance learning interaction components have historically been described as learner–learner interactions, learner–instructor interactions, learner–content interactions and learner–interface interactions.[34,35] Such interactions are felt to be core indicators for deep learning because knowledge is constructed through the negotiation of meaning that occurs during interaction.[36,37] As others have,[38,39] we drew on Gilly Salmon’s five-stage model of e-learning, which considers access and motivation, online socialization, information exchange, knowledge construction and development,[40] to subsequently guide recommendations regarding orientation module activities and moderator training to support participation.

Learner–interface

ADAPT learners were initially challenged in trying to navigate the site. This appeared to be partly attributable to lack of experience with this type of learning system. Most had likely been used to a transmission model of learning in which a video or PDF with information is provided with no expectation for interaction other than completion of multiple choice questions. Difficulty using an embedded EMR and, for some, Internet access, speed and site log-on issues contributed to technological challenges. Because the learner interface can have a significant impact on the quality and quantity of interactions,[41] related recommendations for programme improvement were made (Table 7).

Learner–content

Programme content and activities did not negatively affect participation, except in the evidence-based medicine module, and were highlighted as major reasons for high satisfaction.[11] However, brisk pacing of activities and frequent discussion board posting requirements were a major factor negatively influencing participation. These requirements are more common in a programme designed using a cognitive apprenticeship model than in the more commonly experienced transmission model, and likely came as a surprise to those expecting the latter, as evidenced by their initial responses that they were already familiar with online learning. At its lowest point, participation was most negatively affected when some learners felt they could not identify with the relevance of the content or found it too difficult to learn and apply to their practice setting, as we saw with the module on evidence-based medicine. Many struggled with this material, but because they recognized its importance, they soldiered on, eventually feeling very satisfied with their learning in this module. This highlights the importance of ensuring content is clearly relevant to learners and presented in a manner that enables them to actively keep up with the pace. Kirschner and Vonderwell describe the importance of assessing ‘cognitive load’ and monitoring learner participation patterns to enable instructional designers to design effective online learning.[42,43] Although brisk pacing is not new to online learning, the pace that had been set was overwhelming for some participants, and changes were recommended accordingly (Table 7).

Learner–learner

Interaction among learners is an important facet of the ADAPT programme, used to facilitate reflection and self-assessment and to allow group members to work together to explore how new skills could be successfully incorporated into practice. Some learners indicated frustration with small group work when they were unable to connect with group members or when others were tardy with postings. This is consistent with literature arguing that participants active in online discussion experience frustration and communication anxiety due to delays and the differing participation rates of other members.[44] Recommendations to improve learner–learner interaction, particularly with regard to usual standards of discussion board conduct, were made (Table 7).

Implications

Online programmes built on the cognitive apprenticeship framework are new for many pharmacists and represent a significant departure from older transmission models of distance learning. A cognitive apprenticeship model considers the importance of authentic materials and scaffolded tasks in helping learners acquire new cognitive processes that lead them towards more expert-like thinking. Inclusion of materials and tasks that are relevant to individual practices allows an online learning programme to go beyond the goal of knowledge acquisition and support the ability to change behaviour in practice. However, in order to optimize participation, and potentially learning, pharmacist educators need to think carefully about the design of such programmes. The specific recommendations for this programme resulting from our analysis can be organized into a number of useful frameworks that educators could employ. We have illustrated this with the learner–interface, learner–content and learner–learner framework. Salmon’s model could also be used to frame a design that optimizes access and motivation, online socialization, information exchange and knowledge construction and development.[40] Finally, Lam’s model for designing online courses that maximize participation and learning is consistent with the factors that we found affected participation: (1) manage learner expectations up front by providing times and schedules, (2) orient learners to the online learning system and tasks by giving time for people to feel comfortable in a potentially new environment, (3) consider cognitive load when selecting content and pacing, and (4) use successful approaches to prompt learners to reflect and share experiences with their learning and practice.[45] The latter point, in particular, can be achieved by paying attention to the construction of discussion boards, using multilevel and open-ended questions to stimulate meaningful discussion.[46] We believe that if educators consider these factors in the design phase, and in their evaluation of participation, they will be more successful in delivering online programmes that garner high participation.

The connection between participation and learning outcomes needs to be explored further. Although we did not link participation data to individuals, one could surmise that those who participated more (by viewing presentations and videos, completing activities, submitting assignments and interacting in discussion boards) were more likely to have learned. This is consistent with assertions that online participation drives online learning.[19] However, such assertions assume that infrequent contributors are passive recipients of knowledge who are not actively engaged in learning. Watching, reading and writing are not the only important aspects of distance or computer-based learner participation. Holmberg (1989) argues that internal dialogues, such as thinking and reflection, are also key elements of distance education participation,[47] and this is consistent with how others have interpreted Kolb’s work on experiential learning, which identifies abstract conceptualization (thinking) and reflective observation (understanding) as important learning modes that educators should consider in designing virtual computer and online education programmes.[48,49] Participants who are ‘reading only’ may contribute little to discussions or assignment submissions but could be actively following and possibly learning.[50] Their low level of contribution can, however, negatively affect the learning


environment, as others may see less value in contributing; we observed this with participants who felt their lack of active participation was negatively affecting the learning of others. Consideration must therefore be given to ensuring that programme participants understand their role and obligations in terms of participation, so that the group can learn as it moves together through the programme. Exploration of the relationship between participation and learning outcomes will assist educators in creating and delivering meaningful continuing education programmes.

Conclusion

This study aimed to determine the extent to which learners participated in the initial offering of the ADAPT programme and to explore factors influencing their participation. Approximately 13% of participants officially withdrew, and 50–60% of those remaining appeared to participate actively, though variably, through to the end of the programme. Factors that affected participation included challenging interactions with the learning management system interface, satisfying but briskly paced and often time-consuming interactions with content, and variable interactions among learners. Recommendations for programme changes to maximize participation will be of value to other online education programme designers. Incorporating measurement strategies that capture the thinking and reflective components of participation, as well as the relationship of participation to learning outcomes, would represent a meaningful contribution.

Declarations

Conflict of interest

The Author(s) declare(s) that they have no conflicts of interest to disclose.

Funding

This work was supported by Health Canada under the Health Care Policy Contribution Programme in collaboration with the Canadian Pharmacists Association (CPhA). The evaluation budget was contracted from CPhA to the Bruyère Research Institute to ensure evaluation independent of the organization. Neither Health Canada nor CPhA was involved in the study design, collection, analysis or interpretation of the data, writing of the reports or decisions regarding manuscript submission.

Acknowledgements

The authors wish to thank Pia Zeni Marks of the University of Waterloo for her contributions to the analysis and discussion of results.

Authors’ contributions

All authors contributed to data analysis and initial drafting of the publication. Dr Farrell and Ms Ward wrote the publication; all authors reviewed and contributed to revisions. All authors state that they had complete access to the study data that support the publication.

References

1. Rogoff B et al. Models of teaching and learning. In: Olson D, Torrance N, eds. The Handbook of Education and Human Development. Oxford: Blackwell, 1996; 388–414.
2. Buxton E. Pharmacists’ perception of synchronous versus asynchronous distance learning for continuing education programs. Am J Pharm Educ 2014; 78: Article 8.
3. Garrison DR et al. Critical inquiry in a text-based environment: computer conferencing in higher education. Internet High Educ 2000; 2: 87–105.
4. Salter S et al. Effectiveness of e-learning in pharmacy education. Am J Pharm Educ 2014; 78: Article 83.
5. Lave J, Wenger E. Situated Learning: Legitimate Peripheral Participation. Cambridge, UK: Cambridge University Press, 1991.


6. Collins A et al. Cognitive apprenticeship: making thinking visible. Amer Educ 1991; 6: 38–46.
7. Kopcha TJ, Alger C. Student teacher communication and performance during a clinical experience supported by a technology-enhanced cognitive apprenticeship. Comput Educ 2014; 72: 48–58.
8. Collins A et al. Cognitive Apprenticeship: Teaching the Craft of Reading, Writing, and Mathematics. Technical Report No. 403. Centre for the Study of Reading, University of Illinois, 1987.
9. Cercone K. Characteristics of adult learners with implications for online learning design. AACE J 2008; 16: 137–159.
10. Farrell B et al. Designing a novel continuing education program for pharmacists: lessons learned. Can Pharm J 2012; 145: e7–e16.

11. Farrell B et al. Evaluation of a pilot e-learning primary health care skills training program for pharmacists. Curr Pharm Teach Learn 2012; 5: 580–592.
12. Zeni Marks P et al. ‘I gained a skill and a change in attitude’: a case study describing how an online continuing professional education course for pharmacists supported achievement of its transfer-to-practice outcomes. Can J Univ Cont Educ 2014; 40: 1–18.
13. Bloom B. Taxonomy of Educational Objectives. Boston, MA: Allyn and Bacon, 1984.
14. Pratton J, Hales LW. The effects of active participation on student learning. J Educ Res 1986; 79: 210–215.
15. Wenger E. Communities of Practice: Learning, Meaning and Identity. Cambridge, UK: Cambridge University Press, 1998.


16. Fredericksen E et al. Student satisfaction and perceived learning with on-line courses: principles and examples from the SUNY learning network. J Asynchron Learn Net 2000; 4: 7–41.
17. Hiltz SR et al. Measuring the importance of collaborative learning for the effectiveness of ALN: a multi-measure, multi-method approach. J Asynchron Learn Net 2000; 4: 103–125.
18. Alavi M, Dufner D. Technology-mediated collaborative learning: a research perspective. In: Hiltz SR, Goldman R, eds. Learning Together Online: Research on Asynchronous Learning Networks. Mahwah, NJ: Lawrence Erlbaum, 2005; 191–213.
19. Hrastinski S. A theory of online learning as online participation. Comput Educ 2009; 52: 78–82.
20. Buxton EC et al. Evaluation of a series of professional development webinars for pharmacists. Am J Pharm Educ 2012; 76: Article 155.
21. Dalton L et al. Evaluation of the national pharmacy preceptor education program. Austral J Rural Health 2007; 15: 159–165.
22. Jorgenson D et al. Characteristics of pharmacists who enrolled in the pilot ADAPT education program: implications for practice change. Can Pharm J 2012; 145: 260–263.
23. Creswell JW, Plano Clark VL. Designing and Conducting Mixed Methods Research. Thousand Oaks, CA: Sage Publications, Inc, 2007: 63–64.
24. Gifford S. Analysis of non-numerical research. In: Kerr C et al., eds. Handbook of Public Health Methods. Sydney, Australia: McGraw Hill Australia, 1998; 543–554.
25. Borkan J. Immersion/crystallization. In: Crabtree BF, ed. Doing Qualitative Research. Thousand Oaks, CA: Sage Publications, Inc, 1999; 179–194.


26. Patton MQ. Qualitative Research and Evaluation Methods. Thousand Oaks, CA: Sage Publications, 2002.
27. Miles MB, Huberman AM. Qualitative Data Analysis: An Expanded Sourcebook. Thousand Oaks, CA: Sage Publications, Inc, 1994.
28. Thorpe M, Gordon J. Online learning in the workplace: a hybrid model of participation in networked, professional learning. Austral J Educ Technol 2012; 28: 1267–1282.
29. Power A et al. Factors affecting the views and attitudes of Scottish pharmacists to continuing professional development. J Pharm Pract 2011; 19: 424–430.
30. Cook DA, Steinert Y. Online learning for faculty development: a review of the literature. Med Teach 2013; 35: 930–937.
31. Buxton E. Professional development webinars for pharmacists. Am J Pharm Educ 2012; 76: Article 155.
32. Xie K et al. Relationship between students’ motivation and their participation in asynchronous online discussions. JOLT 2011; 7: 17–29.
33. Haythornthwaite C, Kazmer MM. Bringing the Internet home: adult distance learners and their Internet, home and work worlds. In: Wellman B, Haythornthwaite C, eds. The Internet in Everyday Life. Malden, MA: Blackwell, 2002; 431–463.
34. Hillman D et al. Learner-interface interaction in distance education: an extension of contemporary models and strategies for practitioners. Am J Distance Educ 1994; 8: 30–42.
35. Moore MG. Three types of interaction. Am J Distance Educ 1989; 3: 1–6.
36. Vygotsky LS. Mind in Society. Cambridge, MA: Harvard University Press, 1978.
37. Ke F, Xie K. Toward deep learning for adult students in online courses. Internet High Educ 2009; 12: 136–145.


38. Armellini A, Nie M. Open educational practices for curriculum enhancement. OpenLearn 2013; 28: 7–20.
39. Baker W, Watson J. Mastering the online master’s: developing and delivering an online MA in English language teaching through a dialogic-based framework. Innov Educ Teach Int 2013; 51(5): 483–496. doi: 10.1080/14703297.2013.796712.
40. Salmon G. E-moderating: The Key to Teaching and Learning Online, 2nd edn. London: Routledge, 2004.
41. Swan K. Issues of interface. Eur J Open Dist Learn 2004; 7: 1–5.
42. Vonderwell S, Zachariah S. Factors that influence participation in online learning. J Res Tech Educ 2005; 38: 213–230.
43. Kirschner PA. Cognitive load theory: implications of cognitive load theory on the design of learning. Learn Instr 2002; 12: 1–10.
44. Feenberg A. Computer conferencing and the humanities. Instr Sci 1987; 16: 169–186.
45. Lam W. Encouraging online participation. J Inf Syst Educ 2004; 15: 345–348.
46. Schwier RA, Seaton JX. A comparison of participation patterns in selected formal, non-formal and informal online learning environments. Can J Learn Technol 2013; 39(1): 1–15.
47. Holmberg B. Theory and Practice of Distance Education. London: Routledge, 1989.
48. Konak A et al. Using Kolb’s Experiential Learning Cycle to improve student learning in virtual computer laboratories. Comput Educ 2014; 72: 11–22.
49. Richmond AS, Cummings R. Implementing Kolb’s learning styles into online distance education. Int J Instruct Technol Dist Learn 2005; 1: 45–54.
50. Williams B. Participation in online courses: how essential is it? Educ Tech Soc 2004; 7: 1–8.

