Journal of Evidence-Informed Social Work

ISSN: 2376-1407 (Print) 2376-1415 (Online) Journal homepage: http://www.tandfonline.com/loi/webs21

Practitioner Perspectives of Implementing Check & Connect

Elizabeth Kjellstrand Hartwig & Brandy R. Maynard

To cite this article: Elizabeth Kjellstrand Hartwig & Brandy R. Maynard (2015) Practitioner Perspectives of Implementing Check & Connect, Journal of Evidence-Informed Social Work, 12:4, 438-449, DOI: 10.1080/15433714.2013.873752

To link to this article: http://dx.doi.org/10.1080/15433714.2013.873752

Published online: 06 Mar 2015.


Journal of Evidence-Informed Social Work, 12:438–449, 2015. Copyright © Taylor & Francis Group, LLC

Practitioner Perspectives of Implementing Check & Connect

Elizabeth Kjellstrand Hartwig


Department of Counseling, Leadership, Adult Education, and School Psychology, Texas State University, San Marcos, Texas, USA

Brandy R. Maynard School of Social Work, Saint Louis University, St. Louis, Missouri, USA

While there is a growing reserve of evidence-based practices (EBPs) available to practitioners, much can be learned about how to implement EBPs in real-world settings. Evidence of the effects of a widely disseminated student engagement intervention, Check & Connect (C&C), is emerging, yet little is known about the implementation of C&C in community-based settings. The purpose of this study was to examine practitioner attitudes and perspectives related to the C&C intervention and its implementation, to gain an understanding of core implementation components that facilitated or impeded implementation. A researcher-developed survey instrument was used to assess practitioner attitudes related to the C&C model and implementation among 14 school-based practitioners working in a dropout prevention program. Findings indicate that practitioners were highly positive about the C&C intervention and in their attitudes about implementing EBPs. Benefits of C&C identified by practitioners included increased relationship building with students, tracking students on a consistent and timely basis, and addressing attendance issues as a main focus of treatment. The most common implementation challenges were time constraints, paperwork, and targeting absentee students. These findings contribute to the emerging literature on C&C and the implementation of EBPs in schools and community-based settings.

Keywords: Evidence-based practice, implementation, dropout prevention

Over the past decade, the use of evidence to inform practice has become a central tenet in health care, mental health, and education as a means of improving services and outcomes for children, families, and adults (Brodowski, Flanzer, Nolan, Shafer, & Kaye, 2007). Federal and state governments, along with private organizations, research institutes, and program developers, have been leading efforts to disseminate and implement psychosocial interventions that have demonstrated efficacy and/or effectiveness (McHugh & Barlow, 2010). While evidence related to specific interventions, evidence-based practices (EBPs), has been mounting and databases of EBPs have been growing, there is little evidence that practitioners regularly use evidence to inform practice decisions, and some evidence that practitioners continue to use empirically unsupported treatments (Pignotti & Thyer, 2009; Powell, Hausmann-Stabile, & McMillen, 2013). Moreover, when EBPs are implemented, they may not be implemented adequately and with fidelity, potentially affecting the intervention's effectiveness and impact (Durlak & DuPre, 2008; Walrath, Sheehan, Holden, Hernandez, & Blau, 2006). Having evidence of effects of interventions is important, but it is not sufficient—implementation matters.

Address correspondence to Elizabeth Kjellstrand Hartwig, Department of Counseling, Leadership, Adult Education, and School Psychology, Texas State University, 601 University Dr., San Marcos, TX, 78666. E-mail: [email protected]



While there is a growing reserve of EBPs available to agencies, relatively little is known about how to implement them in real-world settings, particularly in the human services (Powell, Proctor, & Glass, 2014; Proctor et al., 2007). Federal agencies, such as the National Institutes of Health, National Institute of Mental Health, Institute of Medicine, and the Institute of Education Sciences, have called for systematic efforts to better understand the factors that contribute to the successful implementation of evidence-based interventions. Efforts to learn more about implementation and alleviate the research-to-practice gap have resulted in an emerging discipline of implementation science. The growing body of implementation research has begun to shed light on the barriers and processes of EBP implementation (see Barwick et al., 2008; Fixsen, Blase, Naoom, & Wallace, 2005; Fixsen, Naoom, Blase, Friedman, & Wallace, 2005; Mitchell, 2011; Proctor et al., 2007; Proctor et al., 2009). Implementation components and factors associated with impeding or facilitating intervention implementation have been identified as related to the intervention agent, intervention, organization, and/or external environment (see Durlak & DuPre, 2008; Sanetti & Kratochwill, 2009). From a comprehensive review of implementation research, Fixsen and colleagues identified six core implementation components: practitioner selection, pre-service and in-service training, ongoing consultation and coaching, facilitative administrative support, practitioner and program evaluation, and systems interventions (Fixsen et al., 2005, p. 28). While implementation research is rapidly growing, implementation science is a relatively young field (Proctor et al., 2009); there remains much to be learned about research utilization and EBP implementation in mental health and, more particularly, in child and youth mental health (Estabrooks, Winther, & Derksen, 2004; Mitchell, 2011).
Moreover, although there are many known implementation factors acknowledged in research, much of what we know about implementation processes and factors is derived from anecdotal evidence, case studies, or highly controlled experiments (Proctor et al., 2009), with limited attention to implementation in the intervention context of real-world settings (Noell, 2010; Sanetti & Kratochwill, 2009). One way to inform the adoption and implementation of EBPs is to examine practitioner perspectives and experiences of implementing an EBP in real-world settings. The implementation process can be viewed as "a series of choices and actions over time through which an individual or a system evaluates a new idea and decides whether or not to incorporate the innovation into ongoing practice" (Rogers, 2003, p. 168). Understanding practitioners' experiences of the intervention and the implementation process is important to understanding and improving intervention implementation (Proctor et al., 2007). Studies that have sought to understand the perspectives of clinicians implementing EBPs have discovered insights that can inform future implementation efforts, such as identifying practice drift, successful implementation factors and strategies, and implementation challenges (Gustle, Hansson, Sundell, & Andrée-Löfholm, 2008; Kaye & Osteen, 2011; Moncher & Prinz, 1991; Resnick et al., 2005). While having evidence of effectiveness of an intervention is important in the clinical decision-making process, knowing whether the intervention can be implemented under real-world conditions and whether practitioners respond well to the intervention is also vital to the decision-making process and successful application of the intervention.
Intervening with At-Risk Students to Improve School Engagement

Over the past two decades, the problem of low school engagement has received significant attention, as low engagement has been found to be strongly associated with poor school performance, dropout, and a myriad of risk behaviors, such as substance use, delinquency, truancy, and antisocial behaviors (Green et al., 2012; Henry, Knight, & Thornberry, 2012; Rumberger & Rotermund, 2012; Vaughn et al., 2011). Indeed, student engagement in school has become a vital factor in the conversation on dropout prevention and school completion (Christenson, Sinclair, Lehr, & Hurley, 2000; Doll & Hess, 2001). Identifying and exploring EBPs that can promote student engagement


and impact other problematic behaviors associated with engagement, such as achievement, truancy, and school dropout, is critical to promoting positive developmental outcomes for youth. One intervention designed to promote school engagement and reduce dropout is Check & Connect (C&C; Christenson, Sinclair, Thurlow, & Evelo, 1999; Sinclair, Christenson, Evelo, & Hurley, 1998). C&C is a school-based intervention designed to increase school engagement by targeting alterable behaviors, such as absences or disciplinary referrals, that can lead to disengagement and school failure (Sinclair et al., 1998). The four components of C&C include: (1) a caring mentor or practitioner who promotes the importance of education with each student, (2) systematic monitoring of data (i.e., the "check"), (3) timely and individualized intervention that addresses the data and promotes school engagement (i.e., the "connect"), and (4) increasing communication between parents or caregivers and the school (Christenson et al., 2008). C&C uses a case management approach delivered by an adult "monitor" who works with students and their families for the duration of the intervention "to keep education a salient issue for the student, his or her family members, and teachers, and to reduce and prevent the occurrence of absenteeism, suspensions, failing grades and other warning signs of school withdrawal" (Sinclair et al., 1998, p. 10). C&C is often cited in the literature as a promising intervention for improving school engagement and reducing dropout. Evidence of positive effects of C&C on engagement, dropout, and related risk behaviors is emerging (Alvarez & Anderson-Ketchmark, 2010; Kelly, Raines, Stone, & Frey, 2010; Lehr, Johnson, Bremer, Cosio, & Thompson, 2004; Maynard, Kjellstrand, & Thompson, 2014; Stout & Christenson, 2009).
While having evidence of effects of C&C is important, it is not sufficient; information about how interventions are experienced by those who are implementing them is important to understanding how and whether the intervention is implemented successfully, if at all. Little is known about the implementation of C&C in real-world settings; to our knowledge, no study has specifically examined C&C implementation factors. To better inform the evidence on C&C, and to contribute to the body of knowledge regarding implementation, it is important to examine practitioners' perspectives and experiences of implementing C&C and to explore factors that facilitate or impede its implementation.

Purpose of the Present Study

The present study was part of a randomized effectiveness trial examining effects of C&C on attendance, behavior, and grades (Maynard et al., 2014). As part of the trial, C&C was implemented in 14 middle and high schools by a community-based organization, Communities In Schools (CIS). Fourteen CIS staff were trained in and implemented C&C with a subset of students on their caseload. The purpose of the present study was to examine practitioner attitudes and perspectives toward C&C and to gain an understanding of core implementation components and organizational contexts that facilitated or impeded implementation, in order to inform practice and contribute to the body of implementation research.

METHOD

Participants

Participants for this study were 14 site coordinators of CIS of San Antonio (CIS-SA) who were implementing C&C as part of a randomized effectiveness trial (Maynard et al., 2014). CIS site coordinators are school-based practitioners who work with school staff to identify students placed at risk, assess student needs, and provide services to meet those needs so that students can focus on


learning. All 14 participants (100%) were female, 36% earned a Bachelor’s degree and 64% earned a Master’s degree, and two of the practitioners (14%) had obtained their clinical licensure as a Licensed Professional Counselor. Participants had a mean of 4.04 years of post-degree experience and had worked at CIS-SA for an average of 4.38 years.


Measures

This study employed a mixed-methods survey design to address the question of EBP implementation at different levels and to overcome the limitations of a single-design study. Participants completed a questionnaire comprised of researcher-developed questions related to participant demographics and professional experience as well as quantitative and open-ended questions addressing participants' views and experiences related to the implementation of C&C.

Demographics. A brief demographic questionnaire was used to gather participant information about length of time worked at CIS, highest degree obtained, and years of post-degree experience.

Intervention implementation questionnaire. An intervention implementation questionnaire, comprised of 14 Likert-scale and 12 open-ended questions, was used to examine participants' views and experiences regarding aspects of the C&C intervention and implementation components. The questionnaire consisted of a series of questions using a five-point Likert scale with response options ranging from 1 (strongly agree) to 5 (strongly disagree). Additionally, open-ended questions, following specific Likert-scaled questions, prompted participants to provide explanation or additional information regarding their responses. Questions related to the intervention included items such as "I like the C&C aspect of the program," followed by two open-ended questions: "What do you like about it?" and "What don't you like about it?" Questions related to implementation included "I feel confident in my abilities to implement the program" and "I felt supported in implementing this intervention," followed by two open-ended questions asking "What support did you feel was helpful?" and "What support would you have liked to have had?"

Procedures

C&C was implemented by CIS site coordinators from October 2011 through May 2012.
Site coordinators implementing C&C participated in a one-day pre-service training that covered core components of the intervention. Practitioners employed the four components of C&C (Christenson et al., 2008): serving as the caring practitioner who promoted the importance of education with each student, systematically monitoring data (i.e., the "check"), meeting with the student on a weekly basis to address the data and promote school engagement (i.e., the "connect"), and increasing communication between parents or caregivers and the school through bimonthly parent or caregiver phone calls or letters relating to student progress in the program. A group component that was not part of Christenson and colleagues' traditional C&C implementation protocol was initially added to the C&C program during the pre-service training. The goal of the group component was to increase opportunities to connect with students by offering a biweekly group in addition to the weekly individual meeting. The group component was never fully implemented due to feedback from practitioners that student absenteeism and practitioner workload presented challenges to maintaining this additional intervention element. Practitioners received weekly support through C&C informational e-mails from the program manager and meetings with their supervisor. Practitioners were required to turn in a student progress check form on a weekly basis to ensure fidelity to the C&C model. This form included the


list of students enrolled in the C&C program; measurable data indicators for grades, attendance, and behavior; notes regarding specific issues in attendance patterns or family concerns; and an intervention plan for the next week. This form was reviewed and tracked by the CIS program manager. The practitioner perspective questionnaires were administered in December 2011, during a booster training session, to the 14 CIS site coordinators implementing C&C. The response rate was 100%. Numerical responses for the experience and intervention questionnaires were coded and descriptive statistics were computed in SPSS v. 20 (IBM Corp., 2011). Responses for each open-ended question were transcribed into Microsoft Excel, coded based on the number of times the same response was given, and organized into categories based on content. Categories were refined as analysis proceeded. Lastly, categories were compared to identify themes based on the patterns observed across the participants and questions. The institutional review board of the university where this study originated approved the protocol for this study. Informed consent was obtained from all participants. Participants did not receive any compensation or incentive for completing the survey, and their participation in the study was completely voluntary.
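The frequency coding described above can be sketched in a few lines. This is a hypothetical illustration of the counting approach only (the authors used Excel and SPSS); the response strings and participant count below follow the study's reporting conventions but are invented examples, not study data.

```python
from collections import Counter

# Hypothetical sketch of the frequency coding described above: identical
# (normalized) open-ended responses are tallied, then each count is
# expressed as a percentage of the 14 participants, as in Table 2.
N_PARTICIPANTS = 14

responses = [
    "time constraints make implementation difficult",
    "too much paperwork",
    "time constraints make implementation difficult",
    "time consuming",
    "too much paperwork",
]

counts = Counter(responses)
for response, n in counts.most_common():
    pct = 100.0 * n / N_PARTICIPANTS
    print(f"{response}: n = {n} ({pct:.2f}%)")
```

In practice, responses would first be normalized (e.g., lowercased and grouped into categories by a human coder) before tallying, since free-text answers rarely repeat verbatim.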

RESULTS

Quantitative responses for the intervention implementation questionnaire, reported in Table 1, were grouped according to items related to the intervention and items related to implementation and are listed by ascending means within each group. Qualitative responses from the questionnaire provide insight into the perspectives of practitioners who implemented the C&C intervention. Table 2 presents the open-ended questions and the most commonly occurring responses (i.e., those given by four or more participants) from the intervention implementation questionnaire.

Participant Perspectives of the Intervention

As reported in Table 1, participants were largely in favor of C&C (M = 1.57, SD = 0.51). Participants reported that the intervention made sense for the students being targeted (M = 1.64, SD = 0.63) and would help practitioners be more effective with absentee students (M = 1.86, SD = 0.66). Qualitative results from the intervention implementation questionnaire provided more depth to the quantitative items by expanding on the scored responses. Practitioners reported what they liked about the intervention: tracking student progress weekly, having weekly meetings with students, having more one-on-one time with students, and that the intervention helps develop a better relationship between practitioner and student. Three of these qualitative responses highlight the importance of building relationships with students through regular meetings. Practitioners also reported what they did not like about the C&C intervention: time constraints make implementation difficult, too much paperwork, and the intervention was time consuming. Two of these responses relate to concerns about having enough time to implement the intervention, and the third most frequently occurring response, too much paperwork, may further contribute to concerns about time constraints.

Participant Perspectives of Implementation

Responses to a series of questions related to implementation factors are also reported in Table 1. Practitioners indicated that they felt confident in their abilities to implement the program


TABLE 1
Means and Standard Deviations for Intervention Implementation Questionnaire Items

Intervention Implementation Questionnaire Items                                  M      SD

Intervention-related items
  I like the Check & Connect aspect of the program.                             1.57   0.51
  I think this intervention makes sense for the students we are
    targeting (absentee students).                                              1.64   0.63
  I received adequate information about the research project.                   1.79   0.89
  I think this intervention will help me be more effective with
    absentee students.                                                          1.86   0.66
  I feel that this intervention fits well with my professional training.        1.86   0.53
  I received adequate training on the intervention.                             1.93   0.92
  I think the results of Check & Connect will be better than what I
    would normally do with absentee students.                                   2.14   0.95
  I think the program will be more effective with both the group and
    the Check & Connect components together.                                    3.21   1.19
  I like the group aspect of the program.                                       3.43   1.55

Implementation-related items
  It has been difficult to get packets back from the students/parents.          1.57   1.09
  I feel confident in my abilities to implement the program.                    1.57   0.65
  My supervisor has been supportive in my implementing the intervention.        2.07   1.14
  Students with attendance problems are difficult to work with.                 2.07   1.00
  There are issues in the school that have made it difficult to
    implement this intervention.                                                2.14   1.10
  I feel adequately prepared to implement the intervention.                     2.21   1.05
  I felt supported in implementing the intervention.                            2.36   1.22
  There are issues with CIS that have made it difficult to implement
    this intervention.                                                          2.43   1.22
  I felt like I had adequate resources to implement the intervention.           2.93   1.44
  I felt like I had sufficient time to implement the intervention.              3.29   1.14

Note. 1 = strongly agree; 2 = somewhat agree; 3 = neither agree nor disagree; 4 = somewhat disagree; 5 = strongly disagree.

(M = 1.57, SD = 0.65), were sufficiently supported by their supervisor in the implementation process (M = 2.07, SD = 1.14), and felt adequately prepared to implement the intervention (M = 2.21, SD = 1.05). In qualitative responses, practitioners shared the pros of implementing the intervention: C&C improves relationships with students, C&C improves attendance, C&C promotes more one-on-one time with students, and practitioners can see student progress. Practitioners also shared that the support that was most helpful for implementing C&C was training and support from their supervisor or program manager. Practitioners also shared challenges to the implementation of the EBP. Quantitative responses indicated that getting consent packets back from students/parents (M = 1.57, SD = 1.09), issues in the school (M = 2.14, SD = 1.10), and issues with the agency (M = 2.43, SD = 1.22) made it difficult to implement C&C. Qualitative responses provided more support for these findings. Practitioners conveyed that getting consent packets returned and students rarely being in school presented initial challenges. Student apathy, low parent engagement or concern, and targeting students for attendance, as opposed to academics or behavior, were challenges reported by practitioners for getting consent packets returned. Practitioners noted that incentives and constant reminders to students were helpful in getting consent packets returned. The main implementation challenges indicated by practitioners were time constraints and too much paperwork. Additional agency-based challenges to implementing C&C reported by practitioners included having to maintain the same caseload in addition to implementing a new intervention and having other responsibilities beyond C&C.

TABLE 2
Qualitative Themes Based on Responses to Open-Ended Questions: Percentage and Number of Respondents

Open-ended Questions and Responses                                          %        N

Intervention-related questions
What do you like about the intervention?
  Tracking student progress weekly was helpful.                            64.29%    9
  Having planned meetings with students on a weekly basis.                 28.57%    4
  Helps develop a better relationship between site coordinator
    and student.                                                           28.57%    4
  More one-on-one time with students.                                      28.57%    4
What do you dislike about the intervention?
  Time constraints make implementation difficult.                          57.14%    8
  Too much paperwork and too many reports.                                 35.71%    5
  Time consuming.                                                          28.57%    4

Implementation-related questions
Pros for implementing intervention?
  C&C improves relationships with students.                                50.00%    7
  C&C improves attendance.                                                 50.00%    7
  Practitioners can see student progress.                                  35.71%    5
  More one-on-one time with students.                                      28.57%    4
Cons for implementing intervention?
  Time constraints make implementation difficult.                          57.14%    8
  Too much paperwork.                                                      35.71%    5
What helped to get consent packets back?
  Constant reminders to students.                                          28.57%    4
  Incentives, such as snacks or candy.                                     28.57%    4
What was the biggest challenge in getting packets back?
  Students apathetic and won't return paperwork.                           42.86%    6
  Low parent engagement, interest, or concern.                             42.86%    6
  Targeting students with attendance issues is difficult.                  28.57%    4
What were challenges or barriers to implementation?
  Students won't return packets.                                           35.71%    5
  Students are rarely in school.                                           28.57%    4
What school- or agency-based issues have made implementation difficult?
  Implementing intervention in addition to typical caseload.               35.71%    5
  Other job duties interfere with intervention implementation.             28.57%    4
What support was helpful?
  Training.                                                                28.57%    4
  Support from supervisor or program manager.                              28.57%    4

DISCUSSION

The call for agencies to utilize EBPs has been evident in the literature for more than a decade, yet there are few studies that explore EBP implementation in real-world settings. The purpose of this study was to investigate practitioner attitudes about the implementation of an EBP, C&C. Through the present study the authors extend support for C&C and add to the expanding knowledge base of practitioner perspectives of EBP implementation. The findings summarized here contribute to the EBP implementation literature by presenting factors that may contribute to the successful implementation of C&C and exploring challenges to implementation in a school- and community-based setting. Results from the intervention implementation questionnaire provide beneficial perspectives on implementing C&C from school- and community-based practitioners. Both the quantitative and qualitative results from the intervention implementation questionnaire demonstrated overall


positive attitudes toward the C&C intervention and implementation. Findings revealed that practitioners liked the C&C intervention, felt confident in their abilities to implement C&C, and believed that the intervention was a good match for working with absentee students. Practitioners reported that the weekly accountability and weekly tracking of measurable indicators of the C&C intervention were beneficial for both practitioners and students and felt the C&C intervention improved relationships with students. Fixsen and colleagues (2005) investigated implementation frameworks for developing EBPs within organizations. They identified six core implementation components: practitioner selection, pre-service and in-service training, ongoing consultation and coaching, facilitative administrative support, practitioner and program evaluation, and systems interventions (Fixsen et al., 2005, p. 28). Winter and Szulanski (2001) assert that when the core components of an intervention are clearly defined, the organization can more readily implement the intervention successfully. Fixsen and colleagues (2005) extend this by noting that well-planned and well-executed implementation strategies can improve services at the practitioner, organizational, and national levels. While core implementation components are a vital piece of successful implementation, organizational components and external influences also play a key role in the multilevel dynamics of successful implementation. Fixsen and colleagues' (2005) research provides a framework for discussing the results of this study. The first two core implementation components, practitioner selection and pre-service and in-service training, are strategies that occur before implementation begins. Practitioner selection is defined as the identification of who is qualified to carry out the EBP and methods for recruiting and selecting practitioners (Fixsen et al., 2005).
While the instruments used did not elicit information about various practitioner selection strategies, CIS requires all staff to have a Bachelor's degree in a social service field and some experience working with youth. More information about practitioner selection could have been helpful in this study. Pre-service and in-service training—Fixsen and colleagues' (2005) second core implementation component—comprises introducing the intervention, practicing new skills, and receiving feedback in a safe environment. For this study, CIS required all site coordinators who were going to implement the C&C program to participate in a pre-service training. Results indicated that practitioners felt adequately prepared to implement the intervention (M = 2.21, SD = 1.05) and confident in their abilities to implement the program (M = 1.57, SD = 0.65). Qualitative responses supported this by noting that the C&C pre-service training was one of the most helpful vehicles of support. The next two core implementation components identified by Fixsen and colleagues (2005) are ongoing consultation and coaching and facilitative administrative support. Ongoing consultation and coaching is described as support from a supervisor throughout the implementation of the EBP. CIS provided ongoing consultation and coaching by having the direct supervisors of the site coordinators meet or talk with C&C staff on a weekly basis to encourage the staff in implementing the program and to address any concerns that arose. Facilitative administrative support is defined as leadership of the EBP implementation and keeping staff focused on desired clinical outcomes. CIS provided administrative support through the pre-service training facilitated by the program manager and weekly e-mails that kept staff focused on C&C goals and desired outcomes. Results indicated that practitioners believed they were sufficiently supported by their supervisor in the implementation process (M = 2.07, SD = 1.14).
Qualitative responses by practitioners identified encouragement from the supervisor and support from the program manager as two of the three most helpful means of support. These sources of support are also acknowledged in the literature by Proctor and colleagues (2007), who identified that agency leadership and additional support of practitioners were beneficial in practice implementation. Fixsen and colleagues' (2005) fifth core implementation component, practitioner and program evaluation, is described as measures of fidelity to assess practitioner performance and feedback from practitioners. The student progress check, described in the Method section, was used as a


fidelity measure to ensure that practitioners were completing the "check" data on a weekly basis and planning a "connect" for the upcoming week. Practitioners sent this form on a weekly basis to the CIS program manager, who reviewed and tracked this information. While this measure of fidelity did not measure the quality of the C&C services, it did provide weekly tracking of measurable data, which is a key element of the C&C model (Sinclair, Christenson, Lehr, & Anderson, 2003). Feedback was obtained from practitioners during the December 2011 booster training session and from the questionnaires used in the study. Practitioners also gave verbal and written feedback to their supervisors and the program manager throughout the implementation process. Results from the questionnaires indicated that practitioners were in favor of the C&C intervention and believed that it would help them be more effective with absentee students. Practitioners identified several benefits of the C&C intervention: increased opportunities for relationship building with students, observable progress due to weekly tracking of measurable indicators, and weekly accountability for the students. Practitioners also identified several challenges during the implementation process. Student/parent-related challenges included students being absent too often to return packets, student apathy in returning the paperwork, low parent engagement/interest in their child's participation in C&C, and getting students to come to the group. As a result of practitioner feedback during the first month of C&C implementation, the group component, which is not a part of the traditional C&C model, was removed from implementation service requirements. The sixth core implementation component, systems interventions, includes strategies to work with internal and external systems to support the work of the practitioners.
Organizational context and external influences, such as funding, workload requirements, school staff involvement, and management support, are key ingredients in this component. School- or agency-related challenges, including time constraints, paperwork, and having to rely on others (e.g., attendance clerk, teachers) to get necessary information for the intervention, were main challenges expressed in qualitative responses. Due to state agency funding requirements, practitioners were expected to complete all of the service and paperwork requirements of the EBP in addition to maintaining the same caseload as other practitioners who were not implementing the intervention. The challenges identified by practitioners are further supported in the literature that suggests that time constraints, large caseloads, and limitation in provider capacity are barriers to successful EBP implementation (Barwick et al., 2008; Davies, Spears, and Pugh, 2004; Proctor et al., 2007). Both quantitative and qualitative results from this study suggest that having additional EBP responsibilities without the reduction of caseload or other responsibilities caused noteworthy challenges for practitioners. Despite these challenges, practitioners were able to implement the C&C intervention and the intervention yielded positive student outcomes (Maynard et al., 2014). Overall, study findings suggest positive attitudes regarding the benefits of the C&C intervention, such as helpful components of the C&C intervention, increased relationship-building with students, practitioner confidence in their ability to implement the EBP, and agency support of intervention implementation. The intervention questionnaire also revealed challenges to implementation, which were associated primarily with agency factors, such as time constraints, paperwork requirements, and limitations in provider capacity. These findings provide more evidence to the body of research on EBP implementation. 
Limitations

Through this study the authors presented the perspectives of a small number of practitioners, whose views are important but not sufficient for a full understanding of EBP implementation. Prior research on qualitative interviews suggests that meta-themes can be recognized from as few as six respondents (Guest, Bunce, & Johnson, 2006). Results of this mixed-methods study are intended to inform further research rather than generalize to the broader population of school- and community-based practitioners. The self-report nature of the data and social desirability may have led participants to express more positive views of C&C and the implementation components. While these limitations exist, the study offers a unique perspective on practitioner attitudes toward implementing C&C and provides insight into its benefits and challenges in community-based settings.


CONCLUSIONS

With the calls for schools and community-based organizations to adopt and implement EBPs to improve child outcomes, and the ever-increasing array of interventions with some evidence of effectiveness, it is becoming more vital to examine and understand the factors that contribute to successful EBP implementation within school and community-based contexts. Understanding the factors that facilitate and challenge successful EBP implementation can help agencies, program directors, and practitioners consider implementation factors when selecting interventions and build supports into the implementation process. In this study the authors examined practitioner perspectives on implementing an EBP, C&C, in a community-based dropout prevention program delivered in a school setting. The findings suggest that C&C can offer perceived benefits to clients and staff through increased relationship building, observable client progress, and the use of an intervention that is a good match for targeted students. Similar to Proctor et al. (2007), the authors also found that EBP implementation depends not only on practitioner perspectives but also on strategic organizational support. The challenges identified in this study indicate that time constraints, paperwork, and caseload size should be considered in the planning stages of EBP implementation.

Future research in the field of EBP implementation can extend knowledge of the factors that contribute to successful implementation of C&C. Additional research with a larger sample of practitioners could provide data that are more generalizable to school practitioners and mental health providers. Researchers may also employ qualitative and quantitative measures at different stages of C&C implementation, such as pre-implementation, during implementation, and post-implementation, to track changing perspectives of the intervention and of EBP implementation. Assessing the perspectives of agency directors and clients could also yield richer findings about the benefits and challenges of the C&C intervention and implementation strategies. Further examination of C&C implementation can help school and community-based organizations and practitioners explore how this evidence-based intervention can be implemented effectively in real-world settings.

ACKNOWLEDGMENTS

The authors would like to thank the local Communities In Schools (CIS) affiliate for its commitment to building and using evidence to improve practice, the CIS site coordinators for their hard work throughout the implementation and data collection process, the participating schools for their support and commitment to their students, and the students who participated in this study.

FUNDING

The authors are grateful for support from the Meadows Center for Preventing Educational Risk at the University of Texas at Austin and the Institute of Education Sciences (grant #R324B080008).


The content is solely the responsibility of the authors and does not necessarily represent the official views of the supporting entities.


REFERENCES

Alvarez, M. E., & Anderson-Ketchmark, C. (2010). Review of an evidence-based school social work intervention: Check & Connect. Children & Schools, 32, 125–127.

Barwick, M. A., Boydell, K. M., Stasiulis, E., Ferguson, H., Blase, K., & Fixsen, D. (2008). Research utilization among children's mental health providers. Implementation Science, 3. doi:10.1186/1748-5908-3-19

Brodowski, M. L., Flanzer, S., Nolan, C., Shafer, J., & Kaye, E. (2007). Children's bureau discretionary grants: Knowledge development through our research and demonstration projects. Journal of Evidence-Based Social Work, 4, 3–20.

Christenson, S. L., Thurlow, M. L., Sinclair, M. E., Lehr, C. A., Kaibel, C. M., Reschly, A. L., . . . Pohl, A. (2008). Check & Connect: A comprehensive student engagement intervention manual. Minneapolis, MN: University of Minnesota, Institute on Community Integration.

Christenson, S. L., Sinclair, M. F., Lehr, C. A., & Hurley, C. M. (2000). Promoting successful school completion. In D. Minke & G. Bear (Eds.), Preventing school problems-promoting school success: Strategies and programs that work. Bethesda, MD: National Association of School Psychologists.

Christenson, S. L., Sinclair, M. F., Thurlow, M. L., & Evelo, D. (1999). Promoting student engagement with school using the Check & Connect model. Australian Journal of Guidance and Counseling, 9, 169–183.

Davies, M., Spears, W., & Pugh, J. (2004). What VA providers really think about clinical practice guidelines. Federal Practitioner, 21, 15–30.

Doll, B., & Hess, R. S. (2001). Through a new lens: Contemporary psychological perspectives on school completion and dropping out of school. School Psychology Quarterly, 16, 351–356.

Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41, 327–350. doi:10.1007/s10464-008-9165-0

Estabrooks, C. A., Winther, C., & Derksen, L. (2004). Mapping the field: A bibliometric analysis of the research utilization literature in nursing. Nursing Research, 53, 293–303.

Fixsen, D. L., Blase, K. A., Naoom, S. F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19, 531–540.

Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231).

Green, J., Liem, G. D., Martin, A. J., Colmar, S., Marsh, H. W., & McInerney, D. (2012). Academic motivation, self-concept, engagement, and performance in high school: Key processes from a longitudinal perspective. Journal of Adolescence, 35, 1111–1122. doi:10.1016/j.adolescence.2012.02.016

Guest, G., Bunce, A., & Johnson, L. (2006). How many interviews are enough? An experiment with data saturation and variability. Field Methods, 18, 59–82.

Gustle, L., Hansson, K., Sundell, K., & Andrée-Löfholm, C. (2008). Implementation of evidence-based models in social work practice: Practitioners' perspectives on an MST trial in Sweden. Journal of Child & Adolescent Substance Abuse, 17, 111–125.

Henry, K., Knight, K., & Thornberry, T. (2012). School disengagement as a predictor of dropout, delinquency, and problem substance use during adolescence and early adulthood. Journal of Youth & Adolescence, 41, 156–166. doi:10.1007/s10964-011-9665-3

IBM Corp. (2011). IBM SPSS Statistics for Windows, Version 20.0. Armonk, NY: IBM Corp.

Kaye, S., & Osteen, P. J. (2011). Developing and validating measures for child welfare agencies to self-monitor fidelity to a child safety intervention. Children and Youth Services Review, 33, 2146–2151. doi:10.1016/j.childyouth.2011.06.020

Kelly, M. S., Raines, J. C., Stone, S., & Frey, A. (2010). School social work: An evidence-informed framework for practice. New York, NY: Oxford University Press.

Lehr, C. A., Johnson, D. R., Bremer, C. D., Cosio, A., & Thompson, M. (2004). Essential tools: Increasing rates of school completion: Moving from policy and research to practice. A manual for policymakers, administrators, and educators. Minneapolis, MN: National Center on Secondary Education and Transition.

Maynard, B. R., Kjellstrand, E. K., & Thompson, A. (2014). Effects of Check and Connect on attendance, behavior, and academics: A randomized effectiveness trial. Research on Social Work Practice, 24(3), 296–309. doi:10.1177/1049731513497804

McHugh, R. K., & Barlow, D. H. (2010). The dissemination and implementation of evidence-based psychological treatments: A review of current efforts. American Psychologist, 65, 73–84.

Mitchell, P. F. (2011). Evidence-based practice in real-world services for young people with complex needs: New opportunities suggested by recent implementation science. Children and Youth Services Review, 33, 207–216.

Moncher, F. J., & Prinz, R. J. (1991). Treatment fidelity in outcome studies. Clinical Psychology Review, 11, 247–266. doi:10.1016/0272-7358(91)90103-2

Noell, G. H. (2010). Empirical and pragmatic issues in assessing and supporting intervention implementation in school. In G. G. Peacock, R. A. Ervin, E. J. Daly, & K. W. Merrell (Eds.), Practical handbook in school psychology (pp. 513–530). New York, NY: Guilford Publications.

Pignotti, M., & Thyer, B. A. (2009). Use of novel unsupported and empirically supported therapies by licensed clinical social workers: An exploratory study. Social Work, 33, 5–17.

Powell, B. J., Hausmann-Stabile, C., & McMillen, J. (2013). Mental health clinicians' experiences of implementing evidence-based treatments. Journal of Evidence-Based Social Work, 10, 396–409. doi:10.1080/15433714.2012.664062

Powell, B. J., Proctor, E. K., & Glass, J. E. (2014). A systematic review of strategies for implementing empirically supported mental health interventions. Research on Social Work Practice, 24(2), 192–212.

Proctor, E. K., Knudsen, K. J., Fedoravicius, N., Hovmand, P., Rosen, A., & Perron, B. (2007). Implementation of evidence-based practice in community behavioral health: Agency director perspectives. Administration and Policy in Mental Health and Mental Health Services Research, 34, 479–488. doi:10.1007/s10488-007-0129-8

Proctor, E. K., Landsverk, J., Aarons, G. A., Chambers, D. A., Glisson, C., & Mittman, B. S. (2009). Implementation research in mental health services: An emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health and Mental Health Services Research, 36, 24–34. doi:10.1007/s10488-008-0197-4

Resnick, B., Bellg, A. J., Borrelli, B., DeFrancesco, C., Breger, R., Hecht, J., . . . Czajkowski, S. (2005). Examples of implementation and evaluation of treatment fidelity in the BCC studies: Where we are and where we need to go. Annals of Behavioral Medicine, 29, 46–54. doi:10.1207/s15324796abm2902s_8

Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York, NY: Free Press.

Rumberger, R. W., & Rotermund, S. (2012). The relationship between engagement and high school dropout. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 491–513). New York, NY: Springer Science + Business Media. doi:10.1007/978-1-4614-2018-7_24

Sanetti, L. M. H., & Kratochwill, T. R. (2009). Toward developing a science of treatment integrity: Introduction to the special series. School Psychology Review, 38, 445–459.

Sinclair, M. F., Christenson, S. L., Evelo, D. L., & Hurley, C. M. (1998). Dropout prevention for youth with disabilities: Efficacy of a sustained school engagement procedure. Exceptional Children, 65, 7–22.

Sinclair, M. F., Christenson, S. L., Lehr, C. A., & Anderson, A. (2003). Facilitating student engagement: Lessons learned from Check & Connect longitudinal studies. California School Psychologist, 8, 29–41.

Stout, K. E., & Christenson, S. L. (2009). Staying on track for high school graduation: Promoting student engagement. Prevention Researcher, 16, 17–20.

(in press). A systematic review of implementation strategies in mental health service settings. Research on Social Work Practice.

Vaughn, M. G., Wexler, J., Beaver, K. M., Perron, B. E., Roberts, G., & Fu, Q. (2011). Psychiatric correlates of behavioral indicators of school disengagement in the United States. Psychiatric Quarterly, 82, 191–206. doi:10.1007/s11126-010-9160-0

Walrath, C. M., Sheehan, A. K., Holden, E. W., Hernandez, M., & Blau, G. M. (2006). Evidence-based treatments in the field: A brief report on provider knowledge, implementation, and practice. Journal of Behavioral Health Services & Research, 33, 244–253.

Winter, S. G., & Szulanski, G. (2001). Replication as strategy. Organization Science, 12, 730–743.
