Journal of Clinical Child & Adolescent Psychology, 43(2), 169–178, 2014. Copyright © Taylor & Francis Group, LLC. ISSN: 1537-4416 print/1537-4424 online. DOI: 10.1080/15374416.2013.848772

Moving Science Into State Child and Adolescent Mental Health Systems: Illinois’ Evidence-Informed Practice Initiative

Amy C. Starin and Marc S. Atkins Institute for Juvenile Research, University of Illinois Chicago

Kathryn C. Wehrmann School of Social Work, Illinois State University

Tara Mehta Institute for Juvenile Research, University of Illinois Chicago

Matthew S. Hesson-McInnis Department of Psychology, Illinois State University

A. Marinez-Lora Institute for Juvenile Research, University of Illinois Chicago

Renee Mehlinger Illinois Division of Mental Health

In 2005, the Illinois State Mental Health Authority embarked on an initiative to close the gap between research and practice in the children’s mental health system. A stakeholder advisory council developed a plan to advance evidence-informed practice through policy and program initiatives. A multilevel approach was developed to achieve this objective, which included policy change, stakeholder education, and clinician training. This article focuses on the evidence-informed training process designed following a review of implementation research. The training involved in-person didactic sessions and twice-monthly telephone supervision across 6 cohorts of community-based clinicians, each receiving 12 months of training. Training content initially included cognitive behavioral therapy and behavioral parent training and was adapted over the years to a practice model based on common element concepts. Evaluation based on provider and parent report indicated that children treated by training clinicians generally showed superior outcomes after 90 days of care relative to a treatment-as-usual comparison group (Cohorts 1 to 4) and to the statewide child population as a whole (Cohorts 5 and 6). The results indicated primarily moderate to strong effects for the evidence-based training groups. Moving a large public statewide child mental health system toward more effective services is a complex and lengthy process. These results indicate that training community mental health providers in Illinois in evidence-informed practice was moderately successful in positively impacting child-level functional outcomes. These findings also influenced state policy in committing resources to continuing the initiative, even in difficult economic times.

We gratefully acknowledge the contribution of members of the Evidence Based Advisory Council, social service agencies, and staff participating in the project, and the generosity of the children and families served. Correspondence should be addressed to Amy Starin, Institute for Juvenile Research, University of Illinois Chicago, 1747 West Roosevelt Road, MC 747, Chicago, IL 60608. E-mail: [email protected]

Introducing innovation in clinical practice within a large, diverse state mental health system is a complicated and complex process. An emerging science of dissemination and implementation in community settings suggests that bridging the research-to-practice gap requires adaptation and change on multiple levels: individual (e.g., provider attitude; Aarons, 2004), organizational (e.g., social context; Glisson & James, 2002), and financial (e.g., reimbursement; Raghavan, Bright, & Shadoin, 2008). This has led to a myriad of highly individualized state and federal initiatives. For example, as reviewed by Aarons, Hurlburt, and Horwitz (2011), California implemented Proposition 63 to fund the implementation and sustainment of evidence-based practices; New York State created a technical assistance center to support ongoing training and consultation; and Ohio created a center to support best practices, including evidence-based practices. In 2005, the Illinois Division of Mental Health (DMH), the State Mental Health Authority, made the commitment to embark on an effort to narrow the research-to-practice gap and to bring research-supported care into the community-based children’s mental health system in Illinois. The DMH leadership made this decision, and the project was assigned to the lead author. In Illinois, the state children’s mental health system provides services for approximately 40,000 youth annually, ranging from birth through age 17. These youths are diagnosed with mental health conditions ranging from adjustment disorders to complex psychoses. The state is also remarkable for its diversity with respect to race, ethnicity, and geography; Illinois houses several urban areas that are ringed with suburbs, with large rural tracts between them. The state does not provide services directly; rather, it contracts with approximately 125 community mental health centers to serve children and families throughout the state.
Making changes in a system of this size involves a multicomponent effort to adequately train providers, amend state policy to finance and support new programs, attend to relevant research, engage colleagues in higher education, and promote quality assurance. This article touches briefly on each of these areas and then focuses on the development and implementation of the provider-training component of the Evidence Based Training Initiative in Illinois (EBTI).

THE EVIDENCE-BASED PRACTICE ADVISORY COUNCIL

The first step in the Initiative was to develop an advisory council of committed stakeholders: the Evidence-Based Practice Advisory Council. Recognizing the vast task ahead in moving science into a statewide system, it was clear this council would need to include representatives from all levels of the child mental health system as well as external stakeholders and consumers. Therefore, it included four consumer parents; five clinical or executive directors from community mental health agencies (including the Council co-chair); representatives from advocacy groups, including the community mental health trade association, the Illinois Children’s Mental Health Partnership, and the Illinois Federation of Families; sister state agency representatives from the Division of Alcohol and Substance Abuse and child welfare; faculty from five Illinois universities; and three DMH staff, including the first author as co-chair. Because this was a statewide initiative, advisory council members came from all areas of Illinois, requiring council meetings to be held entirely via conference call; over the 7-year history of the Council, there has never been an in-person meeting.

The initial action step the Council took was to briefly review the research literature on the concept of evidence-based practice in community settings (e.g., Gellis & Reid, 2004; Gibbs & Gambrill, 2002; Shlonsky & Gibbs, 2004). This literature discussed the history of empirically based practice, largely in the social work field. This review prompted an initial set of discussions debating which approach to adopt in moving science into the system. The choice was between a focus on specific manualized practices and a focus on the evidence-based process (i.e., using data to inform clinical decisions). At this time the Council implemented a survey of direct providers. The intent of the survey was to elicit stakeholder input from providers regarding their awareness, interest, and implementation of evidence-based practices.
Provider Survey

This 32-item paper survey was mailed to all 126 participating community mental health agencies providing child and adolescent services, requesting distribution to their child-serving clinicians. Surveys were returned anonymously by mail to DMH. Of the approximately 1,000 child mental health providers in the state, 303 clinicians responded with completed surveys. Surveys were returned in roughly equal proportions from providers in urban, suburban, and rural areas of Illinois. Most (97%) reported they were interested in learning about evidence-based practice but felt that their agencies were not well equipped to implement evidence-based practices or to translate research into practice. The providers reported that they were working with children with both internalizing and externalizing symptoms in relatively equal proportions. They specified that they were interested in learning evidence-based approaches to address these problems. The providers reported minimal prior training on manualized evidence-based practices, and the training they had received had typically come in formats, such as one-time seminars, that are likely insufficient to change practice behavior (Davis et al., 1999).

Evidence-Based Practice: Noun or Verb?

Although the Council was aware that this anonymous survey did not reflect the entire population of providers, it did provide important stakeholder input that informed the development of a comprehensive strategy to move the concepts of science into the child system. There was rich discussion of the diversity of the Illinois population in terms of geography, race, and ethnicity and the lack of consideration of these issues in most manualized evidence-based practices at that point in time. In addition, DMH was responsible for youths from birth through age 17 presenting with the full array of psychiatric diagnoses. At that time, there were more than 50 manualized interventions designated as “evidence based” for children, youth, and families (SAMHSA, 2005). Most of these had been studied with narrowly defined populations of youth, and very few had been tested in community-based settings with the complexity of presentation typical at these agencies. There were also broad areas of both populations and problems for which there was no appropriate manualized intervention. Given the diversity of providers in the system and the limitations of the evidence base for children at that time, the Council was faced with selecting an overarching strategy to move the system forward. It could choose one or a few of the manualized interventions on which to train providers, thus treating evidence-based practice as a noun.
Alternatively, the Council could conceptualize evidence-based practice as a process involving the basic tenets of science, as discussed by Gibbs and Gambrill (2002) and Shlonsky and Gibbs (2004), and thus consider evidence-based practice as a verb. After considerable discussion, the Council decided to promote the process of evidence-based practice: performing an individualized assessment with the child and family, identifying a client-specific question, looking to the research literature for guidance on the most scientifically endorsed approach to solving the client’s problems, discussing the findings with the family, making a decision on which approach to take, implementing the intervention, and evaluating relevant outcomes.

Clarifying the concept. After making the decision to pursue the “process” of evidence-based practice, it became clear that there was a great deal of confusion in the Illinois service system regarding what the phrase “evidence-based practice” meant. The Council decided to adopt the phrase “evidence-informed practice” as a means of conveying the meaning of the process with greater clarity to the system. Being highly cognizant of the impact the definition of evidence-informed practice would have on implementation strategy, the Council spent four months deliberating over exactly how evidence-informed practice would be defined in Illinois. The final consensus definition was as follows:

Evidence-Informed Practice is a collaborative effort by children, families, and practitioners to identify and implement practices that are appropriate to the needs of the child and family, reflective of available research, and measured to ensure the selected practices lead to improved meaningful outcomes.

Strategy for change. Following the adoption of this definition, the Council spent several months designing a multipronged strategic plan focused on moving the system toward evidence-informed practice. It was clear that a plan to make significant change in the entire child-serving system would need to be broad, ambitious, and sustained over a significant period. The five strategic steps identified to achieve these changes were as follows:

1. Educate DMH and community mental health agency leadership on evidence-informed practice through a series of seminars.
2. Train providers on evidence-based clinical skill sets.
3. Create partnerships with university programs that train the upcoming workforce to support evidence-informed practice in the future, leading to sustainability.
4. Provide information on evidence-informed practice to parent consumers in order to drive system change from both above and below.
5. Address the multilevel public policy implications necessary to provide ongoing support for evidence-informed practice in the system.

The remainder of this article focuses on the second strategy, training providers on evidence-based skill sets.

METHOD

Sample and Setting

The child sample was designed to reflect the usual case assignment process utilized in mental health agencies. Youths ages 5 through 17 were assigned to training and treatment-as-usual (TAU) comparison clinicians in consecutive order by presentation to agencies. The inclusion
criterion was an Ohio Scales Worker version score of 20 or above, putting the client in the “clinical” range of the measure. As noted in Table 1, the three groups of cohorts comprised similar proportions of male and female children and similar ages. Specifically, Cohort 1 (n = 89) had 56.2% male participants with an average age of 11.84. Cohorts 2 to 4 (n = 227) had 57.3% male participants with an average age of 11.04. Cohorts 5 and 6 (n = 56) had 57.3% male participants and an average age of 9.50. These sex and age demographics were not available for the statewide comparison group for Cohorts 5 and 6. Youths served by providers in the training came relatively evenly from urban, suburban, and rural areas of Illinois. The families generally utilized public funding to pay for services in these agencies; thus, although specific socioeconomic data were not available, the assumption is that most of the children in the sample were living in some level of poverty.

DMH committed state funding to secure trainers and consultants, project evaluation, and the staff time for trainees from local community mental health agencies. Participating agencies received $15,000 to $18,000 to support their 12-month involvement in the project. The initiative was able to support the participation of 10 community mental health agencies in the first year and approximately 10 additional agencies in each subsequent year. All statewide agencies (N = 126) providing services to children under contract with DMH were invited to respond to a competitive request for proposals to participate in the training. Agencies were selected based on their commitment to provide two master’s-level staff to participate in the training, one of whom was in a supervisory capacity; evidence of agency leadership supportive of the training; and their agreement to participate in the evaluation. The agency directors also selected the TAU clinicians used for comparison in Cohorts 1 to 4.
They were required to have master’s-level training and to have approximately the same level of experience as the training clinicians at the same agency. The TAU clinicians were required to agree to submit outcomes data for the project evaluation. Specifically, Cohorts 1 and 2 included 10 agencies and 20 training clinicians, Cohort 3 included nine agencies and 18 training clinicians, Cohort 4 included 12 agencies and 24 clinicians, Cohort 5 included 10 agencies and 20 clinicians, and Cohort 6 included five agencies and 10 clinicians.

TABLE 1
Demographic Data for Clinical Cases

Cohort      n     Male n (%)     Female n (%)    Age M (SD)
1          89     50 (56.2%)     39 (43.8%)      11.84 (3.78)
2–4       227    130 (57.3%)     97 (42.7%)      11.04 (3.50)
5–6        56     32 (57.3%)     24 (42.7%)       9.50 (3.30)

Table 2 depicts the breakout of diagnostic categories of the youth included in the sample. The diagnostic categories reflected in the sample portray the broad range of clinical presentations that were expected in this diverse population. Across cohorts, the primary diagnoses were disruptive behavior disorder, mood disorder, adjustment disorder, and anxiety disorder.

TABLE 2
Diagnostic Categories for Youth Served Across Cohorts

                                       Cohort 1 (a)      Cohort 2–4 (b)     Cohort 5–6 (c)
Diagnostic Category         Total n     n      %           n       %          n      %
Mood                            155    24    27.0%        120    52.9%       11    19.6%
Disruptive Behavior (d)         136    34    38.2%         72    31.7%       30    53.6%
Adjustment                       70    16    18.0%         43    18.9%       11    19.6%
Anxiety                          59    12    13.5%         40    17.6%        7    12.5%
Reactive Attachment               3     1     1.1%          0     0.0%        2     3.6%
Developmental Disabilities        2     0     0.0%          2     0.8%        0     0.0%
Substance                         1     0     0.0%          0     0.0%        1     1.8%
Psychotic Disorders               0     0     0.0%          0     0.0%        0     0.0%

Note: (a) n = 89. (b) n = 227. (c) n = 56. (d) Attention deficit hyperactivity disorder and externalizing disorders (oppositional defiant disorder and conduct disorder). Percentages for each cohort may exceed 100% due to diagnostic comorbidity.

Development of Training Plan

As Chorpita and Daleiden (2009) noted, it is impractical to train providers on the large number of manualized interventions necessary to meet the needs of all children receiving mental health services in Illinois, and the Council was faced with this reality as it considered how to respond to the needs noted in the provider survey. Therefore, rather than selecting a particular manualized intervention, the Council sought trainings that were organized around the broader theoretical and practice skills of cognitive behavior therapy (CBT), to respond to concerns related to internalizing disorders, and behavioral parent training (BPT), to respond to concerns related to externalizing disorders.

Training pilot development. The plan to present the CBT and BPT trainings to clinicians was based on the model of implementation proposed by Fixsen, Naoom, Blase, Friedman, and Wallace (2005). Fixsen et al. described a plan to assist clinicians to utilize training through an extended process involving cultural preparation, information, and ongoing support for practice change. Each of these process steps was incorporated into the training model. The initial cohort (Cohort 1)
was considered a pilot intervention to determine whether the training process could be implemented in community mental health agencies and whether there would be evidence of improvement in outcomes for children seen in the course of community practice. The training initiative was planned to last 12 full months and consisted of 8 days of in-person didactic training and twice-monthly, hour-long telephone case consultations. In addition, trainees were asked to complete readings on their respective training model selected by the lead trainer. Two university-based, nationally recognized content experts, both located in Illinois, provided the didactic training in CBT and BPT. Both remained involved with the EBTI pilot project for 5 years.

Consultation and Ongoing Technical Support

In addition to the in-person didactic training on BPT and CBT, agency staff received case consultation and technical support via conference call for 60 min twice a month for 12 months. Each agency team was assigned a clinical consultant from the respective university with a doctoral degree in psychology or social work and several years’ experience with the respective theoretical model. Each line staff member was required to carry eight cases being treated using the model in which they were being trained, and the supervisor was required to carry four cases. Agency staff were to send videotapes of their sessions monthly, which were reviewed by the consultant prior to the call and discussed during the ensuing call.

Clinical consultation models. BPT consultation was informed by the supervision model developed for Multisystemic Therapy (MST; for the supervision protocol, see Henggeler & Schoenwald, 1998), an evidence-based intensive family- and community-focused intervention for youths with serious behavior problems (Henggeler, 2011).
This supervision model was used in the current initiative because of its structured, problem-solving approach to supporting clinicians’ adherence to evidence-based practices when working with youths and families experiencing multiple clinical problems and risk factors. Clinical consultation focused on technical assistance in the implementation of the EBPs, prioritizing the specific child and family problems that needed to be addressed in a given case, identifying barriers and constraints to implementation and treatment engagement, and developing strategies to address those barriers. Consultation also focused on assisting clinicians in transitioning, when necessary, from a less structured, eclectic modality of therapy to more structured intervention approaches (Connor-Smith & Weisz, 2003). The underlying principle for consultation, initially described for MST (Henggeler, Schoenwald, Borduin, Rowland, & Cunningham, 2009), conceptualizes clinicians as committed professionals developing a new set of skills
and families as partners who are in the process of developing or incorporating a new set of skills into their daily interactions with their children. Five principles from MST were incorporated into the clinical consultation component for BPT: (a) taking a strength-based approach to working with families, (b) keeping treatment present focused and addressed to clearly defined problems for which progress can be evaluated with EBPs, (c) emphasizing developmentally appropriate strategies and expectations, (d) emphasizing the need for ongoing commitment of families to engage in treatment and to use the strategies, and (e) focusing on helping families generalize the use of strategies across different settings.

The clinical consultation model utilized in the CBT training followed that of modular CBT (Curry & Reinecke, 2006). Similar to the model used for BPT, the CBT consultation focused on promoting skill development, case conceptualization, problem-solving clinical barriers through reliance on the model’s principles of change, and modifying interventions to the unique needs of the client. This method involves development of a diverse array of CBT skills and focuses less on following a rigid, sequenced protocol than does a strictly manualized approach. The CBT and BPT trainings were adapted and continued for Cohorts 1 to 4.

Adoption of a practice model based on common element concepts. Feedback on the first 4 years of the training project was generally very positive. However, there was ultimately recognition that these independent trainings in CBT and BPT did not align with the initial project survey, which indicated a need to address the full array of problems that appeared in clinicians’ caseloads. The clinicians trained were essentially specialists in either internalizing (CBT) or externalizing (BPT) disorders. This feedback coincided with the publication of the common elements approach (Chorpita, Daleiden, & Weisz, 2005).
The advent of the common elements approach and the development of resources through PracticeWise, LLC (http://www.practicewise.com), offered the prospect of making significant changes to the structure of the EBTI training model and evaluation design. It also offered the opportunity to train clinicians to be evidence-informed “generalists,” as the common elements approach covers interventions that are effective for both internalizing and externalizing problems. Therefore, the Council decided to modify the training model to train all participants in Cohorts 5 and 6 on a practice model based on common element concepts. The common elements concept is based on a distillation and matching model that describes how evidence-based treatment operations can be conceptualized at a lower level of analysis than the manuals in which they appear (Chorpita & Daleiden, 2009). The model operates to
identify the specific techniques and procedures that make up evidence-based protocols for specific problem areas (Chorpita et al., 2005). Thus, rather than focusing on internalizing- or externalizing-based clinical skills, all participants were trained in the 30 common elements skills. Training participants were provided with two of the PracticeWise resources: the PracticeWise Evidence-Based Services database, an Internet database that matches populations and problems with specific evidence-based interventions, and the Practitioner Guides, which offer detailed techniques for each of the common element interventions. The full array of common element resources was not utilized. The basic structure of the training in terms of didactic time and twice-monthly consultation did not change for Cohorts 5 and 6. However, the expert trainers on CBT and BPT were discontinued and replaced by the Ph.D.-level consultants who had previously been responsible for only the phone consultation. This provided clearer alignment between the didactic material and the ongoing twice-monthly phone consultation.

Evaluation of Initiative

The Council recognized that evaluating the outcomes of the training program was a core principle of evidence-informed practice. To accomplish this, a mixed-method evaluation was designed that provided qualitative and quantitative data and that drove ongoing adaptations to the training plan. The evaluation design focused on understanding the impact of training on functional outcomes for youth receiving services and on providing feedback to DMH and the Advisory Council on clinicians’ experiences and the views of agency directors and supervisors, in order to determine the feasibility of the initiative to inform community practice. Independent university-based evaluators, who were not involved in the training, conducted the evaluation.¹ Parents and providers completed outcome measures; thus, both clinical and consumer perspectives on the child’s functioning were gathered.
The intake and 90-day scores from the outcome measures were selected for inclusion in this project. In the first cohort, the Child and Adolescent Functioning Assessment Scale (CAFAS) was used as the outcome measure. The CAFAS was discontinued after the first cohort in favor of the Ohio Scales–Worker version (Ogles, Melendez, Davis, & Lunnen, 2000) and the Columbia Impairment Scale–Parent version (Bird et al., 1993), which were utilized for the remaining cohorts. For Cohorts 2 through 4, the Ohio and Columbia Impairment scale scores were gathered for both the training and the TAU comparison clinicians in each agency.

Just prior to the beginning of Cohort 5 of the initiative, DMH adopted a statewide policy requiring all clinicians to participate in outcomes measurement in order to be able to bill for their services. This was a significant state policy adaptation driven by the advisory council. Specifically, all 126 child-serving agencies in Illinois were required to utilize a personalized web-based outcomes system (DatStat; http://www.datstat.com). This system utilized the same functional outcome instruments, the Ohio and Columbia Impairment scales, that were used in the EBTI training pilots. This allowed the EBTI initiative to eliminate the within-agency TAU comparison group from the evaluation model for Cohorts 5 and 6 and to compare the training data with the larger state data from the DatStat system. This avoided the possibility of drift within agencies, which was especially important because, beginning in Cohort 3, the Council required agencies competing to participate in the EBTI training to submit a plan to disseminate the EBTI training to additional clinicians within their agency. It was noted that beginning in Cohort 3 the outcome data became slightly less differentiated between the training and the TAU groups. Agencies reported qualitatively that dissemination was occurring within the agency.

Dependent Measures

CAFAS. The CAFAS is a clinician-completed multidimensional measure of functional impairment in children ages 5 to 19 years across eight domains of functioning (Hodges & Wong, 1996). The child domains include Role Performance, Thinking, Behavior Toward Others/Self, Moods/Emotions, and Substance Abuse. Two additional domains evaluate availability of material resources and social support. Internal consistency estimates ranged from α = .63 to .68 (Hodges & Wong, 1996).

Columbia Impairment Scale. The Columbia Impairment Scale is a 13-item functional assessment instrument covering specific domains of functioning in children ages 5 to 18 (Bird et al., 1993). In the EBTI initiative the Columbia Impairment Scale was completed by the youth’s parent or caregiver. Internal consistency has been shown to be moderately high (r = .82–.89).

Ohio Scales. The Ohio Scales is a broad measure of functioning for youths ages 5 to 18. It includes one subscale addressing problem areas (α = .86) and another addressing positive areas of functioning (α = .91; Ogles et al., 2000). Each subscale contains 20 questions, and the worker version is completed by a clinician. The Problem subscale can be divided into internalizing and externalizing scores.

¹In addition to the assessment of functional scales associated with client outcomes, evaluators also collected qualitative data related to agency organizational culture and preparation to adopt aspects of EBP, as well as perceived benefits, barriers, and plans for integration with ongoing services at their agencies. Presentation of these analyses is beyond the scope of this article; they are described in a separate manuscript.

RESULTS

Data Analysis

Table 3 provides the average CAFAS or Ohio Scale scores at intake and at 90 days, along with the paired-samples t-test statistics and Cohen’s d effect sizes. For Cohorts 2 to 6, the Ohio problem scores were split into internalizing and externalizing symptoms as part of the analysis to identify differences between the impact of the CBT model and the BPT model for each cohort. Table 3 also provides fiscal year 2010 statewide averages provided by DMH for both intake and 90 days, as well as the corresponding paired-samples t tests and effect sizes; state fiscal year 2010 was the only year for which these data were available. These statewide data were used as comparison data for Cohorts 5 and 6. As evident in Table 3, statistically significant changes associated with the EBTI training between the intake and 90-day assessments were apparent in nearly all conditions, with the exception of the CBT group as measured by the CIS-P in Cohorts 2 to 4. On effect size estimates of pre–post changes, in Cohort 1, the training
groups showed effect sizes substantially larger than that of the TAU comparison group (Cohen’s d = 1.17, 1.16, and .83 for the CBT, BPT, and TAU comparison groups, respectively). In Cohorts 2 to 4, the BPT group had a strong effect on internalizing problems (d = 1.04), as compared to the effect for the TAU comparison group (d = .80), and similar scores on externalizing problems (d = .89 and .93 for the BPT and TAU comparison groups, respectively). However, the CBT group had moderate and noticeably smaller effects on internalizing scores relative to the BPT and TAU comparison groups (d = .72), and smaller effects on externalizing scores (d = .61) relative to both the BPT and TAU comparison groups. Scores on youth functioning revealed a similar pattern: the CBT group evidenced a medium effect (d = .54) relative to the large effects for the BPT and TAU comparison groups (d = .82 and .77, respectively). On parent report, the BPT group evidenced a large effect that exceeded those of the CBT and TAU comparison groups (d = 1.06, .50, and .64, respectively). In Cohorts 5 and 6, the training changed from the CBT and BPT models to a practice model based on common element concepts. In addition, the TAU comparison groups were dropped from the analysis as statewide outcome data became available. As indicated in Table 3, the modified common elements approach outperformed the

TABLE 3
EBTI Comparison With Statewide Averages of Percentage Improvement From Intake to 90 Days

Cohort | Measure | Group | n | Intake M (SD) | 90 Days M (SD) | t | df | Cohen's d | 95% CI
1 | CAFAS | CBT | 32 | 7.65 (4.45) | 4.57 (3.47) | 5.34 | 31 | 1.17 | 0.40, 5.03
1 | CAFAS | BPT | 33 | 9.96 (5.12) | 6.62 (4.68) | 5.36 | 32 | 1.16 | 0.39, 4.78
1 | CAFAS | TAU | 24 | 9.38 (6.01) | 7.81 (5.99) | 1.99 | 23 | 0.83 | 0.08, 4.72
2–4 | OYP-I | CBT | 74 | 15.00 (7.76) | 10.90 (6.60) | 4.79 | 73 | 0.72 | 0.21, 1.73
2–4 | OYP-I | BPT | 40 | 10.80 (5.31) | 6.58 (4.20) | 5.28 | 39 | 1.04 | 0.35, 3.52
2–4 | OYP-I | TAU | 53 | 13.83 (7.21) | 10.31 (6.12) | 4.20 | 52 | 0.80 | 0.20, 2.24
2–4 | OYP-E | CBT | 74 | 17.66 (8.31) | 14.65 (8.35) | 3.43 | 73 | 0.61 | 0.10, 1.51
2–4 | OYP-E | BPT | 40 | 23.08 (14.18) | 16.03 (7.26) | 3.82 | 39 | 0.89 | 0.19, 2.95
2–4 | OYP-E | TAU | 53 | 18.57 (7.06) | 13.30 (5.69) | 5.67 | 52 | 0.93 | 0.33, 2.59
2–4 | OYP-F | CBT | 72 | 41.95 (11.05) | 46.01 (14.05) | 2.58 | 71 | 0.54 | 0.01, 1.40
2–4 | OYP-F | BPT | 40 | 39.88 (11.72) | 44.87 (10.06) | 3.27 | 39 | 0.82 | 0.12, 2.73
2–4 | OYP-F | TAU | 52 | 42.66 (12.82) | 48.02 (11.91) | 3.75 | 51 | 0.77 | 0.15, 2.17
2–4 | CIS-P | CBT | 57 | 27.41 (8.61) | 25.38 (8.62) | 1.74 | 56 | 0.50 | 0.10, 1.48
2–4 | CIS-P | BPT | 30 | 28.97 (7.62) | 22.80 (8.39) | 4.10 | 29 | 1.06 | 0.26, 4.76
2–4 | CIS-P | TAU | 44 | 27.62 (9.18) | 24.50 (8.78) | 2.19 | 43 | 0.64 | 0.03, 2.06
5–6 | OYP-I | CE | 56 | 13.88 (7.00) | 9.61 (6.80) | 3.93 | 55 | 0.76 | 0.16, 2.05
5–6 | OYP-I | SW | 3,960 | 9.07 (6.90) | 7.45 (6.25) | 16.67 | 3959 | 0.18 | 0.11, 0.26
5–6 | OYP-E | CE | 55 | 18.00 (6.30) | 14.56 (6.66) | 3.55 | 54 | 0.73 | 0.13, 2.00
5–6 | OYP-E | SW | 3,960 | 14.70 (8.86) | 12.28 (8.15) | 19.33 | 3959 | 0.20 | 0.13, 0.27
5–6 | OYP-F | CE | 54 | 40.75 (8.61) | 47.22 (10.36) | 3.81 | 53 | 0.76 | 0.16, 2.10
5–6 | OYP-F | SW | 3,960 | 46.05 (11.95) | 48.43 (12.08) | 13.94 | 3959 | 0.17 | 0.10, 0.24
5–6 | CIS-P | CE | 46 | 25.97 (6.97) | 22.01 (8.99) | 3.14 | 45 | 0.75 | 0.09, 2.28

Note: CAFAS = Child and Adolescent Functional Assessment Scale; OYP-I = Ohio Youth Problem Scale, Internalizing; OYP-E = Ohio Youth Problem Scale, Externalizing; OYP-F = Ohio Youth Problem Scale, Functioning; CIS-P = Columbia Impairment Scale, Parent completed; TAU = treatment-as-usual comparison group; CE = common elements; SW = statewide, reported from fiscal year 2010. *p < .01. **p < .001.


STARIN ET AL.

statewide data across internalizing (d = .76 vs. .18), externalizing (d = .73 vs. .20), and functioning (d = .76 vs. .17) scores, with effect sizes more than three times as large. On parent report, the modified common elements approach revealed a moderately high effect (d = .75). No parent report data were available statewide.
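The pre–post analyses summarized in Table 3 pair each child's intake score with that child's 90-day score. As a rough illustration of the reported statistics, the sketch below computes a paired-samples t test and a standardized effect size. The article does not state which Cohen's d variant was used, so the formula here (mean change divided by the standard deviation of the change scores, a common convention for repeated measures) and the scores themselves are illustrative assumptions, not the authors' documented computation.

```python
import math

def paired_pre_post(intake, day90):
    """Return (t, df, d) for paired intake vs. 90-day scores.

    d is computed as the mean change divided by the SD of the change
    scores (standardized mean change) -- an assumed convention.
    """
    assert len(intake) == len(day90)
    n = len(intake)
    diffs = [pre - post for pre, post in zip(intake, day90)]
    mean_diff = sum(diffs) / n
    var_diff = sum((x - mean_diff) ** 2 for x in diffs) / (n - 1)
    sd_diff = math.sqrt(var_diff)
    t = mean_diff / (sd_diff / math.sqrt(n))  # paired-samples t statistic
    d = mean_diff / sd_diff                   # standardized mean change
    return t, n - 1, d

# Illustrative (made-up) problem scores for five children:
intake = [12.0, 9.0, 15.0, 11.0, 14.0]
day90 = [8.0, 7.0, 11.0, 10.0, 9.0]
t, df, d = paired_pre_post(intake, day90)
```

Note that under this convention the two statistics are linked by t = d * sqrt(n), so large samples (such as the statewide rows in Table 3) can yield very large t values even when d is small.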

DISCUSSION

In this article, we describe the development, implementation, and evaluation of the training component of the Illinois Evidence-Based Practice Initiative, an ongoing project initiated and led by the DMH. Although there is widespread agreement on the importance of implementing evidence-based practices in community practice, there are few descriptions of the process to guide large state organizations. The results indicated moderate to strong effects for the evidence-based training, especially for the BPT training model. A shift to a practice model based on common element concepts was necessary to inform interventions for children with internalizing disorders, as these were not generally addressed by the BPT intervention, and CBT training results indicated less robust effects. Results from the last two cohorts using the modified common elements approach indicated superior effects relative to statewide averages but generally less strong effects than found for the BPT intervention. There are several possible explanations for the strength of the BPT intervention in comparison to the other two evidence-based interventions. First, this conforms to a larger literature that supports the importance of improved parenting as a core goal, especially in a statewide system of community-based care (Hoagwood et al., 2010). This may explain why even internalizing symptoms showed greater improvement in the BPT than in the CBT condition. Second, the BPT intervention required new intervention strategies for the largely office-based, child-focused clinicians. Thus, to the extent that the training successfully engaged the clinicians, the interventions were likely to follow closely the parenting interventions that were taught. In contrast, the CBT intervention, which was largely child focused, required less change for the clinicians and therefore may have been implemented less faithfully.
In regard to the slightly less robust effects for the modified common elements approach, it should be noted that effect sizes for this training initiative (d = .73–.76) are above the range of effectiveness trials (average d = .30) noted in a recent review by Weisz, Ugueto, Cheron, and Herren (2013). Similarly, a recent application of the common elements model as compared to usual care evidenced effect size estimates based on parent report of internalizing and externalizing behaviors that ranged

from .48 to .59 (Weisz et al., 2012), which is below the range found in our study. Thus, the Illinois results are encouraging relative to other effectiveness studies. In addition, the move from single-model training to a modified common elements training reflected the realities of community-based care and the need for a workforce capable of managing the variety and severity of problems presented. Thus, although the results reported here are encouraging, there are plans to modify this common elements approach in several ways. First, Illinois DMH plans to broaden access to e-learning technologies to support the long-term use of evidence-informed practices (Bryk, Camburn, & Louis, 1999; Vescio, Ross, & Adams, 2008). Specifically, in addition to continuing use of the PracticeWise LLC resources, DMH is collaborating with Division 53 (Division of Clinical Child and Adolescent Psychology) of the American Psychological Association to include expert workshops that they are disseminating on a variety of evidence-based practices (http://www.effectivechildtherapy.com). These will be available to all DMH-sponsored mental health providers on a website maintained by DMH. Second, DMH will continue its association with DatStat, a web-based platform, to oversee the outcome monitoring system and to integrate PracticeWise resources into the system so that all practitioners have access to a comprehensive database supporting evidence-informed practices. Third, DMH is planning to implement Professional Learning Communities (PLCs) to promote the long-term use of evidence-informed practices. The PLCs will promote a peer support network of practitioners, potentially increasing the social network and social capital of providers, to increase the use and support of evidence-informed practices (Burt, 1999; Neal, Neal, Atkins, Henry, & Frazier, 2011).
As planned, PLCs will provide ongoing support and booster sessions for previously trained providers and a platform to train new staff, allowing for long-term sustainability of evidence-informed practices across Illinois. DMH is also planning to develop PLCs for supervisors, recognizing that supervisors can be the crux of implementing and sustaining innovative services (Rapp et al., 2010). Supporting supervisors potentially increases the sustainability and long-term impact of training and ongoing support within agencies, because supervisors tend to stay with an agency longer than front-line staff. Fourth, in addition to supporting current practitioners with continuing education programs, DMH also plans to continue collaboration with the state university system to support training in evidence-informed practices for the developing workforce. To date, DMH has supported evidence-based child and adolescent certificate programs at three public universities in their social work or counseling master's-level programs.

EVIDENCE-INFORMED PRACTICE

Finally, to continue its commitment to evidence-informed practices, Illinois DMH plans to address remaining organizational barriers faced by staff and agencies to the implementation of evidence-informed practices through comprehensive changes at multiple levels: legislative/policy (inclusion in the strategic plan), individual (training and support), and organizational (support for agencies to create an open culture to implement evidence-based practices). The commitment to continuing this multipronged approach owes to the success experienced thus far in bringing science into the children's mental health system through a process guided by implementation research. From a policy perspective, we suggest several themes to guide other states and municipalities. First, the role of the Illinois Evidence-Based Practice Advisory Council was critical to guide the project, enlist community and other stakeholder support, and maintain state-level support. It is worth highlighting that this project endured during a time of heightened financial crisis for the state. This meant that the initiative was compelled to elicit political support through presentation of outcome data to maintain project funding and viability. Second, the composition of the Council that guided the initiative was highly diverse to ensure buy-in from multiple constituencies. This allowed the Council leadership to promote a dialogue among participants that resulted in a consensus definition (evidence-informed practice) that incorporated all perspectives without detracting from the original mission to improve mental health practices for Illinois's most vulnerable children and families. The parent consumer participation was particularly grounding in these discussions. Most important, the definition incorporated stakeholders' concerns about what constituted evidence and especially the importance of incorporating family input to determine that practices were acceptable and child gains meaningful.
Third, the trainers were given wide discretion to respond to the needs of the trainees, and early in training it became clear that flexible models were more practical than learning one or more manualized treatments. For example, consultants' discussions across cohorts involved consideration of the effects of poverty and deprivation on families' ability to implement recommended strategies to enhance client engagement, and encouraged open discussion of the often poor fit of evidence-based practices to community mental health populations. Thus, rather than being a barrier to implementation, these discussions served as opportunities to consider how to adapt interventions to meet the unique needs of families. Fourth, although specific demographics regarding the racial and ethnic diversity of the sample were not available, the population (about 40,000 youth served in Illinois per year) is very diverse. This leads to the possibility of designing future evaluation plans


that gather specific sample descriptors such that comparisons across categories could be obtained. This training initiative was not designed to be a research project; rather, it was designed to test methods of using science in a real-world, community mental health setting. Fifth, and perhaps the most important innovation that arose from this initiative, was the use of standardized clinical outcome measures in all publicly funded cases statewide. This marked a significant step toward moving science into the children's mental health system at the macro level. The mandate that assessments be completed every 90 days remains in place and has been applied in far more than 40,000 clinical cases. As new practices are initiated, either statewide or by agency initiative, these data are available to assess progress for each case as well as program-wide. The commitment to continuing this multipronged approach is reflected in the state's recent strategic plan (Illinois Department of Human Services, 2013) and owes to the success experienced thus far in bringing science into the children's mental health system through a process guided by implementation research.

REFERENCES

Aarons, G. A. (2004). Mental health provider attitudes toward adoption of evidence-based practice: The Evidence-Based Practice Attitude Scale (EBPAS). Mental Health Services Research, 6, 61–74. doi:10.1023/B:MHSR.0000024351.12294.65
Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health, 38, 4–23. doi:10.1007/s10488-010-0327-7
Bird, H. R., Shaffer, D., Fisher, P., Gould, M. S., Staghezza, B., Chen, J., & Hoven, C. (1993). The Columbia Impairment Scale: Pilot findings on a measure of global impairment for children and adolescents. International Journal of Methods in Psychiatric Research, 3, 167–176.
Bryk, A., Camburn, E., & Louis, K. S. (1999). Professional community in Chicago elementary schools: Facilitating factors and organizational consequences. Educational Administration Quarterly, 35, 751–781. doi:10.1177/0013161x99355004
Burt, R. S. (1999). The social capital of opinion leaders. The Annals of the American Academy of Political and Social Science, 566, 37–54. doi:10.1177/0002716299566001004
Chorpita, B. F., & Daleiden, E. L. (2009). Mapping evidence-based treatments for children and adolescents: Application of the distillation and matching model to 615 treatments from 322 randomized trials. Journal of Consulting and Clinical Psychology, 77, 566–579. doi:10.1037/a0014565



Chorpita, B. F., Daleiden, E., & Weisz, J. R. (2005). Identifying and selecting the common elements of evidence based interventions: A distillation and matching model. Mental Health Services Research, 7, 5–20. doi:10.1007/s11020-005-1962-6
Connor-Smith, J. K., & Weisz, J. R. (2003). Applying treatment outcome research in clinical practice: Techniques for adapting interventions to the real world. Child and Adolescent Mental Health, 8, 3–10. doi:10.1111/1475-3588.00038
Curry, J. F., & Reinecke, M. A. (2006). Modular therapy for adolescents with major depression. In M. A. Reinecke, F. M. Dattilio, & A. Freeman (Eds.), Cognitive therapy with children and adolescents (4th ed., pp. 95–127). New York, NY: Guilford.
Davis, D., Thomson O'Brien, M. A., Freemantle, N., Wolf, F. M., Mazmanian, P., & Taylor-Vaisey, A. (1999). Impact of formal continuing medical education: Do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? Journal of the American Medical Association, 282, 867–874. doi:10.1001/jama.282.9.867
Fixsen, D. L., Naoom, S. F., Blasé, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature (FMHI Publication #231). Tampa: University of South Florida, The National Implementation Research Network.
Gellis, Z., & Reid, W. J. (2004). Strengthening evidence-based practice. Brief Treatment and Crisis Intervention, 4, 155–165. doi:10.1093/brief-treatment/mhh012
Gibbs, L., & Gambrill, E. (2002). Evidence-based practice: Counterarguments to objections. Research on Social Work Practice, 12, 452–476. doi:10.1177/1049731502012003007
Glisson, C., & James, L. R. (2002). The cross-level effects of culture and climate in human service teams. Journal of Organizational Behavior, 23, 767–794. doi:10.1002/job.162
Henggeler, S. W. (2011). Efficacy studies to large-scale transport: The development and validation of multisystemic therapy programs. Annual Review of Clinical Psychology, 7, 351–381. doi:10.1146/annurev-clinpsy-032210-104615
Henggeler, S. W., & Schoenwald, S. K. (1998). The MST supervisory manual: Promoting quality assurance at the clinical level. Charleston, SC: MST Services.
Henggeler, S. W., Schoenwald, S. K., Borduin, C. M., Rowland, M. D., & Cunningham, P. B. (2009). Multisystemic therapy for antisocial behavior in children and adolescents (2nd ed.). New York, NY: Guilford.
Hoagwood, K. E., Cavaleri, M. A., Olin, S. S., Burns, B. J., Slaton, E., Gruttadaro, D., & Hughes, R. (2010). Family support in children's mental health: A review and synthesis. Clinical Child and Family Psychology Review, 13, 1–45. doi:10.1007/s10567-009-0060-5
Hodges, K., & Wong, M. M. (1996). Psychometric characteristics of a multidimensional measure to assess impairment: The Child and Adolescent Functional Assessment Scale. Journal of Child and Family Studies, 5, 445–467. doi:10.1007/BF02233865
Illinois Department of Human Services. (2013). Illinois mental health 2013–2018: Strategic plan. Springfield, IL: Author.
Neal, J. W., Neal, Z. P., Atkins, M. S., Henry, D. B., & Frazier, S. L. (2011). Channels of change: Contrasting network mechanisms in the use of interventions. American Journal of Community Psychology, 47, 277–286. doi:10.1007/s10464-010-9403-0
Ogles, B. M., Melendez, G., Davis, D. C., & Lunnen, K. M. (2000). The Ohio youth problem, functioning and satisfaction scales: Technical manual. Retrieved from http://mentalhealth.ohio.gov/assets/consumer-outcomes/instruments/ohio-scales-technical-manual.pdf
Raghavan, R., Bright, C. L., & Shadoin, A. L. (2008). Toward a policy ecology of implementation of evidence-based practices in public mental health settings. Implementation Science, 3, 26–36. doi:10.1186/1748-5908-3-26
Rapp, C. A., Etzel-Wise, D., Marty, D., Coffman, M., Carlson, L., Asher, D., & Holter, M. (2010). Barriers to evidence-based practice implementation: Results of a qualitative study. Community Mental Health Journal, 46, 112–118. doi:10.1007/s10597-009-9238-z
SAMHSA. (2005). National registry of evidence-based programs and practices (NREPP). Retrieved from http://www.nrepp.samhsa.gov
Shlonsky, A., & Gibbs, L. (2004). Will the real evidence-based practice please stand up? Teaching the process of evidence-based practice to the helping professions. Brief Treatment and Crisis Intervention, 4, 137–153. doi:10.1093/brief-treatment/mhh011
Vescio, V., Ross, D., & Adams, A. (2008). A review of research on the impact of professional learning communities on teaching practice and student learning. Teaching and Teacher Education, 24, 80–91. doi:10.1016/j.tate.2007.01.004
Weisz, J. R., Chorpita, B. F., Palinkas, L. A., Schoenwald, S. K., Miranda, J., Bearman, S. K., . . . Gibbons, R. D. (2012). Testing standard and modular designs for psychotherapy treating depression, anxiety, and conduct problems in youth: A randomized effectiveness trial. Archives of General Psychiatry, 69, 274–282. doi:10.1001/archgenpsychiatry.2011.147
Weisz, J. R., Ugueto, A. M., Cheron, D. M., & Herren, J. (2013). Evidence-based youth psychotherapy in the mental health ecosystem. Journal of Clinical Child and Adolescent Psychology, 42, 274–286. doi:10.1080/15374416.2013.764824
