RESEARCH AND PRACTICE

Use of Research Evidence in State Policymaking for Childhood Obesity Prevention in Minnesota

Sarah E. Gollust, PhD, Hanna A. Kite, MPH, Sara J. Benning, MLS, Rachel A. Callanan, JD, MNM, Susan R. Weisman, JD, and Marilyn S. Nanney, PhD, MPH, RD

Objectives. We describe how scientific evidence about obesity has been used in Minnesota legislative materials to understand how research evidence might more effectively be translated into policymaking.

Methods. We selected 13 obesity-related bills introduced from 2007 to 2011 in Minnesota. Using state archives, we collected all legislative committee meeting materials and floor testimony related to each bill. We used a coding instrument to systematically analyze the content of a sample of 109 materials for their use of research evidence and non–research-based information.

Results. Research evidence was mentioned in 41% of all legislative materials. Evidence was often used to describe the prevalence or consequences of obesity or policy impacts but not to describe health disparities. In 45% of materials that cited evidence, no source of evidence was indicated. By contrast, 92% of materials presented non–research-based information, such as expert beliefs, constituent opinion, political principles, and anecdotes.

Conclusions. Despite an abundance of available research evidence on obesity, less than half of legislative materials cited any such evidence in discussions around obesity-related bills under consideration in Minnesota. (Am J Public Health. 2014;104:1894–1900. doi:10.2105/AJPH.2014.302137)

Increased recognition of childhood obesity as a public health crisis has spurred numerous legislative and regulatory proposals at the national, state, and local levels. Experts agree on the need for better alignment of the evidence base for obesity prevention with the evidence needs of decision makers.1,2 Unfortunately, the pace of research translation is slow relative to the scope and urgency of the problem, with 17% of US youths currently clinically obese,3 although small declines in obesity rates have recently been observed.3,4 A comprehensive evidence-based policy approach is needed to continue to make a meaningful impact.5,6 However, researchers have shown that policymaking is infrequently supported by research evidence; instead, lawmakers balance personal experience, constituent concerns, anecdotes, values, and political expediency to make policy decisions.5,7–9 Although these themes of policymaking are well known, little information is available to quantify how frequently research evidence, in comparison with other types of non–research-based persuasive appeals, is cited in policy-relevant discourse. We designed this study to fill this gap.

Many researchers have observed that researchers and policymakers live in different universes,7 posing major challenges to the process of effectively translating research evidence to policymakers.8 For instance, researchers work on different time horizons and have different incentives than policymakers, leading to research that is ill timed for the legislative cycle and synthesized in language that is appropriate for academic settings rather than accessible to most policymakers.7,10 Yet literature in the growing field of knowledge translation has highlighted recommendations to promote more consistent use of evidence by policymakers. For instance, research on state policymakers' perceptions of the role of evidence in health policy has offered generic tips about how to better disseminate evidence to have a more meaningful impact on the policy process, such as ensuring that the evidence is locally relevant and timely, identifies specific policy implications, and is presented in easy-to-grasp written formats or, especially, conveyed in person.7,11–13

Although these generic recommendations have value, other research has suggested that the specific political and economic context of the particular policy domain affects the ways in which evidence is used in that domain,11,14,15 suggesting that obesity-related policy demands specific scrutiny. In addition, although many previous studies on translating research to policymakers have a national scope,13,16–18 much policy action on childhood obesity prevention occurs at the state level.19 Thus, a focused analysis of state-level legislative discussion can help inform understanding of the evidence translation process for obesity policymaking.18 Developing actionable recommendations for improving the use of evidence in policymaking will require considering the specific context of childhood obesity and attending to variation in the types of policy arenas encompassed (e.g., nutrition policy, physical activity policy) under the broad domain of obesity prevention.

RESEARCH OBJECTIVES

The primary objective of this research was to quantify the extent to which research evidence and other non–research-based information have been used in materials related to childhood obesity prevention bills introduced in Minnesota between 2007 and 2011. To understand opportunities to better tailor research evidence to the particular policy environment in Minnesota, we also examined the most frequent sources of research evidence cited, the types of research evidence used, and differences in research evidence citation by characteristics of the proposed legislation. Specifically, we asked whether presentation of research evidence varied for legislation concerning the nutrition policy area compared with active living or physical activity–related policy; whether presentation of research evidence differed by whether the information was presented orally or in writing, or by whether the goal of the document or testimony was to provide descriptive background information or to recommend a specific policy; and, finally, whether materials associated with bills that passed the legislature used evidence more or less than materials for bills that did not pass. We conducted a content analysis of legislative materials to answer these research questions.

METHODS

The content analysis process proceeded in multiple stages: sampling policy events, collecting materials, sampling materials, developing a coding scheme, applying the coding scheme to collect data on each material, and analyzing the data.20,21

Sampling Policy Events

To identify key obesity legislation in Minnesota, the research team conducted 11 in-person key informant interviews with obesity prevention advocates in the state during the summer of 2011, selected through study team guidance and snowball sampling on the basis of their familiarity with legislative events. The interview topics concerned interviewees' recollections of salient obesity prevention policy events since 2007 as well as their reflections on how evidence was used in those legislative debates (as preparation for a follow-up study). Interviews lasted about 45 minutes. After reviewing interview notes, transcripts, or both and information on the Minnesota state legislature and National Conference of State Legislatures Web sites, we constructed a timeline of all obesity-relevant legislation introduced in Minnesota between 2007 and 2011, the last year for which we had complete information. We then purposively sampled 13 of the bills for analysis (Table 1), selecting bills to be roughly divided according to principal policy area: 6 bills targeted nutrition or general obesity (school meal reimbursement, community grants for obesity prevention, body mass index measurement and nutrition education in schools, farm-to-school programs, liability against food companies), and 7 bills targeted active living environments or physical activity (physical education standards in schools, policy surrounding school siting, Complete Streets, Safe Routes to School). For 1 bill (Safe Routes to School), no supporting materials were available because it was introduced in 2011 with little legislative discussion, so no materials associated with this proposed legislation were included in the final analytic sample.

TABLE 1—Minnesota Obesity-Related Proposed Legislation Selected for Study Inclusion: 2007–2011

Year and Bill Topic | Bill Status

2007
School meal reimbursement increase | Passed, signed by governor
Physical education standards and school wellness policies | Passed, vetoed by governor

2008
State Health Improvement Program (community grants) | Passed, signed by governor
BMI screening and nutrition education in schools | Did not pass

2009
Physical education standards | Did not pass
School meal reimbursement increase | Did not pass
Farm to school program funding | Did not pass
School siting | Passed, signed by governor

2010
Physical education standards | Passed, signed by governor
Complete Streets | Passed, signed by governor

2011
Personal responsibility in food consumption act (reduced liability to food producers) | Passed, vetoed by governor
Joint use agreements for reduced liability for school facilities | Passed, signed by governor
Safe Routes to School state funding^a | Did not pass

Note. BMI = body mass index. We selected these 13 bills from a timeline of 2007–2011 legislative events that we compiled through key informant interviews and assessments from the National Conference of State Legislatures. The full timeline consisted of a total of 33 bills introduced in these 5 years concerning health care reform, agriculture (e.g., promotion of local foods), food production, active transportation, and school-based health promotion. We selected included bills on the basis of bill timing (at least 2 bills in each year), distribution of success and failure, and diversity in approach to addressing obesity.
^a Although part of our original study design, there were no archived legislative materials associated with the introduction of Safe Routes to School funding in 2011 (it was reintroduced in 2012), so no materials associated with this legislative event ended up in our analytic sample of materials.

Collecting and Sampling Materials

We used the archives at the Minnesota Legislative Library and the Web site of the Minnesota state legislature to identify every material associated with each selected piece of legislation. We defined materials as any document circulated at bill hearings (including letters, media articles, fact sheets, legislative research analyses, and reports) as well as every archived audio or video testimony from hearings. We next created a database of every document and testimony associated with each bill, labeled by date and with every material assigned a unique identifier. The total sample size of all materials collected, after eliminating duplicate materials, corrupt audio files, and materials clearly unrelated to the obesity prevention aspect of included bills (i.e., nonrelated aspects of omnibus or otherwise comprehensive bills), was 200. Given the time intensity of analyzing the oral testimony, the sheer volume of testimony relative to other types of materials (130 of the 200 materials collected were testimony), and our interest in comparing print and oral modes of legislative discourse, we took a 50% random sample of testimony, yielding 65 testimony materials in the final sample and 70 nontestimony materials (n = 135). Finally, we excluded the text of the 13 bills (including 13 House and 13 Senate versions, or 26 materials) from analysis because, per Minnesota legislative custom, no preamble or introductory text is used in bills (so content was strictly legal with no additional descriptive information that would meet study requirements for coding). The final sample size was 109 materials.
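To make the sampling step concrete, the short Python sketch below mirrors the 50% random draw of testimony described above; the record structure, identifiers, and random seed are hypothetical illustrations rather than the study's actual database.

```python
import random

# Hypothetical records for the 200 collected materials; in the study these came
# from the Minnesota Legislative Library archives and the legislature's Web site.
materials = [
    {"id": f"MAT{i:03d}", "type": "testimony" if i < 130 else "document"}
    for i in range(200)
]

testimony = [m for m in materials if m["type"] == "testimony"]   # 130 items
documents = [m for m in materials if m["type"] == "document"]    # 70 items

random.seed(2011)                                                # hypothetical seed, for reproducibility
sampled_testimony = random.sample(testimony, k=len(testimony) // 2)  # 50% sample -> 65 items

analytic_pool = documents + sampled_testimony                    # n = 135 before excluding the 26 bill texts
print(len(sampled_testimony), len(analytic_pool))
```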


Constructing a Coding Instrument

The next stage in the process was to develop a coding instrument to collect data on each document and testimony material in the sample. The coding instrument consisted of 3 parts: (1) descriptive information about the material (e.g., length, date, policy issue area, type of material, whether the material included descriptive information or a specific call to support or oppose the bill); (2) information about use of evidence in the material (e.g., type of evidence cited, such as prevalence or disparities data; source of evidence cited, if available; format of evidence citations); and (3) information about use of non–research-based information (e.g., anecdotes, mention of political values, mention of policies in other states). See the Appendix (available as a supplement to the online version of this article at http://www.ajph.org) for the complete coding instrument. Most of the variables collected in the instrument were dichotomous: the information was either present in or absent from the material.

Based on team consensus, we defined research evidence as data (both qualitative and quantitative) that came from a systematic investigation designed to develop or contribute to generalizable knowledge. The investigations could include randomized controlled trials, cohort or case-control studies, observational studies, and general epidemiological surveys. We also defined specific indicators to identify use of research evidence, including citing a research study or researchers; footnoting a research study; using specific numbers or time spans or comparative terms such as "odds," "likelihood," "rates," "risk," or "significantly" to describe those specific numbers; using the word "data"; and paraphrasing a finding from a well-known research report, even if unattributed (such as "children in this generation may for the first time have a shorter life expectancy than their parents"22).

After establishing the key variables of interest, we created a coding definition book that included examples and definitions for all variables. Through an iterative process of pilot coding and team deliberation, we established the reliability of our coding instrument. Documents were divided between 2 coders, with a sample of 30 documents double-coded; reliability in the double-coded sample was sufficiently high (all κ > .60).23
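The reliability check can be illustrated with a brief sketch of Cohen's κ for one dichotomous coding variable across 30 double-coded materials; the coder judgments shown are invented for the example, and the function is a generic implementation rather than the software used in the study.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' dichotomous (0/1) judgments on the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected agreement under independence, from each coder's marginal rates.
    pa, pb = Counter(coder_a), Counter(coder_b)
    expected = sum((pa[c] / n) * (pb[c] / n) for c in set(coder_a) | set(coder_b))
    return (observed - expected) / (1 - expected)

# Hypothetical codes for "cites any research evidence" on 30 double-coded materials.
coder_1 = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1]
coder_2 = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1]

# Variables with kappa at or below .60 would be revised and recoded.
print(round(cohens_kappa(coder_1, coder_2), 2))
```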

Collecting and Analyzing Data

After filling out paper coding instruments for each of the 109 materials in the sample, we entered all data into a Web-based database designed for the study. The complete data set was exported into Stata version 10.0 (StataCorp, College Station, TX) for quantitative analysis. The analysis included descriptive statistics for all variables in the data set, as well as tests for differences by policy issue area, type of document (oral vs written), intent of document, and success or failure of bills, using the Pearson χ2 test. Finally, we estimated a multivariable logistic regression model to examine whether the use of research evidence in documents varied by bill passage, adjusting for potentially confounding variables, including policy issue area and obesity focus of the bill.
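A rough illustration of these analyses, assuming a hypothetical coded data set with dichotomous indicators like those in the coding instrument (the study's actual analysis was conducted in Stata), might look like the following Python sketch.

```python
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2_contingency

# Hypothetical coded data set: one row per material, dichotomous indicators as in the coding instrument.
df = pd.DataFrame({
    "cites_evidence": [1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0],
    "policy_area":    ["nutrition", "active"] * 10,   # nutrition vs active living
    "bill_passed":    [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0],
    "obesity_focus":  [1, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1, 0, 1, 0, 1],
})

# Pearson chi-square test: evidence citation by policy area.
table = pd.crosstab(df["policy_area"], df["cites_evidence"])
chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"chi2={chi2:.2f}, p={p:.3f}")

# Multivariable logistic regression: bill passage predicting evidence use,
# adjusting for policy area and obesity focus (analogous to the adjusted model in the text).
model = smf.logit("cites_evidence ~ bill_passed + C(policy_area) + obesity_focus", data=df).fit(disp=False)
print(model.summary())
```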

RESULTS

Table 2 presents descriptive information about the sample of 109 documents. By design, almost 60% of the sample consisted of testimony (n = 65), of which 68% was available as audio and 32% as video. The remaining 40% of the sample was print documents, including news articles, fact sheets, policy briefs, and reports; other documents included letters to officials from stakeholders, local school district policies, legislative research reports, and lists of grants awarded to local organizations. Testimony length varied from 1 minute or less (25% of the sample) to 27 minutes, and about 80% of testimony was less than 6 minutes. Most printed documents were 2 pages or less (73%), and the longest document in the sample was 17 pages. More materials were drawn from 2009 (n = 49) than from the other years because 4 of the selected bills were introduced in 2009 (Table 1).

Figure 1 displays the use of research evidence and non–research-based information for the sample over time (use of research evidence and of non–research-based information was not mutually exclusive). Overall, 41% of all materials cited research evidence, and 92% cited non–research-based information. We found some variation by year, with more research evidence cited in the last year of the study than in the earlier years (not statistically significant). The use of non–research-based information declined from 100% of materials in 2007 to 82% of materials in 2011.

TABLE 2—Descriptive Information About Minnesota Obesity-Related Proposed Legislation Analyzed: 2007–2011

Item Analyzed | No. (%)

Type of material
Testimony | 65 (59.6)
News articles | 9 (8.3)
Fact sheet | 7 (6.4)
Policy brief | 2 (1.8)
Research report | 1 (0.9)
Meeting minutes | 1 (0.9)
Other^a | 24 (22.0)

Year
2007 | 20 (18.4)
2008 | 13 (11.9)
2009 | 49 (45.0)
2010 | 16 (14.7)
2011 | 11 (10.1)

Printed document length (n = 44)
1 page | 19 (43.2)
2 pages | 13 (29.5)
3 pages | 2 (4.5)
4–6 pages | 9 (20.5)
≥7 pages | 1 (2.3)

Testimony length,^b min (n = 60)
≤1 | 15 (25.0)
1–3 | 14 (23.3)
4–6 | 18 (30.0)
6–9 | 6 (10.0)
≥10 | 6 (10.0)
≥20 | 1 (1.7)

Author
Nonprofit | 29 (26.6)
Legislator | 22 (20.2)
Legislative research | 13 (11.9)
State agency | 9 (8.3)
Journalist | 9 (8.3)
School or education representative | 8 (7.3)
Coalition | 7 (6.4)
University or academic institution | 6 (5.5)
Private sector | 3 (2.8)
Federal agency | 1 (0.9)
Celebrity | 1 (0.9)
Unknown | 1 (0.9)

Purpose of material
Descriptive | 34 (31.2)
Explicit policy recommendation | 75 (68.8)

Note. The sample size was n = 109.
^a "Other" included letters, school district policies, list of funded grants, and House or Senate research reports.
^b There were 65 testimony files, but 5 files were missing time-stamp information.

Although 41% (or 45 materials) offered research evidence, the specific types of obesity-related evidence they offered varied (Table 3). More research evidence (50%) was cited in materials concerning nutrition policy than in active living materials (36%), although this difference was not statistically significant (P = .16). Most evidence was used to describe the magnitude of obesity or a risk factor (i.e., citing the prevalence of unhealthy diets or the prevalence of obesity) or to describe the impact of a behavior, policy, or program. Significantly more evidence was used to describe the impact of a program in active living–related materials (64%) than in those concerning nutrition issues (25%; P = .009). Evidence was also used to describe the consequences of obesity, with 10 materials (40% of all active living materials that cited evidence) mentioning information on the association between weight and academic outcomes. Research evidence concerning children was mentioned in almost half (47%) of all documents that cited any evidence. No materials presented any research evidence on racial or ethnic groups, groups defined by socioeconomic status, or differences by geography. When presenting research evidence, materials used graphs or figures (11%), tables (20%), or bullet points (22%); some (18%) used footnotes or other citations. Much evidence was reported at the state level, although national data and more local-level data were also presented in materials.

Table 4 offers a summary of the types of non–research-based information cited in the sample. The most common types of non–research-based information cited were expert opinion (48%), cost information not generated from a research study (37%), mention of specific politicians who support or do not support an issue (36%), public opinion (33%), political principles or ideology (32%), and narratives or anecdotes (24%). The most common narrative concerned a community, neighborhood, or school. Few differences existed between materials describing nutrition-related legislation and those describing active living legislation, although the former more frequently cited analogies to other policy arenas, particularly tobacco policy.

Next, we examined the sources of research evidence cited. Materials most commonly did not cite a source at all for particular research evidence mentioned (45% of materials that cited evidence) or indicated only a generic reference to research, such as that it came from a "study" (30%). Few (5 materials, or 13%) referred to specific peer-reviewed studies in journals, although more cited federal government sources (23%; e.g., the Centers for Disease Control and Prevention) or state agency sources (20%; e.g., the Minnesota Department of Health). Three materials cited a national research institution (such as the National Institutes of Health), 1 material cited a health care organization as a source, and 10 cited multiple sources for the same type of evidence. No materials cited local or nonlocal academic institutions, local nongovernmental organizations, industry, the news media, or a state policy assistance organization (e.g., the National Conference of State Legislatures) as the source of research evidence.

Finally, we explored differences in the frequency of citation of research evidence or non–research-based information by 3 key variables: document type, intent of document, and bill success or failure. First, we found that oral testimony was less likely to cite research evidence (34%) than printed materials (52%; P = .055), but we found no differences in the likelihood of citing non–research-based information. Next, we found no significant differences in citation of evidence or non–research-based information for materials that presented descriptive information with no policy recommendation compared with those that included a specific call to action or policy recommendation. Comparing the materials affiliated with the bills that passed (Table 1), we found that materials for bills that passed used evidence less (32%) than those that did not pass (58%; P = .01), although this was confounded with policy type, because active living policies passed more often than nutrition- or obesity-related policies in this period. In a logistic regression model predicting use of evidence, we found no significant relationship with bill passage once we controlled for policy issue area and obesity focus.

FIGURE 1—Proportion of sampled obesity-related legislative material from Minnesota citing research evidence and non–research-based information, by year: 2007–2011. Note. Differences by year for research evidence citation and non–research-based citation were not significant (based on the Pearson χ2 test).

TABLE 3—Types of Research Evidence Cited in Minnesota Obesity-Related Proposed Legislation: 2007–2011

Evidence Type | Overall (n = 109), No. (%) | Nutrition (n = 40), No. (%) | Active Living (n = 69), No. (%) | Diff. by Policy Type,^a P

Is any research evidence mentioned? | 45 (41.3) | 20 (50.0) | 25 (36.2) | .159
Describes the magnitude of obesity or a risk factor | 23 (51.1) | 11 (55.0) | 12 (48.0) | .641
Describes a cause of obesity
  Individual | 5 (11.1) | 2 (10.0) | 3 (12.0) | .832
  Environmental | 6 (13.3) | 3 (15.0) | 3 (12.0) | .769
Describes the impact of a policy, program, or behavior change | 21 (46.7) | 5 (25.0) | 16 (64.0) | .009
  Mentions a specific policy or program | 12 (57.1) | 4 (80.0) | 8 (50.0) | .237
Describes any consequence of obesity | 28 (62.2) | 11 (55.0) | 17 (68.0) | .371
  Mentions health consequences | 13 (28.9) | 8 (40.0) | 5 (20.0) | .141
  Mentions health care costs | 11 (24.4) | 6 (30.0) | 5 (20.0) | .438
  Mentions impacts on workforce or productivity | 2 (4.4) | 2 (10.0) | 0 (0.0) | .106
  Mentions academic outcomes | 10 (22.2) | 0 (0.0) | 10 (40.0) | .001
  Mentions quality of life or psychosocial consequences | 6 (13.3) | 1 (5.0) | 5 (20.0) | .141
  Mentions impact on defense or national security | 3 (6.7) | 0 (0.0) | 3 (12.0) | .109
Target groups described
  Mentions children | 21 (46.7) | 9 (45.0) | 12 (48.0) | .841
  Mentions seniors | 4 (8.9) | 3 (15.0) | 1 (4.0) | .198
  Mentions racial or ethnic groups | 0 (0.0) | 0 (0.0) | 0 (0.0) |
  Mentions socioeconomic groups | 0 (0.0) | 0 (0.0) | 0 (0.0) |
  Mentions gender differences | 2 (4.4) | 0 (0.0) | 2 (8.0) | .196
  Mentions geographic differences/urban versus rural | 0 (0.0) | 0 (0.0) | 0 (0.0) |
Format of research evidence
  Uses a graph or figure | 5 (11.1) | 3 (15.0) | 2 (8.0) | .722
  Uses a table | 9 (20.0) | 3 (15.0) | 6 (24.0) | .469
  Uses a map | 0 (0.0) | 0 (0.0) | 0 (0.0) |
  Uses bullet points | 10 (22.2) | 4 (20.0) | 6 (24.0) | .512
  Uses footnotes or in-text citations | 8 (17.8) | 4 (20.0) | 4 (16.0) | .725
Geographic relevance of research evidence
  National | 15 (33.3) | 8 (40.0) | 7 (28.0) | .396
  MN state | 22 (48.9) | 9 (45.0) | 13 (52.0) | .641
  MN locality | 8 (17.8) | 2 (10.0) | 6 (24.0) | .222
  Non-MN state (adjacent) | 1 (2.2) | 0 (0.0) | 1 (4.0) | .366
  Non-MN state (nonadjacent) | 4 (8.9) | 1 (5.0) | 3 (12.0) | .412
  Unknown | 8 (17.8) | 2 (10.0) | 6 (24.0) | .222

^a Based on the Pearson χ2 test.

DISCUSSION

Research evidence was cited in just under half of policy-relevant materials in Minnesota from 2007 to 2011, suggesting that policy stakeholders, both advocates and legislators, may see the value of describing evidence to support policy decisions, but a large opportunity exists for research to be cited more frequently. Evidence on policy impact and the prevalence of obesity was particularly likely to be cited in these policy discussions. However, other types of information, including stories, political principles or values, and expert beliefs without evidence, were much more likely to be included in policy documents. This supports previous work examining the influences on the policy process: although evidence is recognized as important, other factors particularly germane to the political context of policymaking are major influences on policy decisions.7,24

Previous research examining use of evidence in policymaking has used surveys and interviews to capture policymakers' perspectives on the extent to which evidence is valued in policymaking.7,11–13 Other recent work has examined the content of 1 type of potentially influential material, obesity-related policy briefs, offering recommendations for improving their design.25 By examining all types of documents and testimony used in 1 particular state context, this study goes beyond previous work. We were also able to disaggregate use of research evidence across different policy types and, in fact, we found some differences by type of policy intervention. Research evidence was more often cited (although not statistically significantly so) in nutrition policy documents than in documents for bills related to active living. This could be because the research evidence connecting nutrition to obesity was more available, better disseminated, or more readily used by the types of advocates and lobbyists involved in nutrition-related interventions than by those involved in community-level active living or physical activity policy. In other work, we found that research evidence presentation differed within the active living policy domain, with physical activity–related policies more commonly citing research evidence than community design– or urban planning–related policies.26

TABLE 4—Types of Non–Research-Based Information Cited in Minnesota Obesity-Related Proposed Legislation: 2007–2011

Evidence Type | Overall (n = 109), No. (%) | Nutrition (n = 40), No. (%) | Active Living (n = 69), No. (%) | Difference by Policy Type,^a P

Is non–research-based evidence mentioned? | 100 (91.7) | 39 (97.5) | 61 (88.4) | .10
Describes public, stakeholder, or constituent opinion | 33 (33.0) | 9 (23.1) | 24 (39.3) | .092
Describes expert opinion or testimony | 48 (48.0) | 17 (43.6) | 31 (50.8) | .48
  If yes, identifies a professional society or formal group recommendation | 22 (45.8) | 5 (29.4) | 17 (54.8) | .091
Mentions cost or cost-effectiveness | 37 (37.0) | 17 (43.6) | 20 (32.8) | .275
Describes a narrative or anecdote about an identifiable individual | 24 (24.0) | 11 (28.2) | 13 (21.3) | .431
  If yes, identifies about self, family, friend, or colleague | 7 (7.0) | 2 (5.1) | 5 (8.2) | .557
  If yes, identifies about constituent or community member | 2 (2.0) | 2 (5.1) | 0 (0.0) | .074
  If yes, about some other individual | 4 (4.0) | 2 (5.1) | 2 (3.3) | .645
  If yes, mentions a community, neighborhood, or school | 14 (14.0) | 6 (15.4) | 8 (13.1) | .75
  If yes, mentions a local business | 1 (1.0) | 1 (2.6) | 0 (0.0) | .209
Appeals to emotions or nostalgia | 3 (3.0) | 2 (5.1) | 1 (1.6) | .318
Appeals to political principle or ideological orientation | 32 (32.0) | 13 (33.3) | 19 (31.2) | .819
Discusses particular political strategies | … | … | … | …
Mentions legislators, other politicians who support an issue | 36 (36.0) | 12 (30.8) | 24 (39.3) | .384
Mentions policy outcomes in other states | 11 (11.0) | 4 (10.3) | 7 (11.5) | .849
Mentions analogy to another policy arena (e.g., tobacco) | 4 (4.0) | 4 (10.3) | 0 (0.0) | .011

^a Based on the Pearson χ2 test.

Our research has indicated that documents used in the legislative process follow some of the best practices endorsed in the research literature on translating health research to policymaking.7,11–13,25 For instance, the majority of written documents used to support legislative discussions were 2 pages or less. About one fifth used bullet points, and a smaller proportion used tables or figures to convey research evidence. Among the non–research-based information used in the materials, about a quarter used stories or anecdotes. Other researchers have identified the importance of narrative forms of communication in the policy process, suggesting that evidence should be incorporated as much as possible in story form to have more of an impact on policymakers' attention and interest.13

Given the focus of several pieces of legislation included in this analysis on children and school environments, evidence related to children was often reported in policy materials. However, in spite of the large body of evidence on differences in obesity rates by race, ethnicity, socioeconomic status, and geography,27 none of this research evidence entered the policy discourse we observed. This could suggest that research evidence about disparities was not considered to be the appropriate evidence to motivate stakeholders in Minnesota. Understanding why policy conversations are not engaging with research on health disparities is an important direction for future research.

Limitations

Although content analysis methods have value in that they can capture in an objective fashion the prevalence of particular messages and formats in documents and oral testimony, these methods reveal only the observable use of evidence in the policy process (and, indeed, we were further limited to those materials available in existing archives). Thus, this study does not account for the frequent back-room discussions between legislators and among legislators, aides, staff, advocates, and lobbyists that are quite influential in garnering support for bills. Research evidence may be a salient part of these discussions, but this study does not capture it. In subsequent work, we are conducting qualitative interviews with policy stakeholders to better understand perceptions of the use of evidence beyond the observable setting of legislative hearings. In addition, although our content analysis documented citation of research evidence and type of research evidence, we did not evaluate the quality of the research evidence cited. Although we were able to capture whether evidence was incorporated into a material (and whether the source of the evidence was a peer-reviewed study), we cannot ascertain whether this evidence was generated from a well-designed, high-quality study. Yet we know that great variation exists in the quality of research evidence available about obesity.28 Finally, these results are specific to Minnesota and so may not be generalizable to other state legislative contexts.

Another limitation concerns the small size of our final analytic sample of materials (n = 109), which precluded more complex statistical analyses. Given that our materials are nested within particular legislative events, robust statistical analyses, beyond what we present here, ought to account for clustering and the multilevel design. Future research using a larger sample and capturing additional variation across years or state contexts might leverage these more sophisticated design elements. In addition, we document only descriptive associations between key variables of interest. We are not able to infer any causal relationships between evidence presentation and policy outcomes using this descriptive study design.
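As one illustration of the kind of clustering-aware analysis suggested above, the sketch below fits a generalized estimating equations (GEE) logistic model with materials grouped by bill; the data frame, bill identifiers, and variable names are hypothetical, and this is only one of several possible ways to handle the nesting.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical data: materials (rows) nested within bills, as in the study design.
df = pd.DataFrame({
    "bill_id":        ["B1"] * 5 + ["B2"] * 5 + ["B3"] * 5 + ["B4"] * 5,
    "cites_evidence": [1, 0, 1, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 1],
    "bill_passed":    [1] * 5 + [0] * 5 + [1] * 5 + [0] * 5,
})

# GEE logistic model with an exchangeable working correlation: one way to account
# for clustering of materials within bills when estimating the passage association.
model = smf.gee(
    "cites_evidence ~ bill_passed",
    groups="bill_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
print(model.fit().summary())
```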

Conclusions and Policy Implications

The growing body of empirical literature on knowledge transfer has suggested that the success of a knowledge translation system depends on designing that system to match the local policymaking context.29 This study indicates that the policy discourse surrounding obesity prevention in Minnesota attends to research, but only to a limited extent, possibly suggesting additional opportunities for incorporating evidence into the discussion alongside other potentially persuasive strategies, such as presenting public opinion and stories. Of course, just because a document or a person testifying presents research does not mean that this evidence is persuasive on its own merits or that it comes from a quality study devoid of bias or conflict of interest. Future research must attend to the nuances of evidence translation: how it is communicated, when it can be persuasive, and how its users judge its quality.30 By documenting the use of evidence in policy discourse over a 5-year period in 1 state, this study offers baseline data that can be used to design locally tailored evidence communication vehicles and then serve as a benchmark for comparing the extent to which research is part of policy communication in the future.

About the Authors

Sarah E. Gollust is with the Division of Health Policy and Management, University of Minnesota School of Public Health, Minneapolis. At the time of the study, Hanna A. Kite was with the Division of Health Policy and Management, University of Minnesota School of Public Health, and Sara J. Benning was with the Children, Youth, & Family Consortium, University of Minnesota Extension, Minneapolis. Rachel A. Callanan is with the American Heart Association Midwest Affiliate, Edina, MN. Susan R. Weisman is with the Public Health Law Center, William Mitchell College of Law, St. Paul, MN. Marilyn S. Nanney is with the Department of Family Medicine and Community Health and the Program in Health Disparities Research, University of Minnesota Medical School, Minneapolis.

Correspondence should be sent to Sarah Gollust, Division of Health Policy and Management, University of Minnesota School of Public Health, 420 Delaware Street SE, MMC 729, Minneapolis, MN 55455 (e-mail: sgollust@umn.edu). Reprints can be ordered at http://www.ajph.org by clicking the "Reprints" link.

This article was accepted June 9, 2014.

Contributors

S. E. Gollust led the study design and conceptualization with feedback from all authors, led the data analysis, and drafted the article. H. A. Kite collected the data and created data presentations. S. J. Benning, R. A. Callanan, and S. R. Weisman advised about the study design and the policy context, and provided critical revisions to the article. M. S. Nanney helped to secure study funding, contributed to study design and analysis, and provided critical revisions to the article. All authors approved the final version of the article.

Acknowledgments

We gratefully acknowledge funding from the Healthy Food, Healthy Lives Institute at the University of Minnesota and the National Institute of Child Health and Human Development (R03HD071156).

Previous versions of this article were presented at the Academy Health Annual Research Meeting; June 24, 2013; Baltimore, MD; the Public Health Systems Research Interest Group Meeting; June 25, 2013; Baltimore, MD; and the Association for Public Policy Analysis and Management Fall Research Conference; November 9, 2013; Washington, DC.

Human Participant Protection

The study was determined to be exempt from human subjects review by the University of Minnesota institutional review board.

References

1. Institute of Medicine. Bridging the Evidence Gap in Obesity Prevention: A Framework to Inform Decision Making. Washington, DC: Institute of Medicine of the National Academies; 2010.
2. Kumanyika S, Brownson RC, Cheadle A. The L.E.A.D. framework: using tools from evidence-based public health to address evidence needs for obesity prevention. Prev Chronic Dis. 2012;9:E125.
3. Ogden CL, Carroll MD, Curtin LR, Kit BK, Flegal KM. Prevalence of childhood and adult obesity in the United States, 2011–2012. JAMA. 2014;311(8):806–814.
4. Centers for Disease Control and Prevention. Progress on childhood obesity: Many states show declines. Available at: http://www.cdc.gov/vitalsigns/childhoodobesity/. 2013. Accessed December 2, 2013.
5. Brownson RC, Chriqui JF, Stamatakis KA. Understanding evidence-based public health policy. Am J Public Health. 2009;99(9):1576–1583.
6. Institute of Medicine. Preventing Childhood Obesity: Health in the Balance. Washington, DC: National Academies Press; 2004.
7. Brownson RC, Royer C, Ewing R, McBride TD. Researchers and policymakers: travelers in parallel universes. Am J Prev Med. 2006;30(2):164–172.
8. Caplan N. The two-communities theory and knowledge utilization. Am Behav Sci. 1979;22(3):459–470.
9. Huberman M. Research utilization: the state of the art. Knowl Policy. 1994;7(4):13–33.
10. Mitton C, Adair C, McKenzie E, Patten S, Perry B. Knowledge transfer and exchange: review and synthesis of the literature. Milbank Q. 2007;85(4):729–768.

11. Jewell CJ, Bero LA. "Developing good taste in evidence": facilitators of and hindrances to evidence-informed health policymaking in state government. Milbank Q. 2008;86(2):177–208.
12. Sorian R, Baugh T. Power of information: closing the gap between research and policy. Health Aff (Millwood). 2002;21(2):264–273.

13. McBride T, Coburn A, MacKinney C, Mueller K, Slifkin R, Wakefield M. Bridging health research and policy: effective dissemination strategies. J Public Health Manag Pract. 2008;14(2):150–154.
14. Amara N, Ouimet M, Landry R. New evidence on instrumental, conceptual, and symbolic utilization of university research in government agencies. Sci Commun. 2004;26(1):75–106.
15. Bowen S, Zwi A. Pathways to "evidence-informed" policy and practice: a framework for action. PLoS Med. 2005;2(7):e166.
16. Feldman PH, Nadash P, Gursen M. Improving communication between researchers and policy makers in long-term care. Gerontologist. 2001;41(3):312–321.
17. Tetroe JM, Graham ID, Foy R, et al. Health research funding agencies' support and promotion of knowledge translation: an international study. Milbank Q. 2008;86(1):125–155.
18. Jacobs JA, Dodson EA, Baker EA, Deshpande AD, Brownson RC. Barriers to evidence-based decision making in public health: a national survey of chronic disease practitioners. Public Health Rep. 2010;125(5):736–742.
19. National Conference of State Legislatures. Childhood obesity—2009 update of legislative policy options. Available at: http://www.ncsl.org/research/health/childhoodobesity-2009.aspx. Accessed December 2, 2013.
20. Weber RP. Basic Content Analysis. Newbury Park, CA: Sage; 1990.
21. Altheide DL. Qualitative Media Analysis. Thousand Oaks, CA: Sage; 1996.
22. Olshansky SJ, Passaro DJ, Hershow RC, et al. A potential decline in life expectancy in the United States in the 21st century. N Engl J Med. 2005;352(11):1138–1145.
23. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33(1):159–174.
24. Kingdon J. Agendas, Alternatives, and Public Policies. 2nd ed. New York, NY: Longman; 2003.
25. Dodson EA, Eyler AA, Chalifour S, Wintrode CG. A review of obesity-themed policy briefs. Am J Prev Med. 2012;43(3 suppl 2):S143–S148.
26. Kite HA, Gollust SE, Callanan RA, Weisman SR, Benning SJ, Nanney MS. Uses of research evidence in the state legislative process to promote active environments in Minnesota. Am J Health Promot. 2014;28(3 suppl):S44–S46.
27. Singh GK, Kogan MD, Van Dyck PC, Siahpush M. Racial/ethnic, socioeconomic, and behavioral determinants of childhood and adolescent obesity in the United States: analyzing independent and joint associations. Ann Epidemiol. 2008;18(9):682–695.
28. Brennan L, Castro S, Brownson RC, Claus J, Orleans CT. Accelerating evidence reviews and broadening evidence standards to identify effective, promising, and emerging policy and environmental strategies for prevention of childhood obesity. Annu Rev Public Health. 2011;32:199–223.
29. Contandriopoulos D, Lemire M, Denis J, Tremblay É. Knowledge exchange processes in organizations and policy arenas: a narrative systematic review of the literature. Milbank Q. 2010;88(4):444–483.
30. Brownson RC, Dodson EA, Stamatakis KA, et al. Communicating evidence-based information on cancer prevention to state-level policy makers. J Natl Cancer Inst. 2011;103(4):306–316.

American Journal of Public Health | October 2014, Vol 104, No. 10
