RESEARCH ARTICLE

Survey Instruments for Knowledge, Skills, Attitudes and Behaviour Related to Evidence-based Practice in Occupational Therapy: A Systematic Review

Helen Buchanan1*†, Nandi Siegfried2 & Jennifer Jelsma1

1 Department of Health & Rehabilitation Sciences, University of Cape Town, Cape Town, South Africa
2 Independent Clinical Epidemiologist, Cape Town, South Africa

Abstract

The purpose of this study was to evaluate, through a systematic review, assessment instruments for evidence-based practice (EBP). The specific objectives were to (1) identify survey instruments testing EBP knowledge, skills, attitudes and behaviour; (2) determine the attributes measured by each instrument; (3) evaluate the psychometric properties of the instruments; and (4) evaluate the methodological quality of the instruments. Using the Cochrane approach, searches were conducted in Pubmed, EBSCOHost and Scopus from inception to February 2014. Papers were screened by two independent assessors, and data were extracted by one researcher. Forty papers reporting 34 instruments met the inclusion criteria and were included in the qualitative synthesis. Most instruments measured EBP behaviour (n = 33) and attitudes (n = 21). This review provides a single source of information to enable researchers to select the most robust descriptive instruments to measure EBP learner attributes. As only instruments used with occupational therapists were considered, some instruments may have been missed. For further research, it is recommended that attention be given to developing objective instruments with a focus on knowledge and skills. Copyright © 2015 John Wiley & Sons, Ltd.

Received 11 March 2015; Revised 10 June 2015; Accepted 10 June 2015

Keywords: evidence-based practice; occupational therapy; measurement instruments; research utilization

*Correspondence: Helen Buchanan, Department of Health & Rehabilitation Sciences, F45 Old Groote Schuur Hospital Building, University of Cape Town, Observatory, 7925, Cape Town, South Africa.
†Email: [email protected]

Published online 3 July 2015 in Wiley Online Library (wileyonlinelibrary.com). DOI: 10.1002/oti.1398. Occup. Ther. Int. 23 (2016) 59–90.

Background

Health professionals globally have been encouraged to implement evidence-based practice (EBP). In response to this call, most undergraduate and post-graduate health professional curricula now include modules that enable students to acquire the knowledge, skills, attitudes and behaviour necessary to implement EBP. Continuing professional development programmes have similarly targeted qualified health professionals through educational workshops and seminars. Within occupational therapy, there has been an appeal for the global adoption of EBP with local application to enhance relevance (Illott et al., 2006), but there is insufficient evidence to establish the extent to which this challenge has been taken up. Furthermore, some authors have suggested that EBP is not yet a global reality (Bannigan, 2011; Buchanan, 2011). Surveys of professional groups provide information about the strengths and learning needs of the group (Eller et al., 2003), which is required to plan appropriate methods and strategies to increase the uptake of EBP. To date, an occupational therapy survey to determine the extent to which EBP is being implemented internationally has not been carried out. Such an enterprise would be useful not only in measuring the profession’s progress towards becoming evidence-based but also in providing important information for developing further action plans towards realizing this goal.

To yield useful findings from a survey, measurement instruments with satisfactory psychometric properties are required to ensure that the relevant attributes are measured accurately. In 2007, a summary ‘of the current state of the art on evaluation of training in [evidence-based health care (EBHC)]’ (Nabulsi et al., 2007, p 469) outlined three domains within the learner (affective, cognitive and behavioural) that should be measured in order to evaluate the outcomes of an EBP training activity comprehensively. Unfortunately, terminology referring to EBP learner outcomes has been used inconsistently in the literature (Nabulsi et al., 2007). As a result, the terms used in this paper will first be defined. The affective domain encompasses ‘attitudes’, which include views, perceptions, beliefs and intentions relating to EBP (Nabulsi et al., 2007). Attitudes may also incorporate ‘… a health professional’s agreement/acceptance of the evidence, their perceived clinical applicability of the evidence, and their motivation and sense of self-efficacy to adopt EBP’ (Menon et al., 2009, p 1025). The cognitive domain includes the knowledge and skills required to implement the steps of EBP. Knowledge is defined as ‘the acquisition of awareness or facts, data, information, ideas or principles to which one has access through formal or individual study, research, observation, experience or intuition’ (Wojtczak, 2002, p 451).
Skills incorporate knowledge ‘by performing EBP steps in some type of clinical scenario, such as with a standardized patient, written case, computer simulation, [objective standardized clinical examination] or direct observation’ (Shaneyfelt et al., 2008, p 1117). Skills require competence in specific areas outside the practice environment, while behaviour encompasses the ‘actual performance of EBP in practice’ (Shaneyfelt et al., 2008, p 1117). Examples of the latter include searching databases for evidence, accessing information sources and using evidence to select an intervention in the actual practice setting.

High-quality evidence should be used to inform the choice of an instrument. However, information about instruments that may be used to measure EBP learner outcomes is scattered across a variety of databases and journals, making it difficult to obtain a comprehensive picture of the ‘best’ instruments available. Furthermore, some instruments may be appropriate only for undergraduate students, while others may not be applicable to occupational therapy because of fundamental differences in the scope of practice of different health professions. It is currently unclear how EBP learning differs between health professionals. As several occupational therapy-specific survey instruments have been described in the literature, a review and description of these would provide a useful single source of information. A systematic review of instruments measuring EBP knowledge and skills in occupational therapists was published in 2010 (Glegg and Holsti, 2010), but a review focussing on instruments to measure all learner outcomes, including behaviour, has not been carried out.

Considering the importance of EBP in affirming the contribution of occupational therapy, it is critical that occupational therapists engage actively in implementing evidence in everyday practice. To monitor and measure this process, suitable instruments are needed to (1) describe the current state of the art; (2) plan appropriate interventions to improve EBP knowledge, skills, attitudes and behaviour; and (3) establish the effectiveness of these interventions. To evaluate whether interventions are effective, instruments with satisfactory validity, reliability and clinical utility (Corr and Siddons, 2005) are needed for ongoing monitoring. This review therefore sets out to (1) identify survey instruments to measure EBP knowledge, skills, attitudes and behaviour (use); (2) describe the aspects of EBP learning measured by each instrument; (3) evaluate the psychometric properties of the identified instruments; and (4) evaluate the methodological quality of the identified instruments.

Methods

Criteria for considering studies for this review

Type of study
Descriptive studies that used instruments measuring knowledge, skills, attitudes or behaviour related to EBP were included. Systematic reviews, either of instruments measuring these aspects or reviews of EBP surveys, were excluded, but the reference lists of included papers were checked to identify studies that may have been missed. Mixed methods studies were included if they contained a quantitative component and complied with the aforementioned criteria.

Types of participants
Studies had to include participants who were qualified occupational therapists or occupational therapy students (undergraduate or post-graduate). Studies of rehabilitation professionals were included if occupational therapists were part of the sample.

Aspects of EBP learning measured
Studies that measured outcomes related to EBP knowledge, skills, attitudes and behaviour were included. Papers that focussed on these aspects in a specific area of practice (e.g. stroke) rather than EBP in general were excluded.

Search methods for identification of studies
Using the Cochrane approach, searches were conducted in Pubmed, EBSCOHost and Scopus from their inception to February 2014. The following databases were searched simultaneously in EBSCOHost: Africa-Wide Information, CINAHL, ERIC, Health Source: Nursing/Academic Edition, MEDLINE, PsycARTICLES and PsycINFO. The search terms used were:

(“occupational therapy” OR “occupational therapy practice” OR OT) AND (tool OR survey OR instrument OR test OR measure OR scale OR questionnaire) AND (“evidence-based practice” OR “evidence based practice” OR EBP OR “evidence-based-medicine” OR evidence-based) AND (knowledge OR awareness OR skills OR attitudes OR perceptions OR behaviour OR practice OR ability OR uptake OR implementation OR “research use” OR “research utilisation” OR “research utilization”).

Each set of terms was first searched individually, after which the result sets were combined. The terms were used as MeSH terms and as text. No limits were set, and no attempt was made to identify unpublished materials. Reference lists of included studies were searched to identify papers that may have been missed.

Data collection and analysis

Selection of studies
Citations for identified papers were imported into Endnote, and duplicates were removed. Two researchers screened paper titles and abstracts and applied the eligibility criteria. Papers that did not meet the eligibility criteria were excluded, and reasons for exclusion were documented. The full text of the remaining studies was retrieved, and the inclusion criteria were applied to identify relevant papers. Once this process had been completed independently, the researchers met to reach consensus on the included and excluded studies; any discrepancies were discussed until consensus was reached. If the full text of a paper was not available, it was excluded.

Data extraction and management
One researcher extracted data from all eligible studies, which were summarized in tables depicting their descriptive characteristics, the learner attributes measured and a quality assessment of each instrument.

Rating of the methodological quality of included instruments
On extracting the information on each instrument, it became clear that the development of most instruments had not followed standard test development procedures, and reporting of psychometric properties was generally of a low standard. Therefore, criteria considered fundamental for survey instruments were selected from the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist (Terwee et al., 2012), and a basic assessment and rating of each property were performed. Aspects to evaluate the clinical utility of an instrument (Corr and Siddons, 2005) were added to the quality rating criteria. Instruments were rated on a 4-point scale (excellent, good, fair or poor) on seven aspects relating to their measurement properties. The criteria used in the quality evaluation are available in Table I. Properties that were not reported were classified as ‘unclear’.


Table I. Criteria used for the quality evaluation based on the COSMIN checklist (Terwee et al., 2012)

Internal consistency: Was an internal consistency statistic calculated for each sub-scale separately?
Excellent (E): Internal consistency statistic calculated for each sub-scale separately.
Fair (F): Internal consistency statistic NOT calculated for each sub-scale separately.

Reliability (intra-rater reliability): Was an intraclass correlation coefficient (for continuous variables) or a Kappa (for dichotomous/nominal/ordinal variables) calculated?
Excellent (E): ICC or Kappa calculated.
Fair (F): No ICC/Pearson/Spearman correlations calculated; no Kappa calculated, only percentage agreement.

Content validity: Was there an assessment of whether all items refer to relevant aspects of the construct being measured?
Excellent (E): Assessed whether all items refer to relevant aspects of the construct being measured.
Fair (F): Poor description of the relevant aspects of the construct being measured.
Poor (P): Did not assess whether all items refer to relevant aspects of the construct being measured.

Structural validity: Was exploratory or confirmatory factor analysis performed?
Excellent (E): Exploratory or confirmatory factor analysis performed.
Poor (P): No exploratory or confirmatory factor analysis performed.

Hypothesis testing: Were hypotheses regarding correlations or mean differences formulated a priori?
Excellent (E): Multiple hypotheses formulated a priori.
Good (G): Minimal number of hypotheses formulated a priori.
Fair (F): Hypotheses vague or not formulated, but possible to deduce what was expected.
Poor (P): Unclear what was expected.

Cross-cultural validity:
• Were both the original language in which the instrument was developed and the language into which it was translated described?
Excellent (E): Both source language and target language described.
Poor (P): Source language not known.
• Did the translators work independently of each other?
Excellent (E): Translators worked independently.
Good (G): Assumable that translators worked independently.
Fair (F): Unclear whether translators worked independently.
Poor (P): Translators did not work independently.
• Were items translated forward and backward?
Excellent (E): Multiple forward and backward translations.
Good (G): Multiple forward but one backward translation.
Fair (F): One forward and one backward translation.
Poor (P): Only a forward translation.

Clinical utility: Were the clarity of the instructions, the format/acceptability of the instrument and the time taken to complete the questionnaire tested?
Excellent (E): The clarity of instructions, format/acceptability of the instrument and time taken to complete were tested.
Good (G): Two of the described aspects of utility were tested.
Fair (F): One of the described aspects of utility was tested.
Poor (P): No mention was made of any aspects of utility.

COSMIN, consensus-based standards for the selection of health measurement instruments; ICC, intraclass correlation coefficient.

Results

The search identified 50 citations in Pubmed, 277 in EBSCOHost and 68 in Scopus. After removing duplicates, 351 references were imported into Endnote. Further duplicates were identified in Endnote and removed, leaving 326 citations. These were independently screened by two reviewers, and 286 papers were excluded based on the title and abstract. The full text of the remaining 40 papers was screened for eligibility. Thirteen papers were excluded on study type, two of which were systematic reviews, and one was excluded as the full text was not available. The original studies included in the systematic reviews (n = 2) and the reference lists of the included papers (n = 26) were screened. A further 14 papers met the inclusion criteria, bringing the total to 40 papers. As some papers (n = 5) reported different aspects of a single study, the 40 papers accounted for 35 studies. The PRISMA flow diagram for the study (Moher et al., 2009) is shown in Figure 1.

Description of the studies

Of the 35 studies reported in the included papers, most designs were cross-sectional surveys (n = 33), one of which was a follow-up study in which the same survey was sent to the same group of occupational therapists 6 years apart to determine whether attitudes had changed over time (Karlsson and Törnquist, 2007). Two further studies described the development of an instrument (Salbach and Jaglal, 2011; Upton and Lewis, 1998). In the 33 cross-sectional studies, surveys were distributed by post (n = 22), hard copy (n = 4), email (n = 2), email and post (n = 1) and online (n = 2). The distribution method was unclear in two studies (Closs and Lewin, 1998; Heiwe et al., 2011). Details of the included studies are summarized in Table II. Studies are reported in alphabetical order by author and then publication year. Only the outcomes of relevance to this review (knowledge, skills, attitudes and behaviour) are included.

Figure 1 Study flow diagram


Table II. Characteristics of included papers (n = 34)

Bennett et al. (2003)
Instrument: Modified from McColl et al. (1998).
Place: Australia.
Participants (no. of respondents): Members of OT Australia (n = 649).
Study design (distribution method): Cross-sectional (postal).
EBP attributes measured: Attitudes; use; confidence in skills; barriers to use.
Instrument construction: Self-report containing 4 sections with 5-point rating scales, fixed-response categories and an open-ended question.
Validity: Content validity tested in a pilot.
Reliability: NR.
Clinical utility: Feedback on acceptability and clarity obtained in a pilot test.

Bennett et al. (2007)
Instrument: Developed for the study, modelled on an existing survey (Pettingill et al., 1994).
Place: Multi-national (41 countries).
Participants: Users of the OTseeker evidence database (447 respondents: qualified OTs (n = 250) and students (n = 197)).
Study design: Cross-sectional (online).
EBP attributes measured: Access to OTseeker at work; frequency and main reason for use; use of other databases; perception of whether OTseeker improved ability to find relevant evidence; contribution to changes in practice.
Instrument construction: Self-report online questionnaire with 14 questions, mainly dichotomous or fixed-response categories, and an open-ended question.
Validity: NR.
Reliability: NR.
Clinical utility: Trialled with clinicians and minor changes made.

Brown et al.# (2009)
Instrument: Research Knowledge, Attitudes and Practices of Research (KAP) Survey (Van Mullem et al., 1999).
Place: Australia, Taiwan and UK.
Participants: Qualified OTs listed on national OT databases who worked with children (n = 696).
Study design: Cross-sectional (postal).
EBP attributes measured: NR.
Instrument construction: NR.
Validity: Content validity: 0.84 (Van Mullem et al., 1999).
Reliability: Test–retest reliability for sub-scales: 0.77–0.83; internal consistency: α = 0.93–0.97 (Van Mullem et al., 1999); α = 0.95–0.97 (Eller et al., 2003).
Clinical utility: Easy and inexpensive (Van Mullem et al., 1999).

Brown et al.# (2010a)
Instruments: Research Knowledge, Attitudes and Practices of Research (KAP) Survey (Van Mullem et al., 1999); Edmonton Research Orientation Survey (EROS; Pain et al., 1996); Barriers to Research Utilization Scale (BARRIERS; Funk et al., 1991).
Place, participants and study design: As for Brown et al. (2009).
KAP Survey: EBP attributes measured, instrument construction, validity, reliability and clinical utility as for Brown et al. (2009).
EROS — EBP attributes measured: Participation in research and research orientation to practice (the EBP sub-scale indicates research utilization). Instrument construction: 38 items rated on a 5-point Likert scale; an overall score and sub-scale scores are calculated. Validity: As for Pain et al. (2004). Reliability: As for Pain et al. (2004). Clinical utility: NR.
BARRIERS — EBP attributes measured: Perceived barriers to research utilization. Instrument construction: 4 sub-scales and 28 items rated on a 4-point scale. Validity: Sub-scales confirmed by factor analysis (Funk et al., 1991); in this study, factor analysis identified 4 factors similar to those of Funk et al., but only 22 of the 28 items were retained. Reliability: Internal consistency: 0.72–0.80 for the first 3 factors and 0.65 for the 4th factor; test–retest reliability: preliminary evidence (Funk et al., 1991). Clinical utility: NR.

Brown et al.# (2010b)
Instrument: Research Knowledge, Attitudes and Practices of Research (KAP) Survey (Van Mullem et al., 1999); all details as for Brown et al. (2009).

Caldwell et al. (2007)
Instrument: Questionnaire based on Humphris et al. (2000).
Place: London, UK.
Participants: OT graduates from 3 universities (n = 50).
Study design: Cross-sectional (postal).
EBP attributes measured: Preparation and skills training for EBP; access to databases; experience of research changing practice; views on EBP; confidence in EBP.
Instrument construction: Self-report with closed questions and 5-point Likert scales.
Validity: Literature used to develop questionnaire.
Reliability: NR.
Clinical utility: NR.

Cameron et al. (2005)
Instrument: Questionnaire developed for study.
Place: United States and Puerto Rico.
Participants: Members of the American Occupational Therapy Association (n = 131).
Study design: Cross-sectional (postal).
EBP attributes measured: Application of EBP to intervention and attitudes towards EBP.
Instrument construction: Self-report questionnaire with 2 sections: demographic information and 9 questions rated on a 5-point scale.
Validity: Questionnaire evaluated by 3 experts; factor analysis of pilot study data (n = 30), with non-contributing items removed.
Reliability: NR.
Clinical utility: Minor changes to wording based on pilot study (n = 20).

Closs and Lewin (1998)
Instrument: Barriers to Research Utilization Scale (BARRIERS; Funk et al., 1991) and additional questions on barriers and facilitators.
Place: UK.
Participants: Dieticians (n = 12), OTs (n = 24), physiotherapists (n = 51) and speech therapists (n = 15).
Study design: Cross-sectional (distribution method unclear).
EBP attributes measured: Perceived barriers to research utilization.
Instrument construction: 28 items with 5 response options.
Validity: NR.
Reliability: NR.
Clinical utility: Wording modified for UK context based on feedback from pilot.

Cooke et al. (2008)
Instrument: Questionnaire developed for study.
Place: North England, UK.
Participants: OTs in 2 UK Councils with Social Services Responsibilities (not disaggregated from social workers).
Study design: Cross-sectional (email).
EBP attributes measured: Access to and use of research information.
Instrument construction: Self-report questionnaire with a mixture of open, closed and Likert scale items; content based on focus group discussions and previous research; draft questionnaire tested in cognitive interviews, revised and then tested in a postal pilot study.
Validity: Content validity assumed as based on focus group findings.
Reliability: No reliability testing performed.
Clinical utility: Cognitive interviews to check thoughts about questionnaire.

Curtin and Jaramazovic (2001)
Instrument: Questionnaire developed for the study.
Place: South West and South East England and Channel Islands, UK.
Participants: OTs who supervised students (n = 500).
Study design: Cross-sectional (postal).
EBP attributes measured: Views and perceptions of EBP; self-rated involvement in types of EBP activities.
Instrument construction: Self-report questionnaire containing 6 sections; question formats included tick box options, 5-point Likert scales and open-ended questions.
Validity: No validity testing performed.
Reliability: NR.
Clinical utility: Piloted on a small number of staff and changes made before use (changes not specified).

Dopp et al. (2012)
Instrument: Questionnaire based on Humphris et al. (2000), Parahoo (2000) and Dysart and Tomlin (2002).
Place: Netherlands.
Participants: Members of the Dutch Association of Occupational Therapists (n = 100).
Study design: Cross-sectional (email and postal).
EBP attributes measured: Frequency of using evidence sources; barriers to implementing EBP; attitudes to EBP.
Instrument construction: Self-report questionnaire with 7-point or 5-point Likert scale response options.
Validity: Face validity supported by expert opinion; content evaluated in a pilot study (n = 6).
Reliability: Internal consistency good – use of sources: α = 0.789; barriers: α = 0.795; attitudes: α = 0.783.
Clinical utility: Format evaluated in a pilot study (n = 6).

Dysart and Tomlin (2002)
Instrument: Questionnaire developed for the study.
Place: United States.
Participants: Random sample of members of the American Occupational Therapy Association (n = 209).
Study design: Cross-sectional (postal).
EBP attributes measured: EBP skills and attitudes; facilitators and barriers to EBP; access to and frequency of using EBP resources; frequency of research implementation.
Instrument construction: Self-report with 3 sections containing dichotomous, 4-point scale, 5-point Likert scale and open-ended response options.
Validity: NR.
Reliability: NR.
Clinical utility: Piloted with 6 OTs and revisions made (details not stated); wording of some items may have been confusing.

Eller et al. (2003)
Instrument: Research Knowledge, Attitudes and Practices of Research (KAP) Survey (Van Mullem et al., 1999).
Place: New Jersey, United States.
Participants: Nurse (n = 538) and non-nurse health professionals (n = 208), including OTs.
Study design: Cross-sectional (hard copy).
EBP attributes measured: Knowledge of, willingness to engage in (attitudes) and ability to perform (practices) activities related to identifying clinical problems, establishing current best practice, implementing research in practice and communicating research.
Instrument construction: 33-item self-report consisting of 5 factors; items rated on a 3-point scale and sub-scale scores determined.
Validity: Construct validity: factor analysis revealed 5 factors.
Reliability: NR.
Clinical utility: NR.

Gilman (2011)
Instrument: Adapted from Powell and Case-Smith (2010).
Place: Oregon, United States.
Participants: Recent OT Masters graduates (n = 26).
Study design: Cross-sectional (online).
EBP attributes measured: As for Powell and Case-Smith (2010).
Instrument construction: As for Powell and Case-Smith (2010), with minor wording changes, additional response options included for 2 items and 4 new items added.
Validity: NR.
Reliability: NR.
Clinical utility: Minor changes made to wording of some questions to improve clarity.

Gosling and Westbrook (2004)
Instrument: Questionnaire designed for the study.
Place: New South Wales, Australia.
Participants: Allied health professionals at randomly selected hospitals (physiotherapists [n = 228], OTs [n = 118], speech pathologists [n = 77], dieticians [n = 78], clinical psychologists [n = 59], pharmacists [n = 84] and social workers [n = 146]).
Study design: Cross-sectional (hard copy).
EBP attributes measured: Use of an online evidence database; searching skills; success finding evidence; impact on clinical practice; perceived barriers to using the database.
Instrument construction: 25-item questionnaire comprising close-ended questions with various response options (e.g. yes/no, 3–6 point rating scales).
Validity: NR.
Reliability: NR.
Clinical utility: NR.

Graham et al. (2013)
Instrument: Adapted from Bennett et al. (2003).
Place: New Zealand.
Participants: OTs with practice certificates who consented to be contacted (n = 473).
Study design: Cross-sectional (postal).
EBP attributes measured: Perceptions of skills; attitudes; behaviour.
Instrument construction: Self-report questionnaire with 5-point Likert scale, categorical and open-ended questions.
Validity: Input from academics but no validation performed.
Reliability: NR.
Clinical utility: Cognitive interviews used to improve clarity.

Heiwe et al. (2011)
Instrument: Questionnaire translated from Jette et al. (2003) with some modifications.
Place: Sweden.
Participants: OTs employed in a large university hospital (n = 57).
Study design: Cross-sectional (distribution method unclear).
EBP attributes measured: Attitudes, beliefs, knowledge and behaviour towards EBP.
Instrument construction: Self-report questionnaire comprising 5 sections and 51 statements; most items rated on a 5-point Likert scale.
Validity: Rigorous 3-step translation process followed.
Reliability: NR.
Clinical utility: NR.

Hu et al. (2012)
Instrument: Questionnaire developed for study.
Place: UK.
Participants: OTs working in rural/remote areas in an NHS Trust (n = 64).
Study design: Cross-sectional (postal).
EBP attributes measured: Research availability and utilization; attitudes towards research; barriers and facilitators to EBP.
Instrument construction: Self-report questionnaire consisting of 4 sections with yes/no and 5-point Likert scale response options, based on a literature review.
Validity: Content validity established through professional opinion.
Reliability: NR.
Clinical utility: Trial version completed by occupational therapists and students not involved in the study (n = 30) and minor changes made to instructions and wording.

Humphris et al. (2000)
Instrument: Questionnaire based on a qualitative study and Pettingill et al. (1994).
Place: South Thames region, UK.
Participants: OTs in 7 Acute NHS Trusts in the South Thames region (n = 66).
Study design: Cross-sectional (postal).
EBP attributes measured: Perceptions of attitudes to, perceived ability to perform and current involvement in research-related activities.
Instrument construction: Self-report questionnaire with 4 sections and items with yes/no and 5-point Likert scale response options.
Validity: Face validity established with a group of experts.
Reliability: NR.
Clinical utility: NR.

Karlsson and Törnquist (2007)
Instrument: Adaptation and translation of an instrument by Eckerling et al. (1988).
Place: Two central county districts in Sweden.
Participants: OTs registered with the Swedish Occupational Therapy Association (n = 425 at baseline and n = 442 at follow-up).
Study design: Follow-up (postal).
EBP attributes measured: Attitudes towards and involvement in EBP.
Instrument construction: Self-report; 5 dimensions of research-related activities, each containing 4 statements rated on a 5-point Likert scale.
Validity: The 4 original dimensions and 4 research activities were chosen based on an extensive literature review conducted by Ehrenfeldt and Eckerling (1991); validity not explicitly tested for this study.
Reliability: Internal consistency for 4 dimensions: α = 0.87–0.81 (Eckerling et al., 1988) and α = 0.79–0.62 (Ehrenfeldt and Eckerling, 1991). Present study: α = 0.82 (role), α = 0.62 (ability), α = 0.53 (intent) and α = 0.80 (engagement).
Clinical utility: NR.

Occup. Ther. Int. 23 (2016) 59–90 © 2015 John Wiley & Sons, Ltd.

Table II. (Continued)

Source: Lyons et al. # (2010)
Instrument and developer/s: Research Knowledge Attitudes and Practices of Research (KAP) Survey (Van Mullem et al., 1999); Barriers to Research Utilization Scale (BARRIERS; Funk et al., 1991); Edmonton Research Orientation Survey (EROS; Pain et al., 1996)
Place: UK
Participants (no. of respondents): Qualified paediatric OTs on the College of Occupational Therapists’ database (n = 145)
Study design (distribution method): As for Brown et al. (2009)
EBP attributes measured: Intention to engage in research-related activities in the future
Instrument construction: Attitude to research-related activities rated on a 7-point scale; questions added to measure actual research engagement, facilitators for research, proficiency in reading scientific papers and use of scientific journals; otherwise as for Brown et al. (2009)
Validity: As for Brown et al. (2009)
Reliability: As for Brown et al. (2009)
Clinical utility: As for Brown et al. (2009)


Table II. (Continued)

Source: Lyons et al. # (2011)
Instrument and developer/s: Research Knowledge Attitudes and Practices of Research (KAP) Survey (Van Mullem et al., 1999); Barriers to Research Utilization Scale (BARRIERS; Funk et al., 1991); Edmonton Research Orientation Survey (EROS; Pain et al., 1996)
Place: Australia
Participants (no. of respondents): Qualified paediatric OTs (n = 138)
Study design (distribution method): As for Brown et al. (2009)
EBP attributes measured: As for Brown et al. (2009)
Instrument construction: As for Brown et al. (2009)
Validity: As for Brown et al. (2009)
Reliability: As for Brown et al. (2009)
Clinical utility: As for Brown et al. (2009)

Source: McCluskey (2003)
Instrument and developer/s: Adapted from Upton and Lewis (1998)
Place: New South Wales, Australia
Participants (no. of respondents): OTs attending an EBP workshop (n = 67)
Study design (distribution method): Cross-sectional (hard copy)
EBP attributes measured: Knowledge, skills and attitudes related to EBP, use of EBP, perceived barriers to EBP and solutions
Instrument construction: Self-report with 21 items containing tick box or 3-point scale response options, and open-ended questions
Validity: NR
Reliability: NR
Clinical utility: Feedback on wording, layout and response options; 10 minutes to complete

Source: McKenna et al. (2005)
Instrument and developer/s: Questionnaire based on 2 previous surveys (McCluskey (2003))
Place: Australia
Participants (no. of respondents): OTs from all States and territories and those at 95 facilities in Queensland and New South Wales (n = 213)
Study design (distribution method): Cross-sectional (postal)
EBP attributes measured: Use and perceptions of an online database (OTseeker) and its impact on knowledge and practice
Instrument construction: Demographic information and 2 sections: use of OTseeker and its impact on practice; and perceptions of the utility of OTseeker. Response options included dichotomous, multiple fixed-response categories and 10-point rating scales
Validity: NR
Reliability: NR
Clinical utility: Questionnaire piloted on occupational therapists who were not members of OT Australia (n = 3) and minor changes made to improve the clarity of some questions


Table II. (Continued)

Source: Metcalfe et al. (2001)
Instrument and developer/s: Barriers and Attitudes to Research in the Therapies (BART), derived from previous studies (Closs and Lewin, 1998; Metcalfe et al., 2000)
Place: Northern and Yorkshire regions, UK
Participants (no. of respondents): OTs registered with the Council for Professions Supplementary to Medicine (n = 159)
Study design (distribution method): Cross-sectional (postal)
EBP attributes measured: Attitudes and barriers to research
Instrument construction: Self-report questionnaire with 3 sections: perceived importance of research (PIR; 7 questions) and perceived barriers (PB; 22 questions). Scores for sections 2 and 3 ranged from −7 to +7 and −22 to +22, respectively. No further details provided
Validity: Factor analysis revealed 2 factors in the PIR sub-scale and 6 factors in the PB sub-scale
Reliability: Good internal consistency for the perceived importance of research scale (α = 0.63) and high internal consistency for the perceived barriers scale (α = 0.78)
Clinical utility: NR

Source: Pain et al. (2004)
Instrument and developer/s: General Use of Research (adapted from Varcoe and Hilton (1995)); Knowledge Acquisition Survey developed for the study; individual semi-structured interviews
Place: Province in the West, Canada
Participants (no. of respondents): OTs in 2 large urban and 2 rural/small urban areas (n = 58)
Study design (distribution method): Cross-sectional (postal)
EBP attributes measured: Self-rated use of research; availability and use of information sources; use of research in practice
Instrument construction: General Use of Research: 10 self-report items rated on a 4-point scale and summed to give an overall total. Knowledge Acquisition Survey: self-report; no other details stated. Interviews: research use rated on a 7-point scale by participants and interviewer
Validity: Content validity established with peer review (Varcoe and Hilton, 1995); otherwise NR
Reliability: Internal consistency: α = 0.87 (General Use of Research); inter-rater reliability for interview ratings ranged from 0.80 to 0.91; otherwise NR
Clinical utility: NR


Table II. (Continued)

Source: Pain et al. (2004) (continued)
Instrument and developer/s: Edmonton Research Orientation Survey (EROS; Pain et al., 1996)
EBP attributes measured: Self-rated knowledge of research concepts; participation in research and research orientation to practice – the EBP sub-scale indicates research utilization
Instrument construction: Self-report 2-part questionnaire consisting of 38 items rated on a 5-point Likert scale
Validity: Evidence of construct validity: principal component analysis revealed 4 factors; significant relationship between higher scores and higher education levels, and research participation and training; scores correlated with research involvement and training levels; overall mean correlation with other research participation indicators: 0.56
Reliability: Internal consistency: α = 0.93
Clinical utility: NR


Table II. (Continued)

Source: Philibert et al. (2003)
Instrument and developer/s: Questionnaire based on similar studies in other professions (e.g. Kirk et al. (1976); McKee et al. (1987))
Place: United States
Participants (no. of respondents): OTs with AOTA membership in 5 States (n = 328)
Study design (distribution method): Cross-sectional (postal)
EBP attributes measured: Sources of knowledge guiding practice; attitudes towards and use of research in practice
Instrument construction: Self-report questionnaire containing 4 sections. Items included Likert-type scales with varying response options and 2 open-ended questions
Validity: Item-total correlations for attitude items showed a significant relationship to the total score (r ≥ 0.56, p < 0.001; Kirk et al., 1976)
Reliability: Internal consistency for attitudes towards and use of research was 0.89 and 0.78, respectively
Clinical utility: Minor modifications based on pilot study (n = 31) to establish the clarity of questions and opinions about the questionnaire in general

Source: Pollock et al. (2000)
Instrument and developer/s: Questionnaire developed for study
Place: Scotland, UK
Participants (no. of respondents): Stroke rehabilitation professionals (physiotherapists [n = 27], OTs [n = 26], nurses [n = 22], speech and language therapists [n = 6] and other professionals [n = 5])
Study design (distribution method): Cross-sectional (postal)
EBP attributes measured: Confidence in reading and understanding literature, conducting searches and appraisal, attitudes to EBP, perceived facilitators to EBP and reported use of EBP
Instrument construction: Self-report containing 20 statements classified under ‘ability’ (4 statements), ‘opportunity’ (8 statements) and ‘implementation’ (8 statements). Level of agreement with each statement rated on a 5-point Likert scale
Validity: Content validity tested in a pilot study (n = 5). Barriers identified in 4 focus groups were formed into statements by 2 independent assessors, then combined and independently categorized by 4 assessors, followed by a consensus process
Reliability: NR
Clinical utility: NR


Table II. (Continued)

Source: Pomeroy et al. (2003)
Instrument and developer/s: Questionnaire developed for the study
Place: North-West region, UK
Participants (no. of respondents): Physiotherapists (n = 7), nurses (n = 2), OTs (n = 2), speech and language therapists (n = 2)
Study design (distribution method): Cross-sectional (postal)
EBP attributes measured: EBP knowledge and skills (formulating a clinical question, searching for evidence, critical appraisal, synthesizing information, understanding statistics, extracting clinical information from papers and evaluating own practice)
Instrument construction: Self-report on changes in knowledge and skills related to EBP rated on a 5-point scale, and an open-ended question on involvement in EBP
Validity: NR
Reliability: NR
Clinical utility: NR

Source: Powell and Case-Smith (2003)
Instrument and developer/s: Questionnaire developed for study
Place: United States
Participants (no. of respondents): Graduates of an OT Bachelors programme at Ohio State University (n = 85)
Study design (distribution method): Cross-sectional (postal)
EBP attributes measured: Information needs, use of online databases, success finding information, ability to analyse and apply information in practice
Instrument construction: Self-report questionnaire containing tick box, yes/no and open-ended questions
Validity: NR
Reliability: NR
Clinical utility: NR

Source: Powell and Case-Smith (2010)
Instrument and developer/s: As aforementioned
Place: United States
Participants (no. of respondents): Graduates of an OT Masters programme at Ohio State University (n = 43)
Study design (distribution method): Cross-sectional (email)
EBP attributes measured: As for Powell and Case-Smith (2003)
Instrument construction: As for Powell and Case-Smith (2003)
Validity: NR
Reliability: NR
Clinical utility: NR


Table II. (Continued)

Source: Salbach and Jaglal (2011)
Instrument and developer/s: Evidence-based Practice Confidence (EPIC) Scale
Place: Toronto, Canada
Participants (no. of respondents): Qualified OTs
Study design (distribution method): Step-wise process to develop and validate the questionnaire
EBP attributes measured: Confidence in ability to implement EBP (self-efficacy)
Instrument construction: 11-item self-report scale with 11 response options per item (ranging from no confidence [0%] to completely confident [100%])
Validity: Content of scale based on literature, after which face and content validity were evaluated by experts in the field of EBP
Reliability: NR
Clinical utility: Revised scale tested in cognitive interviews to check comprehensibility, after which minor changes were made to wording of instructions and some items

Source: Salls et al. (2009)
Instrument and developer/s: Questionnaire developed for the study from Jette et al. (2003) and Dysart and Tomlin (2002)
Place: United States
Participants (no. of respondents): AOTA members and non-members licensed to practice in Pennsylvania (n = 930)
Study design (distribution method): Cross-sectional (postal)
EBP attributes measured: Attitudes, knowledge and use of EBP
Instrument construction: Four sections: access to EBP resources (5 questions), frequency of using EBP resources (4 questions), knowledge of and attitudes towards EBP (13 questions using a 4-point rating scale) and perceived supports and barriers to EBP (2 questions with unordered response choices)
Validity: Questionnaire developed with input from an expert reviewer
Reliability: NR
Clinical utility: Pilot tested with a convenience sample of occupational therapists (n = 10) and modified based on their feedback; wording of some questions may have been confusing or prompted responses in a particular manner


Table II. (Continued)

Source: Stronge and Cahill (2012)
Instrument and developer/s: Modified Knowledge, Attitude and Behaviour (KAB) questionnaire (Johnston et al., 2003)
Place: Ireland
Participants (no. of respondents): Final year OT students at the 4 universities (n = 86)
Study design (distribution method): Cross-sectional (hard copy)
EBP attributes measured: Self-rated knowledge, attitudes to and future use of EBP
Instrument construction: Self-report containing subjective and objective questions; 4 sub-scales, including knowledge (5 items), attitudes (6 items) and future use of EBP (9 items) rated on a 6-point scale, and 17 additional questions on sources of evidence and demographics
Validity: Construct validity demonstrated through correlations with other EBP measures; concurrent validity demonstrated (Johnston et al., 2003); factor analysis revealed four distinct areas (knowledge, attitudes, personal application and use, and future use)
Reliability: Internal consistency: α = 0.71–0.88; effect size of 0.33 (p < 0.01) for increase in EBP knowledge in 2nd-year medical students at 8 months (Johnston et al., 2003)
Clinical utility: 10 minutes to complete

Source: Sweetland and Craik (2001)
Instrument and developer/s: Questionnaire designed for study
Place: UK
Participants (no. of respondents): OTs registered with the National Association of Neurological Occupational Therapists (NANOT; n = 125)
Study design (distribution method): Cross-sectional (postal)
EBP attributes measured: Frequency of using EBP and factors influencing its use
Instrument construction: Self-report using closed questions with scaled responses and tick box formats; based on information from literature and interviews with occupational therapy stroke experts
Validity: NR
Reliability: NR
Clinical utility: Piloted on NANOT committee members and regional contacts (n = 20), with minor modifications made; recommendations after the study were improvement of wording of some questions and clearer definitions of levels of evidence for frequency of using evidence


Table II. (Continued)

Source: Upton and Lewis (1998)
Instrument and developer/s: Questionnaire on EBP and clinical effectiveness
Place: UK
Participants (no. of respondents): NA
Study design (distribution method): Questionnaire development and validation
EBP attributes measured: Perceived knowledge of EBP and its individual steps, frequency of completing EBP steps, attitudes to EBP, and barriers and solutions to implementing EBP
Instrument construction: Five sections with varied response formats (visual analogue scales, semantic differentials, Likert-type scales) and a section for open comments
Validity: Face validity: high (items informed by literature and clinician interviews); content validity: good (based on expert discussion and pilot studies); criterion validity: no external reference for comparison
Reliability: Test–retest reliability: 0.8–0.92; internal consistency: α = 0.74–0.88
Clinical utility: Feedback from pilot to improve design, clarity of instructions and wording; lengthy

Source: Upton (1999a) *#
Instrument and developer/s: Questionnaire by Upton and Lewis (1998)
Place: Wales, UK
Participants (no. of respondents): Podiatrists (n = 38), OTs (n = 84), physiotherapists (n = 135) and speech therapists (n = 38)
Study design (distribution method): Cross-sectional (postal)
EBP attributes measured: As for Upton and Lewis (1998)
Instrument construction: As for Upton and Lewis (1998)
Validity: As for Upton and Lewis (1998)
Reliability: As for Upton and Lewis (1998)
Clinical utility: As for Upton and Lewis (1998)

Source: Upton (1999b) *#
Instrument and developer/s: As for Upton (1999a)
Place: As for Upton (1999a)
Participants (no. of respondents): As for Upton (1999a)
Study design (distribution method): Cross-sectional (postal)
EBP attributes measured: As for Upton and Lewis (1998)
Instrument construction: As for Upton and Lewis (1998)
Validity: As for Upton and Lewis (1998)
Reliability: As for Upton and Lewis (1998)
Clinical utility: As for Upton and Lewis (1998)

Source: Upton and Upton (2006)
Instrument and developer/s: Questionnaire developed from Upton and Lewis (1998)
Place: UK
Participants (no. of respondents): Allied health professions and health science services including OTs (n = 86)
Study design (distribution method): Cross-sectional (postal)
EBP attributes measured: As for Upton and Lewis (1998)
Instrument construction: Self-report containing items rated with a visual analogue scale, 7-point scale or 5-point scale
Validity: As for Upton and Lewis (1998)
Reliability: As for Upton and Lewis (1998)
Clinical utility: As for Upton and Lewis (1998)


Table II. (Continued)

Source: Waine et al. (1997)
Instrument and developer/s: Edmonton Research Orientation Survey (EROS; Pain et al., 1996)
Place: Alberta, Canada
Participants (no. of respondents): Members of the Alberta Association of Registered OTs (n = 293)
Study design (distribution method): Cross-sectional (postal)
EBP attributes measured: As for Pain et al. (2004); perceived barriers to research; resources available to support research
Instrument construction: As for Pain et al. (2004), with a 3rd section added containing 6 items on perceived barriers to research rated on a 5-point Likert scale and an item on resources available to support research
Validity: Validity data for EROS determined on an earlier version of the scale; no validity data on barriers items
Reliability: Reliability data for EROS determined on an earlier version of the scale; no reliability data on barriers items
Clinical utility: NR

EBP, evidence-based practice; OTs, occupational therapists; NA, not applicable; NR, no data reported.
*Number of OTs not disaggregated from other (non-nursing) professions.
#Articles report different aspects of a single study.


Location of studies

Studies were conducted in nine countries, with two multi-national studies. Most were undertaken in the UK (n = 14), followed by the United States (n = 8), Australia (n = 5), Canada (n = 3) and Sweden (n = 2). One study each was carried out in Ireland, the Netherlands, New Zealand, Puerto Rico and Taiwan.


Aspects of EBP learning measured

Instruments were classified by a researcher using the definitions for learner outcomes (knowledge, skills, attitudes and behaviour) presented at the beginning of this paper. Table III shows each instrument with the attribute/s measured. Most instruments measured EBP behaviour (n = 33) and attitudes (n = 21), while relatively few measured knowledge (n = 8) or skills (n = 3).

Methodological quality of included studies

The quality ratings of included instruments are shown in Table IV. The number of instruments rated per category of the quality rating scale for each measurement property is shown in Table V. In general, there was limited consideration of measurement properties. Eighteen instruments reported no validity or reliability testing at all. Where instruments were modified for a study, the researchers assumed that previously established properties still applied, and many failed to test validity in the study setting.

Only eight of the 34 instruments had three or more properties rated as excellent. These are summarized in Table VI. All eight instruments measured aspects of EBP behaviour, while five measured aspects of knowledge and attitudes. Only one instrument measured EBP skills.

Discussion

Description of instruments

The included papers used 34 instruments, all of them self-report. Two studies incorporated more than one instrument. Nine studies used existing instruments, while the remainder either developed new instruments (n = 15) or modified existing ones to meet the study aims and objectives (n = 16). Four instruments were used in more than one study: the Knowledge Attitudes and Practice of Research (KAP) survey (Van Mullem et al., 1999; two studies), the BARRIERS scale (Funk et al., 1991; two studies), the Edmonton Research Orientation Survey (Pain et al., 1996; three studies) and the questionnaire developed by Powell and Case-Smith (2003; two studies).

This systematic review includes 35 studies that used 34 instruments. In general, testing of the psychometric properties of the included instruments was inadequate. Only nine instruments had at least one aspect of validity and reliability tested. This variability in the quality of instruments was similarly found in a systematic review of instruments measuring EBP knowledge and skills in occupational therapists (Glegg and Holsti, 2010). If instruments have not demonstrated their validity, the study results are questionable because an instrument needs to be reliable and valid to generate accurate results (Kielhofner, 2006). Validity and reliability are interconnected; an unreliable instrument cannot be valid, and a reliable instrument is not necessarily valid because, although it may produce consistent results, it may not be measuring what is intended (Kielhofner, 2006). As expected, the instruments that had undergone more investigation of their psychometric properties were those that were used in more than one study. This finding supports Bowling’s (2009) recommendation to use existing instruments rather than develop new ones. However, as the properties of an instrument are contextually dependent, validity and reliability need to be tested in the setting in which the study will be conducted even when validated and reliable instruments are used without changes (Boynton and Greenhalgh, 2004; Streiner and Norman, 2008). This is particularly important considering that all the instruments used in the included studies were developed and tested in high income countries, which may differ substantially from middle-income and low-income contexts. Many studies modified instruments used in previous studies. It is concerning that where this was carried out, the researchers assumed that the properties of the original instrument still applied. 
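Several of the reliability statistics cited throughout this review reduce to simple computations. As a minimal illustrative sketch, Cronbach's alpha, the internal-consistency coefficient reported for many of the reviewed instruments, can be computed from an item-response matrix; the responses below are invented for illustration and do not come from any included study:

```python
# Illustrative only: Cronbach's alpha for a multi-item Likert scale.
# The response matrix is hypothetical, not data from any reviewed instrument.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array with rows = respondents and columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 6 respondents, 4 items on a 5-point scale
responses = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 4, 3, 3],
])
alpha = cronbach_alpha(responses)
```

With perfectly correlated items alpha equals 1, and it falls as items measure different things; values in the range conventionally read as acceptable to good (roughly 0.7–0.9) match those reported for the EROS and the Upton and Lewis (1998) questionnaire.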
Ideally, a previously validated instrument should be used in its original form, but this may not be possible if the context in which the instrument was developed differs from that in


Table III. Instruments from included studies classified by EBP learner attribute (n = 34) Source and name of instrument (if applicable) Bennett et al. (2003)

EBP knowledge

Skills in EBP

Attitudes to EBP Attitudes Confidence in skills

Bennett et al. (2007)

Caldwell et al. (2007)

Cameron et al. (2005)

Views on relevance and key aspects of EBP Confidence to engage in EBP Value of EBP

Cooke et al. (2008) Curtin and Jaramazovic (2001) Dopp et al. (2012)

Views and perceptions of EBP Attitudes to EBP

Dysart and Tomlin (2002)

Attitudes towards research findings and their relevance to practice, valuing understanding and application of research to practice confidence using databases and the internet and doing critical appraisal

Funk et al. (1991); BARRIERS Scale Gilman (2011)

Gosling and Westbrook (2004)

Graham et al. (2013)

Attitudes to research and EBP Confidence in EBP skills

EBP behaviour Use in practice, perceived barriers to use Use of OTseeker and other databases, perceived ability to find relevant evidence, perceived contribution of OTseeker to changes in practice Use of, and access to, databases, experience of research changing practice Application of EBP to intervention, resources to support EBP Access to, and use of, research information Involvement in EBP activities Access to, and use of, evidence sources, barriers to implementing EBP Access to, and use of EBP resources, use of research to develop or alter treatment plans, facilitators and barriers to EBP

Perceived barriers to research utilization Information needs for practice, sources of evidence-based information used, use of online databases, success finding information, perceived ability to analyse and apply information in practice Use of an online evidence database Success finding evidence, impact of database on practice, perceived barriers to using the database Sources of evidence used Barriers and facilitators to EBP


Table III. (Continued) Source and name of instrument (if applicable) Heiwe et al. (2011)

EBP knowledge

Skills in EBP

Knowledge of information sources

Hu et al. (2012) Humphris et al. (2000)

Karlsson and Törnquist (2007)

McCluskey (2003)

McKenna et al. (2005)

Metcalfe et al. (2001); BART Pain et al. (1996); EROS

Self-rated EBP knowledge

Self-rated EBP skills

EBP behaviour Access to, and use of, evidence sources, applying practice guidelines

Perceptions and attitudes towards research-related activities as part of OT Perceived ability to perform research-related activities Attitudes to EBP Confidence in knowledge and skills related to EBP Perceptions of OTseeker

Perceived increase in knowledge as a result of the information on OTseeker (online database)

Perceived importance of research Self-rated knowledge of research concepts

Pain et al. (2004); adapted General Use of Research Pain et al. (2004); Knowledge Acquisition Survey Pain et al. (2004); Individual semi-structured interviews Philibert et al. (2003)

Involvement in EBP activities Accessing and implementing research to practice, barriers and facilitators to EBP Current and intended future involvement in research-related activities, barriers to engaging in research-related activities Frequency of completing EBP steps, Perceived barriers and solutions to EBP Access to and reasons for using OTseeker, information used in clinical decision-making, change in practice from using OTseeker Perceived barriers to research Research utilization Self-rated use of research Availability and use of information sources Use of research in practice

Attitudes towards research in practice

Pollock et al. (2000)

Pomeroy et al. (2003)

Attitudes to EBP Attitudes and beliefs towards EBP Perceived benefits and limitations of EBP Confidence in finding and appraising literature relevant to practice Views and perceptions of EBP Attitudes towards research and its use

Attitudes to EBP Confidence in searching for, reading, understanding and appraising literature Perceived improvement in EBP skills

Sources of knowledge to guide practice, use of research in practice Involvement in EBPrelated activities, reported use of EBP, perceived barriers to EBP Involvement in EBP Use of EBP


Table III. (Continued) Source and name of instrument (if applicable)

EBP knowledge

Skills in EBP

Attitudes to EBP

Powell and Case-Smith (2003, 2010) — EBP behaviour: information needs for practice, sources of evidence-based information used, use of online databases, success finding information, perceived ability to analyse and apply information in practice

Salbach and Jaglal (2011); EPIC Scale — Skills in EBP: confidence in ability to implement EBP (self-efficacy)

Salls et al. (2009) — Attitudes to EBP: attitudes to EBP; confidence finding and appraising relevant research. EBP behaviour: involvement in EBP-related activities, use of evidence in decision-making, factors affecting use of EBP

Stronge and Cahill (2012); modified KAB Questionnaire (Johnston et al., 2003) — EBP knowledge: self-rated knowledge. Attitudes to EBP: attitudes to EBP. EBP behaviour: current and future use of EBP; barriers to EBP

Sweetland and Craik (2001) — EBP behaviour: frequency of using EBP, factors influencing EBP use

Upton and Lewis (1998) — EBP knowledge: perceived knowledge of EBP and its individual steps. Attitudes to EBP: attitudes to EBP. EBP behaviour: frequency of completing EBP steps, barriers and solutions to implementing EBP

Upton and Upton (2006); modified Upton and Lewis questionnaire — EBP knowledge: perceived knowledge of EBP. Skills in EBP: perceived skills in EBP. EBP behaviour: frequency of completing EBP steps, likelihood of acting on evidence from different sources, barriers to implementing EBP

Van Mullem et al. (1999); KAP Survey — EBP knowledge: knowledge of activities related to utilizing research. Attitudes to EBP: willingness to engage in activities related to utilizing research. EBP behaviour: ability to perform activities related to utilizing research

Total — EBP knowledge: 8; Skills in EBP: 3; Attitudes to EBP: 21; EBP behaviour: 33

EBP, evidence-based practice; BART, Barriers and Attitudes to Research in the Therapies; EPIC, Evidence-based Practice Confidence; EROS, Edmonton Research Orientation Scale; KAB, Knowledge, Attitudes, Behaviour; KAP, Knowledge Attitudes and Practice of Research.

which the study is to be conducted (Boynton and Greenhalgh, 2004). While modifications to language and wording may be needed to ensure an instrument is valid and reliable in a new context, if instruments are changed in any way, further validity and reliability testing should be carried out (Corr and Siddons, 2005). New instruments were developed for a number of the included studies despite caution from several authors about the expertise required, the costs and time involved, and the difficulty of making comparisons across studies (Bowling, 2009; Streiner and Norman, 2008). This was most likely due to the lack of instruments available at the time. A systematic review identified 104 instruments measuring EBP learner attributes (Shaneyfelt et al., 2008), none of which had been used with occupational therapists, suggesting a need to identify appropriate instruments for this group. The variety of instruments used in occupational therapy surveys of EBP makes comparisons across studies and across contexts difficult, if not impossible. In 2008, Shaneyfelt et al. (p 1125) concluded that ‘the science of evaluating EBP attitudes and behaviours


Table IV. Methodological quality of included instruments (n = 34) Source and name of instrument (if applicable)

Internal consistency

Reliability

Content validity

Structural validity

Hypothesis testing

Cross-cultural validity

Clinical utility

Bennett et al. (2003) Bennett et al. (2007) Caldwell et al. (2007) Cameron et al. (2005) Cooke et al. (2008) Curtin and Jaramazovic (2001) Dopp et al. (2012) Dysart and Tomlin (2002) Funk et al. (1991) – BARRIERS Scale Gilman (2011) Gosling and Westbrook (2004) Graham et al. (2013) Heiwe et al. (2011) Hu et al. (2012) Humphris et al. (2000) Karlsson and Törnquist (2007) McCluskey (2003) McKenna et al. (2005) Metcalfe et al. (2001) – BART Pain et al. (1996) – EROS Pain et al. (2004) – adapted General Use of Research Pain et al. (2004) – Knowledge Acquisition Survey Pain et al. (2004) – individual semi-structured interviews Philibert et al. (2003) Pollock et al. (2000) Pomeroy et al. (2003) Powell and Case-Smith (2003, 2010) Salbach and Jaglal (2011) – EPIC Scale Salls et al. (2009) Stronge and Cahill (2012) – modified KAB Questionnaire (Johnston et al., 2003) Sweetland and Craik (2001) Upton and Lewis (1998) Upton and Upton (2006) – modified Upton and Lewis (1998) questionnaire Van Mullem et al. (1999) – KAP Survey

U U U U U U E U E

U U U U U U U U P

P U U E U U F U E

U U U E U U U U E

U U U U U U U U U

NA NA NA NA NA NA NA NA NA

G F F U P G F F U

U U U U U U E U U E E E

U U U U U U U U U U U U

U U U U U U U U U E E U

U U U U U U U U U E E U

U U U U U U U U U U G U

NA NA NA NA NA NA U NA NA NA NA NA

U F F U U U F E F U U U

U

U

U

U

U

NA

U

U

E

U

U

U

NA

U

E U U U

U U U U

E E U U

U U U U

E U U U

NA NA NA NA

F U U U

U

U

E

U

U

NA

F

U E

U U

U U

U E

U E

NA NA

F F

U E E

U E E

U E E

U U U

U U U

NA NA NA

F G U

E

E

E

E

U

NA

F

Scoring: E = excellent; G = good; F = fair; P = poor; NA = not applicable; U = unclear. BART, Barriers and Attitudes to Research in the Therapies; EPIC, Evidence-based Practice Confidence; EROS, Edmonton Research Orientation Scale; KAB, Knowledge, Attitudes, Behaviour; KAP, Knowledge Attitudes and Practice of Research.

continues to lag behind the evaluation of knowledge and skills’. They found that the highest proportion of instruments measured skills (57%), followed by knowledge and behaviour (39/104, 38% each), with attitudes the lowest (27/104, 26%; Shaneyfelt et al., 2008). This trend seems to have changed, however, as the highest number of instruments in the current review measured behaviour (33/34), followed by attitudes (21/34), knowledge (8/34) and skills (3/34). In fact, the number of instruments measuring


Table V. Number of instruments by quality rating for each psychometric property (n = 34)

Internal consistency: poor 0, fair 0, good 0, excellent 11, not applicable 0, unclear 23
Reliability: poor 1, fair 0, good 0, excellent 4, not applicable 0, unclear 29
Content validity: poor 1, fair 1, good 0, excellent 10, not applicable 0, unclear 22
Structural validity: poor 0, fair 0, good 0, excellent 6, not applicable 0, unclear 28
Hypothesis testing: poor 0, fair 0, good 1, excellent 2, not applicable 0, unclear 31
Cross-cultural validity: poor 0, fair 0, good 0, excellent 0, not applicable 33, unclear 1
Clinical utility: poor 1, fair 14, good 3, excellent 1, not applicable 0, unclear 15
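As a quick consistency check on Table V, each psychometric property's six rating counts should sum to the 34 included instruments, since every instrument receives exactly one rating per property. A minimal Python sketch (counts transcribed from the table; the column-to-property mapping is inferred from the table's layout):

```python
# Quality-rating counts per psychometric property, transcribed from Table V.
# Order of counts: poor, fair, good, excellent, not applicable, unclear.
TABLE_V = {
    "internal consistency":    (0, 0, 0, 11, 0, 23),
    "reliability":             (1, 0, 0, 4, 0, 29),
    "content validity":        (1, 1, 0, 10, 0, 22),
    "structural validity":     (0, 0, 0, 6, 0, 28),
    "hypothesis testing":      (0, 0, 1, 2, 0, 31),
    "cross-cultural validity": (0, 0, 0, 0, 33, 1),
    "clinical utility":        (1, 14, 3, 1, 0, 15),
}

for prop, counts in TABLE_V.items():
    # Each column must account for all 34 instruments.
    assert sum(counts) == 34, f"{prop} does not sum to 34"
```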

Table VI. Recommended instruments from occupational therapy studies by EBP learner attribute (n = 8)

Instruments: Funk et al. (1991), BARRIERS Scale; Metcalfe et al. (2001), BART; Pain et al. (1996), EROS; Philibert et al. (2003); Stronge and Cahill (2012), modified KAB Questionnaire (Johnston et al., 2003); Upton and Lewis (1998); Upton and Upton (2006), modified Upton and Lewis (1998) questionnaire; Van Mullem et al. (1999), KAP Survey.

EBP knowledge (5 instruments): self-rated knowledge of research concepts; self-rated knowledge; perceived knowledge of EBP and its individual steps; perceived knowledge of EBP; knowledge of activities related to utilizing research.

Skills in EBP (1 instrument): perceived skills in EBP.

Attitudes to EBP (5 instruments): perceived importance of research; attitudes towards research in practice; attitudes to EBP (two instruments); willingness to engage in activities related to utilizing research.

EBP behaviour (8 instruments): perceived barriers to research utilization; perceived barriers to research; research utilization; sources of knowledge to guide practice; use of research in practice; current and future use of EBP; barriers to EBP; frequency of completing EBP steps (two instruments); barriers and solutions to implementing EBP; likelihood of acting on evidence from different sources; barriers to implementing EBP; ability to perform activities related to utilizing research.

[Instrument-to-attribute cell alignment is not recoverable from this extraction.]

EBP, evidence-based practice; BART, Barriers and Attitudes to Research in the Therapies; EROS, Edmonton Research Orientation Scale; KAB, Knowledge, Attitudes, Behaviour; KAP, Knowledge Attitudes and Practice of Research.

In fact, the number of instruments measuring knowledge and skills was limited. It must, however, be considered that the current review included only instruments used with occupational therapists, which may not reflect the scope of instruments available for health professionals in general. A development since the review of Shaneyfelt et al. is an instrument for measuring self-efficacy (Salbach and Jaglal, 2011),

which has increasingly been recognized as having a strong impact on the implementation of EBP. Developments such as this demonstrate the continuing evolution of the science of measurement related to EBP implementation. The eight instruments rated as ‘excellent’ for at least three measurement properties are recommended as the most robust instruments currently

Occup. Ther. Int. 23 (2016) 59–90 © 2015 John Wiley & Sons, Ltd.


available for measuring EBP learner attributes in occupational therapy studies. The summary of these instruments provided in Table VI is useful for selecting the most appropriate instrument currently available for each EBP learner attribute.

Self-report was used in all included instruments in the current systematic review. Although self-reports are useful for determining certain aspects, such as attitudes to EBP, objective measurement is important to obtain a true reflection of the actual situation. In self-reports, respondents may over-estimate their involvement to portray themselves in a positive way, thereby providing inaccurate information (McDowell, 2006). In addition, the way in which respondents interpret rating scales may introduce bias, for example, if they prefer to select answers on the end-points of the scale rather than in the middle (McDowell, 2006). Objective measures are thus required to reduce response bias.

The strengths of this systematic review include the clear description of the search and selection processes, the use of multiple databases to ensure the identification of all published studies and the screening of papers by two independent reviewers to reduce bias (Higgins and Altman, 2008). Furthermore, the summary tables provide information to facilitate closer examination of the instruments based on their descriptive details (for example, study location and data collection method), the constructs measured and their psychometric properties.

Difficulties encountered with the current review echoed those of previous researchers; namely, a variety of concepts were used for EBP learner outcomes (Glegg and Holsti, 2010), researchers failed to define the constructs measured (Glegg and Holsti, 2010) and varying definitions were used for EBP learner outcomes (Nabulsi et al., 2007). These difficulties made it challenging to classify instruments.
To ensure consistency, the definitions of the different EBP learner attributes outlined at the beginning of this paper were used to classify instruments rather than the purpose stated by the authors. However, not all papers included the instrument, which made classification unclear in some cases. For these papers, the purpose stated in the paper was used. This may have resulted in some instruments, or sections thereof, being classified incorrectly. Ideally, copies of all instruments should have been obtained, examined and classified by two independent reviewers. Although one paper had to be excluded as the full text was not available, information on the instrument was reported in three papers included in the review. It is therefore unlikely that information was missed. Searching only for instruments used with occupational therapists may have failed to identify those used in studies of rehabilitation or other health professionals that included occupational therapists but did not disaggregate the data by professional group.

Conclusion

This systematic review identified 34 instruments that have been used in surveys to measure knowledge, skills, attitudes and behaviour related to EBP in occupational therapy. Most instruments measured EBP behaviour (33/34), followed by attitudes (21/34), knowledge (8/34) and skills (3/34). All instruments were self-reports, highlighting the need to develop objective measures that can quantify observed behaviour. Although a number of instruments are available, most have been developed or modified for particular studies, and few have undergone sufficient validity and reliability testing. Furthermore, no instruments were identified that have been used in studies conducted in middle-income or low-income countries. The review identified eight instruments with sufficiently robust properties to assess EBP learner outcomes.

Recommendations

The occupational therapy profession should consider adopting a core set of instruments for assessing EBP learner outcomes and encourage their use in research across contexts. These instruments should have cross-cultural applicability and be subject to a rigorous validation process. It would also be useful if they could be applied across health professions to enable comparisons across professional groups and contexts. Attention should be given to developing instruments that measure EBP learner attributes objectively, with priority given to instruments that measure knowledge and skills. Research papers should include definitions of EBP learner attributes to enable comparison across studies. To this end, a standard set of definitions of EBP learner attributes should be devised. Lastly, as data from middle-income and low-income countries are noticeably absent from the literature, validation studies of survey instruments should be undertaken in these contexts so that a global view of EBP knowledge, skills, attitudes and behaviour among occupational therapists and students may be obtained.

Acknowledgements

The first author would like to acknowledge the assistance of Professor Emeritus Meredith Harris (Northeastern University, Boston, and Mellon Scholar, University of Cape Town, South Africa) in the screening and selection of eligible papers for this review. This study was supported by the National Research Foundation and the University Research Committee of the University of Cape Town, South Africa (TTK2006041900018).

REFERENCES

Bannigan K (2011). A global approach to evidence based occupational therapy: what progress has been made since 2006? WFOT Bulletin 64: 4–6. DOI:10.1179/otb.2011.64.1.002.
Bennett S, McKenna K, Hoffmann T, Tooth L, McCluskey A, Strong J (2007). The value of an evidence database for occupational therapists: an international online survey. International Journal of Medical Informatics 76: 507–513.
Bennett S, Tooth L, McKenna K, Rodger S, Strong J, Ziviani J, Mickan S, Gibson L (2003). Perceptions of evidence-based practice: a survey of Australian occupational therapists. Australian Occupational Therapy Journal 50: 13–22.
Bowling A (2009). Research Methods in Health. Investigating Health and Health Services (3rd edn). Berkshire: Open University Press.
Boynton P, Greenhalgh T (2004). Selecting, designing, and developing your questionnaire. British Medical Journal 328: 1312–1315. DOI:10.1136/bmj.328.7451.1312.
Brown T, Tseng MH, Casey J, McDonald R, Lyons C (2009). Knowledge, attitudes, practices and barriers of pediatric occupational therapists to evidence-based practice and research utilization. WFOT Bulletin 60: 38–48.
Brown T, Tseng MH, Casey J, McDonald R, Lyons C (2010a). Predictors of research utilization among pediatric occupational therapists. OTJR: Occupation, Participation and Health 30: 172–183. DOI:10.3928/15394492-20091022-01.
Brown T, Tseng MH, Casey J, McDonald R, Lyons C (2010b). Research knowledge, attitudes, and practices of pediatric occupational therapists in Australia, the United Kingdom, and Taiwan. Journal of Allied Health 39: 88–94.
Buchanan H (2011). The uptake of evidence-based practice by occupational therapists in South Africa. WFOT Bulletin 64: 29–38. DOI:10.1179/otb.2011.64.1.008.
Caldwell K, Coleman K, Copp G, Bell L, Ghazi F (2007). Preparing for professional practice: how well does professional training equip health and social care practitioners to engage in evidence-based practice? Nurse Education Today 27: 518–528.
Cameron KAV, Ballantyne S, Kulbitsky A, Margolis-Gal M, Daugherty T, Ludwig F (2005). Utilization of evidence-based practice by registered occupational therapists. Occupational Therapy International 12: 123–136.
Closs S, Lewin B (1998). Perceived barriers to research utilization: a survey of four therapies. British Journal of Therapy and Rehabilitation 5: 151–155. DOI:10.1268/bjtr.1998.5.3.14095.
Cooke J, Bacigalupo R, Halladay L, Norwood H (2008). Research use and support needs, and research activity in social care: a cross-sectional survey in two councils with social services responsibilities in the UK. Health and Social Care In The Community 16: 538–547. DOI:10.1111/j.1365-2524.2008.00776.x.
Corr S, Siddons L (2005). An introduction to the selection of outcome measures. British Journal of Occupational Therapy 68: 202–206. DOI:10.1177/030802260506800503.
Curtin M, Jaramazovic E (2001). Occupational therapists’ views and perceptions of evidence-based practice. British Journal of Occupational Therapy 64: 214–222.
Dopp CM, Steultjens EM, Radel J (2012). A survey of evidence-based practise among Dutch occupational therapists. Occupational Therapy International 19: 17–27.
Dysart AM, Tomlin GS (2002). Factors related to evidence-based practice among U.S. occupational therapy clinicians. American Journal Of Occupational Therapy 56: 275–284.
Eckerling S, Bergman R, Bar-Tal Y (1988). Perceptions and attitudes of academic nursing students to research. Journal of Advanced Nursing 13: 759–767.
Ehrenfeldt M, Eckerling S (1991). Perceptions and attitudes of registered nurses to research. Journal of Advanced Nursing 16.
Eller L, Kleber E, Wang S (2003). Research knowledge, attitudes and practices of health professionals. Nursing Outlook 51: 165–170. DOI:10.1016/s0029-6554(03)0012-x.
Funk S, Wiese R, Champagne M, Tornquist E (1991). BARRIERS: the barriers to research utilization scale. Applied Nursing Research 4: 39–45. DOI:10.1016/s0897-1897(05)80052-7.
Gilman IP (2011). Evidence-based information-seeking behaviors of occupational therapists: a survey of recent graduates. Journal Of The Medical Library Association 99: 307–310. DOI:10.3163/1536-5050.99.4.009.
Glegg S, Holsti L (2010). Measures of knowledge and skills for evidence-based practice: a systematic review. Canadian Journal of Occupational Therapy 77: 219–232.
Gosling A, Westbrook J (2004). Allied health professionals’ use of online evidence: a survey of 790 staff working in the Australian public hospital system. International Journal of Medical Informatics 73: 391–401.
Graham F, Robertson L, Anderson J (2013). New Zealand occupational therapists’ views on evidence-based practice: a replicated survey of attitudes, confidence and behaviours. Australian Occupational Therapy Journal 60: 120–128. DOI:10.1111/1440-1630.12000.
Heiwe S, Kajermo KN, Tyni-Lenne R, Guidetti S, Samuelsson M, Andersson IL, Wengstrom Y (2011). Evidence-based practice: attitudes, knowledge and behaviour among allied health care professionals. International Journal of Quality in Health Care 23: 198–209. DOI:10.1093/intqhc/mzq083.
Higgins J, Altman D (2008). Assessing risk of bias in included studies. In: Higgins J, Green S (eds). Cochrane Handbook for Systematic Reviews of Interventions. Version 5.0.1. Chichester: The Cochrane Collaboration (pp. 8.1–8.50).
Hu D, Burke JP, Thomas A (2012). Occupational therapists’ involvement, views, and training needs of evidence-based practice: a rural perspective. International Journal of Therapy and Rehabilitation 19: 618–628.
Humphris D, Littlejohns P, Victor C, O’Halloran P, Peacock J (2000). Implementing evidence-based practice: factors that influence the use of research evidence by occupational therapists. British Journal of Occupational Therapy 63: 516–522.
Illott I, Taylor M, Bolanos C (2006). Evidence-based occupational therapy: it’s time to take a global approach. British Journal of Occupational Therapy 69: 38–41. DOI:10.1177/030802260606900107.
Jette D, Bacon K, Batt C, Carlson M, Ferland A, Hemingway R (2003). Evidence-based practice: beliefs, attitudes, knowledge, and behaviors of physical therapists. Physical Therapy 83: 786–802.
Johnston J, Leung G, Fielding R, Tin K, Ho L (2003). The development and validation of a knowledge, attitude and behaviour questionnaire to assess undergraduate evidence-based practice teaching and learning. Medical Education 37: 992–1000.
Karlsson U, Törnquist K (2007). What do Swedish occupational therapists feel about research? A survey of perceptions, attitudes, intentions, and engagement. Scandinavian Journal of Occupational Therapy 14: 221–229.
Kielhofner G (2006). Developing and evaluating quantitative data collection instruments. In: Kielhofner G (ed). Research in Occupational Therapy. Methods of Inquiry for Enhancing Practice. Philadelphia: F.A. Davis Company (pp. 155–176).
Kirk S, Osmalov M, Fischer J (1976). Social workers’ involvement in research. Social Work 21: 121–124.
Lyons C, Brown T, Tseng MH, Casey J, McDonald R (2011). Evidence-based practice and research utilisation: perceived research knowledge, attitudes, practices and barriers among Australian paediatric occupational therapists. Australian Occupational Therapy Journal 58: 178–186. DOI:10.1111/j.1440-1630.2010.00900.x.
Lyons C, Casey J, Brown T, Tseng M, McDonald R (2010). Research knowledge, attitudes, practices and barriers among paediatric occupational therapists in the United Kingdom. British Journal of Occupational Therapy 73: 200–209. DOI:10.4276/030802210X12734991664147.
McCluskey A (2003). Occupational therapists report on low level of knowledge, skill and involvement in evidence-based practice. Australian Occupational Therapy Journal 50: 3–12.
McColl A, Smith H, White P, Field J (1998). General practitioners’ perceptions of the route to evidence based medicine: a questionnaire survey. British Medical Journal 316: 361–365.
McDowell I (2006). Measuring Health. A Guide to Rating Scales and Questionnaires. New York: Oxford University Press.
McKee W, Witt J, Elliot S, Pardue M, Judycki A (1987). Practice informing research: a survey of research dissemination and knowledge utilization. School of Psychology Review 16: 338–347.
McKenna K, Bennett S, Dierselhuis Z, Hoffmann T, Tooth L, McCluskey A (2005). Australian occupational therapists’ use of an online evidence-based practice database (OTseeker). Health Information And Libraries Journal 22: 205–214.
Menon A, Korner-Bitensky N, Kastner M, McKibbon K, Straus S (2009). Strategies for rehabilitation professionals to move evidence-based knowledge into practice: a systematic review. Journal of Rehabilitation Medicine 41: 1024–1032.
Metcalfe C, Lewin R, Closs J, Hughes C, Perry S, Wright J (2000). Perceived barriers to conducting research in the NHS: a survey of four therapies. British Journal of Therapy and Rehabilitation 7: 168–175.
Metcalfe C, Lewin R, Wisher S, Perry S, Bannigan K, Moffett J (2001). Barriers to implementing the evidence base in four NHS therapies: dietitians, occupational therapists, physiotherapists, speech and language therapists. Physiotherapy 87: 433–441.
Moher D, Liberati A, Tetzlaff J, Altman D, The PRISMA Group (2009). Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Medicine 6: 336–341. DOI:10.1371/journal.pmed.1000097.
Nabulsi M, Harris J, Letelier L, Ramos K, Hopayian K, Parkin C, Porzsolt F, Sestini P, Slavin M, Summerskill W (2007). Effectiveness of education in evidence-based healthcare: the current state of outcome assessments and a framework for future evaluations. International Journal of Evidence-Based Healthcare 5: 468–476. DOI:10.1097/01258363-200712000-00008.
Pain K, Hagler P, Warren S (1996). Development of an instrument to evaluate the research orientation of clinical professionals. Canadian Journal of Rehabilitation 9: 93–100.
Pain K, Magill-Evans J, Darrah J, Hagler P, Warren S (2004). Effects of profession and facility type on research utilization by rehabilitation professionals. Journal of Allied Health 33: 3–9.
Parahoo K (2000). Barriers to, and facilitators of research utilization among nurses in Northern Ireland. Journal of Advanced Nursing 31: 89–98.
Pettingill M, Gillies D, Chambers Clark C (1994). Factors encouraging and discouraging the use of nursing research findings. IMAGE: Journal of Nursing Scholarship 26: 143–147.
Philibert D, Snyder P, Judd D, Windsor M (2003). Practitioners’ reading patterns, attitudes, and use of research reported in occupational therapy journals. American Journal of Occupational Therapy 57: 450–458.
Pollock A, Legg L, Langhorne P, Sellars C (2000). Barriers to achieving evidence-based stroke rehabilitation. Clinical Rehabilitation 14: 611–617.
Pomeroy V, Tallis R, Stitt E (2003). Dismantling some barriers to evidence-based rehabilitation with ‘hands-on’ clinical research secondments. Physiotherapy 89: 266–275.
Powell C, Case-Smith J (2003). Information literacy skills of occupational therapy graduates: a survey of learning outcomes. Journal Of The Medical Library Association 91: 468–477.
Powell C, Case-Smith J (2010). Information literacy skills of occupational therapy graduates: promoting evidence-based practice in the MOT curriculum. Medical Reference Services Quarterly 29: 363–380.
Salbach NM, Jaglal SB (2011). Creation and validation of the evidence-based practice confidence scale for health care professionals. Journal of Evaluation in Clinical Practice 17: 794–800. DOI:10.1111/j.1365-2753.2010.01478.x.
Salls J, Dolhi C, Silverman L, Hansen M (2009). The use of evidence-based practice by occupational therapists. Occupational Therapy in Health Care 23: 134–145.
Shaneyfelt T, Baum K, Bell D, Feldstein D, Houston T, Kaatz S, Whelan C, Green M (2008). Instruments for evaluating education in evidence-based practice. A systematic review. Journal of the American Medical Association 296: 1116–1127. DOI:10.1001/jama.296.9.1116.
Streiner D, Norman G (2008). Health Measurement Scales. A Practical Guide to Their Development and Use. Oxford: Oxford University Press.
Stronge M, Cahill M (2012). Self-reported knowledge, attitudes and behaviour towards evidence-based practice of occupational therapy students in Ireland. Occupational Therapy International 19: 7–16. DOI:10.1002/oti.328.
Sweetland J, Craik C (2001). The use of evidence-based practice by occupational therapists who treat adult stroke patients. British Journal of Occupational Therapy 64: 256–260.
Terwee C, Mokkink L, Knol D, Ostelo R, Bouter L, de Vet H (2012). Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist. Quality of Life Research 21: 651–657. http://www.ncbi.nlm.nih.gov/pubmed/21732199 DOI:10.1007/s11136-011-9960-1.
Upton D (1999a). Clinical effectiveness and EBP 2: attitudes of health-care professionals. British Journal of Therapy and Rehabilitation 6: 26–30.
Upton D (1999b). Clinical effectiveness and EBP 3: application by health-care professionals. British Journal of Therapy and Rehabilitation 6: 86–90.
Upton D, Lewis B (1998). Clinical effectiveness and EBP: design of a questionnaire. British Journal of Therapy and Rehabilitation 5: 647–650. DOI:10.12968/bjtr.1998.5.12.14028.
Upton D, Upton P (2006). Knowledge and use of evidence-based practice by allied health and health science professionals in the United Kingdom. Journal of Allied Health 35: 127–133.
Van Mullem C, Burke L, Dohmeyer K, Farrell M, Harvey S, John L, Kraly C, Rowley F, Sebern M, Twite K, Zapp R (1999). Strategic planning for research use in nursing practice. Journal of Nursing Administration 28: 39–45. DOI:10.1097/00005110-199912000-00008.
Varcoe C, Hilton A (1995). Factors affecting acute-care nurses’ use of research findings. Canadian Journal of Nursing Research 27: 51–71.
Waine M, Magill-Evans J, Pain K (1997). Alberta occupational therapists’ perspectives on and participation in research. Canadian Journal of Occupational Therapy 64: 82–88.
Wojtczak A (2002). Glossary of medical education terms: part 3. Medical Teacher 24: 450–453. DOI:10.1080/0142159021000000861.
