METHODOLOGY PAPER

A mixed-methods research approach to the review of competency standards for orthotist/prosthetists in Australia

Susan Ash PhD, MHP, DipNutDiet, BSc,1 Jackie O'Connor BPO (Hons),2 Sarah Anderson MPH, BPO,3 Emily Ridgewell PhD, BPO (Hons)2 and Leigh Clarke MPH, BPO (Hons)2

1Queensland University of Technology, Brisbane, Queensland, 2Australian Orthotic Prosthetic Association, Greythorn, and 3National Centre for Prosthetics and Orthotics, La Trobe University, Bundoora, Victoria, Australia

ABSTRACT

Aim: The requirement for an allied health workforce is expanding as the global burden of disease increases internationally. To safely meet the demand for an expanded workforce of orthotist/prosthetists in Australia, competency-based standards that are up-to-date and evidence-based are required. The aims of this study were to determine the minimum level for entry into the orthotic/prosthetic profession; to develop entry-level competency standards for the profession; and to validate the developed entry-level competency standards within the profession nationally, using an evidence-based approach.

Methods: A mixed-methods research design was applied, using a three-step sequential exploratory design, where step 1 involved collecting and analyzing qualitative data from two focus groups; step 2 involved exploratory instrument development and testing, developing the draft competency standards; and step 3 involved quantitative data collection and analysis – a Delphi survey. In stage 1 (steps 1 and 2), the two focus groups – an expert and a recent graduate group of Australian orthotist/prosthetists – were led by an experienced facilitator to identify gaps in the current competency standards and then to outline a key purpose, and work roles and tasks, for the profession. The resulting domains and activities of the first draft of the competency standards were synthesized using thematic analysis. In stage 2 (step 3), the draft competency standards were circulated to a purposive sample of the membership of the Australian Orthotic Prosthetic Association, using three rounds of Delphi survey. A project reference group of orthotist/prosthetists reviewed the results of both stages.

Results: In stage 1, the expert (n = 10) and the new graduate (n = 8) groups separately identified work roles and tasks, which formed the initial draft of the competency standards. Further drafts were refined and performance criteria added by the project reference group, resulting in the final draft competency standards. In stage 2, the final draft competency standards were circulated to 56 members of the Association (n = 44 in the final round), who agreed on the key purpose, 6 domains, 18 activities, and 68 performance criteria of the final competency standards.

Conclusion: This study outlines a rigorous and evidence-based mixed-methods approach for developing and endorsing professional competency standards, which is representative of the views of the profession of orthotist/prosthetists.

Key words: competency, Delphi survey, professional standards, prosthetics and orthotics, qualitative methods

Int J Evid Based Healthc 2015; 13:93–103.

Correspondence: Susan Ash, PhD, MHP, DipNutDiet, BSc, Queensland University of Technology, Brisbane, Queensland, Australia. E-mail: [email protected]/[email protected]
Received 12 August 2014; revised 2 November 2014; accepted 25 November 2014
DOI: 10.1097/XEB.0000000000000038

Background

The WHO World Report on Disability recommends that rehabilitation services are multidisciplinary, effective, and accessible to those with disabilities. Approximately 15% of the world's population experiences some form of disability. The disabling barriers include not only lack of services for healthcare and


rehabilitation but also poor coordination of services, inadequate staffing, and weak staff competencies.1 Orthotist/prosthetists are tertiary-qualified allied health professionals who clinically assess the physical and functional attributes of individuals with mobility and functional limitations. These limitations may result from illness, injury, and/or disability, including limb amputations. Orthotist/prosthetists may then prescribe and facilitate the provision of orthoses and prostheses to minimize the effect of these limitations. Australia currently has an orthotist/prosthetist workforce of approximately 400 practicing clinicians, which represents a low ratio per capita (1/56 552 of the population),2 compared with the UK benchmark for orthotists alone of 1/30 555 of the population.3 Health workforce shortages have been recognized around the globe; however, simply training more of the same type of health workforce will not meet the needs of complex health environments. The WHO has recommended cooperation between educational institutions and health-employing agencies to match professional education to health service delivery.4 One of the key deliverables from the Australian Health Workforce Strategic Reform for Action is to create an adaptable health workforce that is equipped with the competencies to provide team-based and collaborative models of care.5 The Australian Orthotic Prosthetic Association (AOPA) developed its original competency standards in 1999, with small changes amended in 2003;6 however, neither process involved endorsement by the membership. Training for the profession is focused in one university and the number of graduates from this source is not likely to meet the current or future workforce demands created by an increasingly ageing population and increased rates of chronic disease. Up-to-date competency standards are thus required to describe 21st Century practice of Australian orthotist/prosthetists and to ensure any internationally trained orthotist/prosthetists, wishing to fill workforce gaps and practise in Australia, have met transparent standards. Mixed-methods research can address exploratory and confirmatory questions using both qualitative and quantitative approaches.7 Narrative interviews and guided focus groups have been used in health professions to explore themes around professional competencies.7–10 Thematic analysis of interview data can provide a rigorous but flexible method for undertaking qualitative research that is methodologically sound.11 Braun and Clarke11 suggest that using written transcripts from initial data collection to assign meanings or codes; clustering these meanings into themes; integrating the results into an exhaustive map of the analysis; and

then validating the results with extracts and analysis from the relevant literature provide such a rigorous approach. The Delphi process is used to determine the level of consensus on an issue and uses a quantitative approach. It attempts to overcome the disadvantage of domination of the group by individuals or those with a vested interest. The extent to which each respondent agrees with the issue is measured on a numerical or categorical scale. Surveys are generated to explore the issue involved over numerous rounds – usually three – with participants anonymously rating their level of agreement.12 The first round of the Delphi often involves experts providing their opinions on a specific matter and the subsequent rounds providing the level of agreement. Critics of this method point out that agreement does not necessarily mean that the correct interpretation has occurred. Despite these reservations, the Delphi process has been used globally in health professional competency development, including nursing13 and nutrition and dietetics.14,15 The reactive Delphi is a variation of the traditional Delphi in which information is presented directly into the first round of the survey. Whilst the reactive Delphi may constrain initial opinion,16 this technique is often utilized subsequent to previous research.13,16,17 The aim of this study is to describe the mixed-methods research methodology used by AOPA to determine the minimum level for entry into the orthotic/prosthetic profession in Australia; to develop entry-level competency standards for the profession; and to validate the developed entry-level competency standards within the profession nationally.

Methods

A mixed-methods approach was used, following a three-step sequential exploratory design, in which step 1 involved collecting and analyzing qualitative data from two focus groups; step 2 involved exploratory instrument development and testing, developing the draft 'Competency Standards'; and step 3 involved quantitative data collection and analysis – a Delphi survey. The study was conducted in two stages and was overseen by a project reference group, which included representatives of the professional association executive, those involved in research and university training, a project officer, and an experienced facilitator. Stage 1 (February–May 2013) involved the identification of gaps in the 2003 Competency Standards and development of the key purpose of the profession and the domains, activities, and performance indicators of the draft competency standards. Stage 2 (November 2013–April 2014)


involved the confirmation or validation of these competency standards and their finalization (Fig. 1). Ethics approval was granted by the 'deidentified' Human Ethics Committee.

Figure 1. Mixed-methods approach to developing the competency standards. Stage 1 (exploratory): new graduate and expert focus groups; thematic analysis producing Draft 1 CS; reference group refinement producing Drafts 2–5 CS; second expert focus group producing Draft 6 CS. Stage 2 (confirmatory): reference group producing Draft 7 CS plus range statements; Delphi round 1 (activities of the CS); Delphi round 2 (only activities without agreement plus all PI); Delphi round 3 (only PI without agreement and the key purpose statement). CS, competency standards; PI, performance indicators.

In stage 1, semi-structured focus groups were used to elicit themes about the gaps, key purpose, work roles, and tasks of Australian orthotist/prosthetists from two groups – an expert group and a recent graduate group. In this study, we have defined domain as a work role, sometimes referred to in the literature as a unit of competency or standard; activity as a task performed within that work role, sometimes referred to as a core competency or element; and performance indicator as a statement of how the activity or task would be measured. A series of questions, shown below, was presented to both groups and discussion facilitated by the experienced facilitator with expertise in competency-standards

development and a practitioner of another allied health profession (Su.A.). The questions have been used previously in competency-standards development.9

Focus group questions:
(1) What do you think are the gaps in the current Competency Standards?
(2) What is the key purpose of the profession?
(3) What is changing or likely to change in the profession that might affect this purpose?
(4) What must happen for the key purpose to be achieved?
(5) Why does the profession do it?
(6) What major things would you have to do to perform that role?
(7) Returning to the gaps identified earlier, do you wish to change or add anything?

Focus group participants were all members of AOPA or were teaching into the academic course, and all were actively employed in the profession. Expert group participants had a minimum of 10 years' experience in the orthotic/prosthetic field and/or were known experts. Recruitment of the expert group was initially through direct approach via email. The recent graduate group comprised practitioners who graduated in Australia between 2009 and 2011 and was recruited through an advertisement on the AOPA website. Purposive sampling was used to ensure representation from different workplace settings within both groups, with a maximum number of 12 in each group. Focus groups were conducted face-to-face for the expert group and via teleconference/videoconference, when available, for the recent graduate group, and were recorded and transcribed verbatim.

Two researchers – the project officer and the experienced facilitator – undertook thematic analysis of the transcripts independently. Text in the transcripts was read to identify initial codes, which were underlined and annotated in margins. Major themes were identified from these codes and compared for congruence before being categorized into major work roles and counted. Relationships between work roles and the subtheme of work tasks were established. These themes were then tabulated as work roles and tasks. Themes relating to the key purpose of the profession were grouped to form four main statements. Gaps in the current competency standards were tabulated separately to inform the domains, activities, and performance indicators. Performance indicators were added to the domains and activities after the project reference group agreed on the domains and activities. The project reference group


generated six iterations of this draft before the expert group met face-to-face to agree on the final draft, again facilitated by the same experienced facilitator (Su.A.). Between stages 1 and 2, the project reference group defined terms and contexts for assessment, via range statements. Range statements, sometimes referred to as range variables, are defined as 'the part of a unit of competency which specifies the range of contexts and conditions to which the performance criteria apply'.18 Defining the range statements was the final stage of development of the draft competency standards, which were used in stage 2. Stage 2 involved the use of the reactive Delphi technique to validate the draft competency standards. Participants were required to be full or part-time financial members of AOPA who had consented to receive AOPA emails. The AOPA database was searched to determine membership demographics in relation to age, location, sex, and work type. From the 271 eligible members, 184 were identified as meeting the following purposive sample criteria: utilizing and/or being interested in competency standards; being internationally trained; teaching within the academic course; and being facility managers potentially employing graduate practitioners or supervising students on practice placement. Invitations to participate were sent via email to a subset of members (n = 107) whose demographics reflected the broader membership. Surveys were constructed using SurveyMonkey19 and were distributed along with supporting documentation via the AOPA website. A personal email with completion instructions was sent to each participant in each round. Demographic and professional information was collected in round 1. Prior to dissemination, all surveys were piloted by the project reference group and five others who were not eligible to participate in the survey but had knowledge of and interest in the project. Surveys were completed over a 2-week period with reminder emails sent at 1 week, 72 h, and 24 h prior to closing the survey in order to reduce attrition rates. In round 1, the domains and activities were presented without the performance indicators and range statements. Participants expressed their level of agreement that each activity was required of an entry-level orthotist/prosthetist in order to achieve safe practice and positive client outcomes, using a 5-point Likert scale (strongly disagree, disagree, undecided, agree, and strongly agree) with the option for open-ended comment on all aspects of the document. The percentage of participants nominating each of the Likert categories was calculated. Agreement was defined as at least 75% of participants nominating either 'strongly agree' or

‘agree’. The competency standards were adjusted in light of feedback between rounds, and participants were provided with a summary of group results and an outline of any changes. In round 2, participants were asked to rate their agreement with any activities that did not achieve 75% agreement as well as all performance indicators. Range statements were included in round 2 in order to provide further context. In round 3, participants were asked to rate their agreement with any performance indicators, which did not achieve agreement in round 2, as well as their agreement with the key purpose of the profession. In this round, participants were also provided with an individual summary of round 2 results.
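To make the consensus rule concrete, the short Python sketch below (not part of the original study) shows how percentage agreement could be computed from 5-point Likert responses and how items falling below the 75% threshold could be flagged for revision and re-rating in the next round. The item names and response counts are illustrative only; the first item's counts are chosen so that it reproduces the 53.6% agreement reported for activity 4.3 in round 1.

from collections import Counter

# Scale used in the surveys: strongly disagree, disagree, undecided, agree, strongly agree
THRESHOLD = 75.0  # per cent of respondents choosing 'agree' or 'strongly agree'

def percent_agreement(responses):
    """Percentage of responses that are 'agree' or 'strongly agree'."""
    counts = Counter(responses)
    agreeing = counts["agree"] + counts["strongly agree"]
    return 100.0 * agreeing / len(responses)

def items_below_threshold(survey):
    """Items whose agreement falls below the consensus threshold."""
    return [item for item, responses in survey.items()
            if percent_agreement(responses) < THRESHOLD]

# Hypothetical round 1 responses for two activities (n = 56 respondents each)
survey = {
    "4.3 manages client funding allocation":
        ["agree"] * 30 + ["undecided"] * 19 + ["disagree"] * 7,
    "illustrative activity with high agreement":
        ["strongly agree"] * 40 + ["agree"] * 14 + ["undecided"] * 2,
}
for item, responses in survey.items():
    print(f"{item}: {percent_agreement(responses):.1f}% agreement")
print("Flag for revision and re-rating:", items_below_threshold(survey))

Under these hypothetical responses, the first item returns 53.6% agreement and would be revised before the next round, while the second clears the threshold; this mirrors the treatment of activities that did not reach agreement in round 1.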

Results

Sample characteristics
In stage 1, 10 experts and 8 graduates participated in the focus groups. In stage 2, 52% (n = 56) of those invited to participate consented to do so. Seventy-nine per cent (n = 44) of those who consented completed all three rounds, resulting in an attrition rate of 21%. Ultimately, 16% of the membership participated in the three rounds of the Delphi process. The demographic details of the participants in stages 1 and 2 are shown in Table 1. Demographics were similar in both samples for stages 1 and 2.

Development of competency standards

Stage 1
Key themes identified as gaps in the current competency standards were similar in the expert and graduate focus groups. These included: an emphasis on evidence-based and ethical practice with a client/patient focus; the need for continuing professional development and mentoring to remain professionally current; and identification of scope of practice. Graduates felt very strongly that their technical skills were essential to their role. Statements about gaps included:

Evidence: 'you should have the skills to clinically justify why you are prescribing it, and yes, you should have the clinical skills to review it and see what the functional outcomes of your decisions are for a patient.' (Expert B)

Continuing professional development: 'an essential skill would be to identify the gaps in your knowledge and in your practice, maybe it


Table 1. Mean (SD) age and proportion (%) by female sex and work location of the overall orthotist/prosthetist profession, focus groups, and Delphi participants

Group / Age (years), mean (SD) / Women (%) / Victoria (%)a
Membership profile (n = 271): 39 (11.6) / 39 / 52
Expert focus group (n = 10, stage 1): 41 (9) / 50 / 100
Graduate focus group (n = 8, stage 1): 27 (5) / 78 / 56
Delphi participants (3 rounds) (n = 44, stage 2): 38 (11.8) / 43 / 48

a The sole training institution for orthotist/prosthetists in Australia is in Victoria, and the majority of graduates are employed in that state.

would be worth including something in having that skill. And also secondly following on from that, being able to access research and have the methods to be able to gain knowledge of best practice.’ (Graduate B)

The expert group had a more diverse range of views about the wording of the key purpose, perhaps reflecting their greater diversity of employment roles; however, the overall themes of clinical care and technical skill were similar.

Graduates were able to clearly define a key purpose, which involved the clinical decision-making and technical skills required to perform the role of the orthotist/prosthetist. Key purpose statements included:

‘we are able to conduct full bio-mechanical assessments and link those bio-mechanical assessment with intimate knowledge of materials, technology and gait goals and functional goal, to match materials, technology and patient presentation with that equipment, or that material to meet the patient outcomes.’ (Expert A)

'I tell people I make artificial limbs' (Graduate A)

'we assess a patient, and then we come up with a prescription for their device, even if it's related to a prosthesis and it's pretty clear that they need one below the knee... there's lots more involved with that, and perhaps saying a prescription is a good word to include. Also to include the fact that we clinically assess the patient using our clinical knowledge.' (Graduate B)

Four key purpose statements were developed from the analysis of the initial focus groups for discussion during the second expert focus group. The themes around work roles and tasks from both groups were similar and are shown in Table 2. There were some areas which recent graduates thought should be covered in the competency standards and which the experts felt were beyond entry level.

Table 2. Initial themes developed from focus groups (work roles and associated tasks)

Collaborative client care: manages patient assessment and treatment as part of a healthcare team; assesses, prescribes, treats, plans, and refers; liaises with other professionals.
Ethical and safe practice: uses evidence and justifies intervention; follows legislation and occupational health and safety regulations; knows scope of practice.
Communicates using good written, oral, and interpersonal skills: can use a range of modalities including e-health; advocates for the patient in a variety of forums; advocates for other staff.
Manages resources: manages self; manages budgets, funding systems, and the administration of these.
Prescribes materials and technology within orthotist/prosthetist scope of practice: manufactures or oversees manufacture of custom devices and modifications.
Continuing professional development: engages in self-development.


In the service management area, graduates gave examples of situations in which they may have been the sole practitioner within months of graduation.

'the senior clinician here ... had to be away for about nine weeks for a health issue, so I was left with the whole department without any technicians, and by myself, managing... doing the tech work and the clinical work here.' (Graduate C)

'I think in the future ... we'll all be involved with setting up more outreach clinics, and developing service delivery to the broader population.' (Graduate B)

Experts, in contrast, focused more on the minimum skills required and how these could be consistently assessed.

'they need to have time management skills and project management skills' (Expert B)

'those key kind of graduate management skills, self-management, time management are the core ones.' (Expert C)

The initial draft domains and activities were developed from Table 2, with reference to the 2003 Competency Standards, and presented to the project reference group. At this point, reference was made to international orthotic and prosthetic competency standards and to those of other Australian health professions. The process of adding range statements resulted in amendments to the competency standards to avoid repetition and increase clarity. The final draft was further refined to include 6 domains, 20 activities, and 69 performance criteria. Figure 2 shows the six domains of practice.

Stage 2: validation of the competency standards
A total of 56 participants completed the first round of the Delphi survey. Participants agreed that 18 (or 90%) of the 20 activities were appropriate for an entry-level orthotist/prosthetist (Fig. 3). Agreement ranged from 87.5 to 100%, with 16 activities receiving more than 90% agreement. Two activities, both within domain 4 (service management and improvement), produced an undecided result: 4.3 – manages client funding allocation (53.6% agree, 33.9% undecided, 12.5% disagree); and 4.4 – participates in facility resource management (73.2% agree, 17.9% undecided, 8.9% disagree).

Figure 2. Domains of orthotic/prosthetic competency standards: 1. Collaborative practice; 2. Provision of clinical care; 3. Provision of orthoses and prostheses; 4. Service management and improvement; 5. Professional values and behaviours; 6. Life-long learning and reflective practice.

Comments from participants suggested that although the concept was correct, the wording indicated activities appropriate for a skill level higher than entry level. As activities 4.3 and 4.4 (described above) had an undecided result rather than disagreement, the reference group revised the activities within this domain to allow the intended concepts to remain. This process resulted in the concepts of activities 4.3 and 4.4 being incorporated into the performance criteria of the remaining activities, rather than being retained as separate activities. Due to the high level of agreement in round 1, round 2 progressed to rating the performance indicators. A specific question asking for comments in relation to domain 4 activities was included to allow further feedback on the changes made from round 1. Forty-nine participants (87.5%) completed the second round of the survey. Participants were asked to rate whether the performance indicators described an observable and/or assessable action that is expected of the workforce when performing the relevant activity. Participants agreed that 68 out of the 69 performance indicators were assessable tasks of the relevant activity. Agreement levels ranged from 79.6 to 100%, with 66 performance indicators having an agreement level above 90% (Table 3). Participants did not agree on performance indicator 5.3.3 (acquires further qualifications to practise beyond professional scope of practice; 54.2% agreement) in domain 5 (professional values and behaviours). As in round 1, the project reference group revised the performance indicators with less than 75% agreement, based on the qualitative comments provided by the


participants. Although only one performance indicator did not achieve agreement, all three performance indicators within this activity (activity 5.3) were presented to participants for re-assessment in round 3. In round 3, participants (n = 44) agreed that these three revised performance indicators (listed below) were all assessable tasks of the relevant activity:
5.3.1 – works within professional scope of practice and authority provided by the client and employer (95.4% agreement);
5.3.2 – seeks assistance or refers on when beyond own level of competence (97.7% agreement); and
5.3.3 – recognizes where further training is required to conduct independent practice (97.7% agreement).

Figure 3. Delphi round 1 analysis of domains and activities: percentage agreement (agree/strongly agree, undecided, disagree/strongly disagree) by domain for collaborative practice (1.1, 1.2, 1.3); provision of clinical care (2.1, 2.2, 2.3, 2.4, 2.5); provision of orthoses/prostheses (3.1, 3.2); service management and improvement (4.1, 4.2, 4.3, 4.4, 4.5); professional behaviours (5.1, 5.2, 5.3); and lifelong learning and reflective practice (6.1, 6.2).

Discussion

The study is one of the few outlining the mixed-methods research methodology13,16,20 used to develop or review competency standards in health professions. Other studies have used similar methodology; however, few have combined rigorous qualitative and quantitative methodologies to develop and then validate professional standards. Sherbino et al.20 conducted a mixed-methods study to define the key roles and competencies of clinician-educators. Experts participated in five focus groups (final sample, n = 22) to define attributes, domains of competence, and core competencies. One national survey of

key educator stakeholders (n = 1110) validated the results, using a mix of 4-point categorical responses, such as agree/disagree and important/not important. Mixing categorical responses without a second round of surveying potentially weakens the overall result. O'Connell and Gardner8 used criterion sampling of emergency practice nurses (n = 5) to develop a competency framework for emergency nursing practice. The draft specialist competencies and performance indicators were sent to an expert panel (n = 12) via a series of Delphi surveys for agreement. The numbers were small and possibly not representative of the nursing profession; however, both these studies used methodology similar to that of the current study. One of the unique features of our study was the use of separate expert and graduate focus groups. The experts were chosen to represent key stakeholders, especially officers of the professional association, previous authors of the competency standards, university educators, and managers of large departments employing new graduate orthotist/prosthetists. This is similar to other studies in allied health.21 The graduate focus group was chosen to represent relatively recent graduates who may be working in emerging areas of practice. Interviews of recent graduates in dietetics have shown that they often describe working in emerging situations, which senior practitioners would not consider entry level.9,22 As a


stakeholder group, graduates are rarely approached as a separate group in the review of competency standards in other allied health professions. Much discussion occurred in the expert group about whether the standards were minimum on graduation or aspirational, describing entry-level practice in the first 6–12 months. The very requirement that competency be assessed in the workplace makes the former, that is, minimum on graduation, difficult to achieve; however, for professional registration or recognition purposes, this is the norm.21,23 Despite this, the resulting domains or work roles, shown in Fig. 2, were grouped similarly to those of other health professions: collaboration/sharing/communicating; therapeutic role; prioritizing workload; ethical care; promoting best practice; professionalism24; or Clinical Expert, Communicator, Collaborator, Manager, Health Advocate, Scholar, and Professional.25

Table 3. Percentage agreement for performance indicators in rounds 2 and 3 Delphi surveys

Performance indicator – final wording:
1.1.1 – Ensures all interactions with the client and/or carer demonstrate respect, honesty, empathy and dignity, and are conducted in a culturally appropriate manner
1.1.2 – Ensures the client is the focus of the care pathway
1.1.3 – Ensures the client and/or carer is aware of their rights and responsibilities
1.1.4 – Obtains informed consent from the client and/or carer prior to the provision of care
1.1.5 – Listens effectively to the client and/or carer
1.1.6 – Encourages the client and/or carer to participate and provide feedback
1.1.7 – Provides prompt, accurate, and comprehensive information in clear terms to enable clients and/or carers to make informed decisions
1.2.1 – Receives and formulates client referrals, professional handovers, healthcare team reports, and other treatment plans
1.2.2 – Respects, acknowledges, and utilizes the expertise of other health professionals
1.2.3 – Establishes and maintains effective working relationships with other health professionals to enhance collaborative practice and access to care
1.2.4 – Actively participates in health care teams and seeks opportunities to demonstrate professional excellence
1.3.1 – Provides clinical justification and evidence for prescribed orthotic/prosthetic client care
1.3.2 – Provides relevant information in order to facilitate client access to care
2.1.1 – Identifies subjective and objective information to enable development of an appropriate orthotic/prosthetic management plan
2.1.2 – Selects assessment techniques, outcome measures and other tools/instruments, based on evidence, which are relevant to the client's presentation
2.1.3 – Performs assessment professionally, safely, and effectively
2.2.1 – Accesses and utilizes the best available evidence to guide clinical decisions
2.3.1 – Facilitates the client and/or carer to establish personal goals
2.3.2 – Considers the information obtained, the client and/or carer's goals and available evidence when formulating treatment options
2.3.3 – Discusses treatment options with the client and/or carer to support client-centered care and informed choice
2.3.4 – Discusses short and long-term treatment goals with the client and/or carer
2.3.5 – Identifies clients who require collaborative care and liaises with the health professional team to ensure integrated care planning
2.3.6 – Determines and justifies the design details of the orthosis and/or prosthesis prescription
2.3.7 – Includes client, carer and/or health professional team education and follow-up when planning treatment
2.3.8 – Selects appropriate outcome measures
2.4.1 – Considers all relevant characteristics of the client during orthosis/prosthesis fitting and review processes
2.4.2 – Uses appropriate techniques to ensure optimal fit and function of the orthosis/prosthesis
2.4.3 – Reviews the client at appropriate intervals to evaluate fit, function, quality, and safety of the orthosis/prosthesis
2.4.4 – Evaluates and monitors treatment outcomes using patient feedback and/or outcome measures
2.4.5 – Modifies treatment to ensure best possible outcomes are maintained
2.4.6 – Discusses progress toward goals with the client and/or carer
2.5.1 – Adheres to legislative and organisational requirements for all documentation
2.5.2 – Maintains legible, concise and accurate documentation using contemporary methods
2.5.3 – Safely and securely stores information and acts to maintain confidentiality whilst ensuring availability of information to other health professionals involved in the care pathway
3.1.1 – Utilizes appropriate casting, measuring and/or cast modification techniques to facilitate fabrication
3.1.2 – Fabricates and/or coordinates the optimal fabrication of orthoses/prostheses
3.1.3 – Performs and/or coordinates required modifications of orthoses/prostheses
3.2.1 – Assesses the orthosis/prosthesis for structural safety at appropriate intervals
3.2.2 – Ensures the orthosis/prosthesis is compliant with manufacturer guidelines and standards
4.1.1 – Facilitates appropriate completion of all supportive activities
4.1.2 – Facilitates appropriate completion of treatment provision
4.1.3 – Demonstrates an ability to triage individual client case load within broader facility workload
4.2.1 – Determines available funding for prescribed care plan
4.2.2 – Prepares and/or coordinates submission of documentation for client funding support as required
4.2.3 – Prescribes and designs orthosis/prosthesis to achieve optimal outcomes within the approved budget for client care
4.2.4 – Understands and conforms to funding arrangements, budget allocations, statistical reporting and financial transaction requirements relevant to the work place
4.3.1 – Strives to continually improve efficiency
4.3.2 – Recognizes service gaps or inefficiencies and works collaboratively to identify solutions
4.3.3 – Participates in audit processes and quality improvement initiatives
5.1.1 – Adheres to legislation and workplace guidelines relating to safety
5.1.2 – Identifies workplace hazards and acts to eliminate or reduce risks
5.2.1 – Complies with relevant laws, regulations, policies and guidelines
5.2.2 – Abides by applicable codes of ethics and conduct
5.2.3 – Recognizes the responsibility to do no harm
5.2.4 – Recognizes and responds appropriately if client and/or carer is at risk
5.3.1 – Works within professional and personal scope of practice and authority provided by the client and employer (round 2)
5.3.1 – Works within professional scope of practice and authority provided by the client and employer (round 3)
5.3.2 – Acquires further training and assessment to develop personal scope of practice (round 2)
5.3.2 – Seeks assistance or refers on when beyond own level of competence (round 3)
5.3.3 – Acquires further qualifications to practise beyond professional scope of practice (round 2)
5.3.3 – Recognizes where further training is required to conduct independent practice (round 3)
6.1.1 – Undertakes independent learning to further own knowledge and skills on a continuous basis
6.1.2 – Shares skills and knowledge with health professional colleagues and students
6.1.3 – Participates in health professional training and research as opportunities arise
6.1.4 – Seeks out leaders in the profession for advice and mentoring
6.1.5 – Offers constructive feedback and assistance to other health professionals
6.2.1 – Assesses and critically analyses research literature and other sources of evidence to improve practice
6.2.2 – Demonstrates a systematic approach to analysis and decision making
6.2.3 – Integrates the best available evidence and new learning into practice to improve health outcomes for clients
6.2.4 – Demonstrates knowledge of new techniques and technology relevant to orthotics/prosthetics
6.2.5 – Critically and continuously evaluates practice

Agreement values (%) as listed under each column heading of the original table:
Round 2: 95.9; 100
Round 3: 91.8; 93.9; 95.9; 100; 98; 95.9; 98; 100; 100; 89.8; 93.9; 79.6; 98; 95.9; 100; 93.8; 93.9; 100; 98; 91.8; 98; 95.9; 95.9; 81.7; 100; 93.9; 100; 98; 95.9; 93.9; 98; 98; 93.8; 100; 96; 93.8; 93.9; 95.9; 87.8; 93.9; 95.9
Round 2: 95.8; 95.9
Round 3: 91.7; 91.6; 89.6; 91.7; 95.8; 100; 100; 100; 100; 97.9; 100; 97.9; 95.4; 91.7; 97.7; 54.2; 97.7; 95.8; 100; 89.6; 97.9; 95.8; 91.6; 91.7; 97.9; 81.3; 93.7

Several iterations of the initial draft were considered first by the project reference group, prior to presentation to the second expert focus group for discussion. The project reference group again refined the draft competency standards prior to the final draft being used in the Delphi process. This process – step 2 in the exploratory sequential model described above and shown in Fig. 1 – allowed not only reflection and integration of views but also review of the relevant literature and other professional competency standards. The quantitative step – step 3, the Delphi process – allowed a ranking of agreement from a wider group of participants. Jones and Hunter12 describe consensus methods as needing anonymity, iteration, controlled feedback, and statistical group response (only if scores are sequential and not categorical). Our results from the first Delphi round showed remarkable agreement, with


16 activities achieving above 90% consensus. Revision of the Service Management and Improvement domain – the only domain in which activities showed less than 90% agreement – resulted in some activities being re-written as performance indicators. This was consistent with both focus groups perceiving management differently at entry level. Comments from participants indicated that much of the lack of agreement involved the wording of the activity rather than the activity itself. Therefore, in the second round of the Delphi, all revised activities were shown, but participants rated their level of agreement with the 69 performance indicators within those activities. This modified Delphi method is similar to that used by pharmacists in primary healthcare,16 where an online survey was used to agree on competency descriptions organized into domains, elements, and subelements. Participants were then asked to rank on a 6-point Likert scale how often they performed the subelement (performance indicator) and how critical they thought the subelement was to achieving patient outcomes. The results, expressed as importance rankings, were seen as validating the original competency statements. Other studies have used subsequent rounds of Delphi to improve agreement. Hughes et al.14 used two rounds of Delphi surveys to improve agreement on 143 competency elements (activities) in public health nutrition; however, they only reported the shift in agreement for the 109 elements for which there was above 67% agreement. Others in advanced dietetics and nursing have reported changes in median17 or mean scores,8,13 respectively, between rounds. The high level of agreement in our study for all rounds of the Delphi was not unlike that in other studies, although other professional competency studies used different meanings for their rating scales and different methods for computing the rankings. Generally, consensus levels are high. Hughes et al.14 reported only 33 (out of a possible 143) competencies rated as essential by 100% of the participants. O'Connell and Gardner8 reported mean agreement scores ranging from 4.6 to 4.9 out of a maximum score of 5. The strength of this study is that it used a structured exploratory and confirmatory approach with accepted qualitative and quantitative methods to develop and then validate minimum entry-level competency standards for orthotist/prosthetists in Australia. The level of agreement was very high (>87% for 18 out of 20 activities, see Fig. 3). When there was no agreement in the first rounds of the Delphi survey, re-wording resulted in very high agreement in subsequent rounds (>80% for all 18 activities and 68 performance criteria, see Table 3).

Limitations of the study include the small numbers, with only 16% of the total membership involved in the validation; however, these numbers and the attrition rate of 21% are consistent with other studies of competency validation in other professions.8,14,16 Most participants graduated from the only program in Australia, which could suggest that orthotist/prosthetist experience is constrained by training in only one institution in Victoria. Every attempt, however, was made to ensure representativeness of the profession in Australia. Due to the purposive sampling criteria, only 184 out of the 271 members were eligible to participate, and only 107 members were invited to participate in order to reflect the membership profile. Every attempt was made to include orthotist/prosthetists who were members of AOPA but had trained outside Australia, and participants were selected to be representative of the AOPA membership, including those who worked outside the metropolitan area. Table 1 shows that, despite a large percentage of respondents being Victorian, this is representative of the membership and of where the majority practise. Further research could apply these competency standards to assess the entry-level practice of graduates exiting an education program or, alternatively, of orthotist/prosthetists from other countries. Overall, this study has outlined an evidence-based approach to the review of competency standards for a health profession such as orthotics/prosthetics. The methodology is robust and could be applied to other disciplines.

Acknowledgements
The work undertaken in this study was funded by a Professional Services Development Program grant from the Australian Government. Thanks are extended to Shane Grant, project officer for stage 1 of the project.

References
1. World Health Organisation. Summary world report on disability: World Health Organisation and World Bank. 2011. http://whqlibdoc.who.int/hq/2011/WHO_NMH_VIP_11.01_eng.pdf?ua=1. [Accessed 30 April 2014]
2. Australian Bureau of Statistics. Customised report: census of population and housing; customized data report code 251912; orthotist or prosthetist. 2014.
3. National Health Service Scotland. Scottish orthotic services review. Edinburgh: National Health Service Scotland; 2005. http://www.sehd.scot.nhs.uk/publications/dc20050614orthotics.pdf.
4. World Health Organisation. Transformative scale up of health professional education. Geneva, Switzerland: World Health Organisation; 2011. http://whqlibdoc.who.int/publications/2011/9789240685215_eng.pdf. [Accessed 30 April 2014]
5. Health Workforce Australia. National Health Workforce Innovation and Reform Strategic Framework for Action 2011–2015. Adelaide, South Australia: Health Workforce Australia; 2011. http://www.hwa.gov.au/sites/uploads/hwa-wir-strategicframework-for-action-201110.pdf.
6. The Australian Orthotic Prosthetic Association. Entry Level Competency Standards for Australian Orthotist/Prosthetists (3rd edition). Melbourne; 2014. http://www.aopa.org.au/documents/item/27. [Accessed 15 December 2014]
7. Larkin PM, Begley CM, Devane D. Breaking from binaries: using a sequential mixed methods design. Nurse Res 2014; 21: 8–12.
8. O'Connell J, Gardner G. Development of clinical competencies for emergency nurse practitioners: a pilot study. Australas Emerg Nurs J 2012; 15: 195–201.
9. Ash S, Gonczi A, Hager P. Combining research methodologies to develop competency-based standards for dietitians: a case study for the professions. National Office of Overseas Skills Recognition Research Paper No. 6. DEET. Canberra: Australian Government Publishing Service; 1992.
10. Phillips S, Ash S, Tapsell L. Relevance of the competency standards to entry level dietetic practice. Aust J Nutr Dietet 2000; 57: 198–207.
11. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol 2006; 3: 77–101.
12. Jones J, Hunter D. Consensus methods for medical and health services research. Br Med J (Clin Res Ed) 1995; 311: 376–80.
13. Chang AM, Gardner GE, Duffield C, Ramis M-A. A Delphi study to validate an advanced practice nursing tool. J Adv Nurs 2010; 66: 2320–30.
14. Hughes R, Begley A, Yeatman H. Aspirational competency expectations for public health nutritionists in Australia: a consensus study. Nutr Dietet 2013. doi:10.1111/17470080.12098 [Epub 13 November 2013].
15. Davies J, Hughes R, Margetts B. Towards an international system of professional recognition for public health nutritionists: a feasibility study within the European Union. Public Health Nutr 2012; 15: 2005–11.
16. Kennie-Kaulbach N, Farrell B, Ward N, et al. Pharmacist provision of primary health care: a modified Delphi validation of pharmacists' competencies. BMC Fam Pract 2012; 13: 27.
17. Brody RA, Byham-Gray L, Touger-Decker R, et al. Identifying components of advanced-level clinical nutrition practice: a Delphi study. J Acad Nutr Dietet 2012; 112: 859–69.
18. Naidu R, Stanwick J, Frazer K. Glossary of terms. Adelaide, South Australia: National Centre for Vocational Education Research; 2013. http://www.voced.edu.au/content/glossary-vet.
19. SurveyMonkey Inc. SurveyMonkey. Palo Alto, California, USA; 2013.
20. Sherbino J, Frank JR, Snell L. Defining the key roles and competencies of the clinician-educator of the 21st Century: a national mixed-methods study. Acad Med 2014; 89: 783–9.
21. Rodger S, Clark M, Banks R, et al. A national evaluation of the Australian Occupational Therapy Competency Standards (1994): a multistakeholder perspective. Aust Occup Ther J 2009; 56: 384–92.
22. Dowding K, Ash S, Shakespeare-Finch J. Using critical incident interviews to identify the mental health knowledge, skills and attitudes of entry-level dietitians. Nutr Dietet 2011; 68: 297–304.
23. Australian New Zealand Podiatry Accreditation Council. Podiatry competency standards for Australia and New Zealand. 2012. http://www.anzpac.org.au/files/Podiatry Competency Standards for Australia and New Zealand V1.1 211212 (Final).pdf. [Accessed 12 August 2014]
24. Dewing J, Traynor V. Admiral nursing competency project: practice development and action research. J Clin Nurs 2005; 14: 695–703.
25. Frank JR, Danoff D. The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med Teacher 2007; 29: 642–7.
