

The development and evaluation of an evidence-based guideline programme to improve care in a paediatric emergency department

Ayobami T Akenroye,1,2 Anne M Stack1,2

▸ Additional material is published online only. To view please visit the journal online (http://dx.doi.org/10.1136/emermed-2014-204363).

1Division of Emergency Medicine, Boston Children’s Hospital, Boston, Massachusetts, USA
2Department of Pediatrics, Harvard Medical School, Boston, Massachusetts, USA

Correspondence to Anne M Stack, Boston Children’s Hospital, 300 Longwood Avenue, Boston, MA 02115, USA; [email protected]

Received 29 September 2014; Revised 28 May 2015; Accepted 12 June 2015

ABSTRACT
Introduction Care guidelines can improve the quality of care by making current evidence available in a concise format. Emergency departments (EDs) are an ideal site for guidelines given the wide variety of presenting conditions and treating providers, and the need for timely decision making. We designed a programme for guideline development and implementation and evaluated its impact in an ED.
Methods The setting was an urban paediatric ED with an annual volume of 60 000. Common and/or high-risk conditions were identified for guideline development. Following implementation of the guidelines, their impact on effectiveness of care, patient outcomes, efficiency and equitability of care was assessed using a web-based provider survey and performance on identified metrics. Variation in clinical care between providers was assessed using funnel plots.
Results Eleven guidelines were developed and implemented. Three years after the initiation of the programme, self-reported adherence to recommendations was high (95% for physicians and 89% for nurses). 97% of physicians and 92% of nurses stated that the programme improved the quality of care in the ED. For some guidelines, provider-to-provider variation in care practice was reduced significantly. We found reduced disparity in imaging when assessing one guideline. There were also reductions in utilisation of diagnostic tests or therapies. As a balancing measure, the percentage of patients with any of the guideline conditions who returned to the ED within 72 h of discharge did not change from before to after guideline initiation. Overall, 80% of physician and 56% of nurse respondents rated the guideline programme at the highest value.
Conclusions A programme for guideline development and implementation helped to improve efficiency, and to standardise and eliminate disparities in emergency care, without jeopardising patient outcomes.

INTRODUCTION


Evidence-based guidelines (EBGs) have been increasingly applied in the healthcare setting with the goals of reducing variation and improving care.1 By standardising care processes, patient outcomes may be improved and care can be studied and modified based on outcomes.2 The development of a guideline, though time-consuming and resource-intensive,3 4 is, however, only a single step in the process of improving patient care. EBGs are only useful if there is effective translation to the bedside.2 4–6 It is therefore essential that they are fully adopted and used in the local setting.

Key messages

What is already known on this subject?
▸ Clinical guidelines can be used to standardise care and reduce unnecessary resource utilisation.
▸ Despite the presence of national or local guidelines, adoption by physicians can be challenging.

What might this study add?
▸ We developed and implemented a set of evidence-based guidelines that were readily adopted by a large group of clinicians.
▸ Use of guidelines led to decreased variation in care and resource utilisation.
▸ Clinicians rated the programme highly.

However, implementation poses a challenge.7–12 Furthermore, implementation in a complex environment of care such as an emergency department (ED) can be even more challenging. There is often high acuity, high patient volume, concurrent demand for timely care of acutely ill patients and variability in providers. Like many other academic settings, ours has a large number of physicians and nurses with a range of experience and training. We recognised the need to reduce variation in care and improve efficiency. The objective of this paper is to describe how we developed and/or adopted EBGs specific to paediatric emergency care, the implementation strategy used and the methods applied to evaluate the success of the programme. We also provide information on achievements in standardising care and decreasing resource utilisation. Moreover, we share the lessons learnt from the establishment of this programme that may be helpful to clinicians or administrators interested in developing a local improvement portfolio.

METHODS
Setting
The ED of Boston Children’s Hospital, a tertiary, university-affiliated paediatric hospital, has an annual volume of 60 000 patients and an admission rate of ∼18%. Providers include 45 paediatric emergency medicine specialists, 25 general paediatricians, 72 nurses, 16 emergency medicine fellows and about 200 rotating residents.





Guidelines for many common paediatric emergency conditions either did not exist or, where national guidelines existed, had not been formally adopted locally.

Human subjects protection
According to the policy for activities that constitute research at Boston Children’s Hospital, this work met criteria for operational improvement activities exempt from ethics review.

PLANNING THE INTERVENTION
EBG team
Leadership support was critical to the development of an EBG programme in the ED. The EBG team was made up of division leaders, including the division chief and clinical chief; individual EBG champions, each consisting of a paediatric emergency medicine physician and a nurse partner; a quality improvement (QI) specialist; a data analyst; an ED electronic medical record (EMR) liaison; an ED pharmacist; and administrative support. Support and input from providers in subspecialty departments were sought as appropriate.

Building an organisational culture of evidence-based practice
The EBG programme was built on the evolving normative culture of evidence-based practice in the department, accomplished by:
▸ Clear communication to providers on the expectation of providing evidence-based care. This was done by: (A) discussion of current and emerging evidence on case management at the physicians’ weekly 4-h conference; (B) development of a local evidence-based medicine (EBM) website that houses key publications in emergency medicine, updated at least yearly by faculty for their respective EBM topics; and (C) monthly ‘Journal Watch’ where pertinent articles recently published in peer-reviewed journals are discussed.
▸ Nurturing of openness and transparency, including monthly morbidity and mortality conference where providers critically evaluate care in a collegial environment.
▸ Creation of robust support mechanisms to improve evidence-based practice, such as through the use of computerised order entry and employment of quality personnel.
▸ Creation of a nimble, didactic conference environment where ideas are shared and support is garnered.

Developing the EBG
From preliminary data, ED leadership had identified variation in care practices and wide differences in resource utilisation for certain conditions. Following the decision to initiate a local EBG programme, physicians were encouraged to come forward with candidate conditions for guidelines. Conditions were prioritised based on: the number of patients affected; the potential for high impact, such as a disease with high morbidity and/or mortality; the potential for elimination of variation in practice patterns; and the potential for improved efficiency through the reduction of unnecessary resource utilisation. The EBG development process as designed by the division leaders is as follows: the chief and clinical chief of the division met with the EBG physician champion to discuss the general principles of guideline development, using the AGREE instrument as a guide,13 and provided an orientation to the development process and specific considerations for the guideline, for example the eligible patient population. The scope of the guideline was established using the PICO framework (patient/problem, intervention, comparison, outcomes).1 National guidelines, if available, were studied by the physician and updated with current evidence.

The guidelines were also adapted to fit the local context, with emphasis placed on areas where we had identified significant evidence-to-practice gaps in our ED. For instance, we added contact precautions to our bronchiolitis guideline for management of patients with suspected/confirmed respiratory syncytial virus (RSV) infection. The major and up-to-date recommendations of each guideline were retained unchanged. We did, however, empower the attending physician to deviate from the guideline for individual cases as appropriate. To this effect, we adapted the American Academy of Pediatrics guidelines on classifying EBG recommendations,14 and used language suggestive of the strength of the evidence. For instance, the word ‘should’ was used where there was strong evidence supporting the recommendation, whereas in areas where evidence was equivocal the words ‘may’ or ‘consider’ were used to allow flexibility, with the aim of learning from the clinical expertise and rationale of providers for their management options. Next, at least five physician and five nurse colleagues and/or subspecialists were engaged to seek their expert opinion on the scientific soundness and usability of the guideline. Once a draft algorithm was created, the recommendations and supporting evidence were discussed at a departmental conference, which helped to gather local support. On the internal website, providers could also access the EBM folder, which housed the supporting literature. Prior to implementation, pharmacist review for medication choice and accuracy was completed. The ED EMR physician liaison was crucial to the success of supporting computerised order entry. Some EBG champions preferred an ‘all-inclusive’ order set with many options, while others preferred a more streamlined approach such that physicians were guided to choose the recommended care path. We allowed latitude here and hope, in future iterations, to study the impact of the stringency of the order set on recommendation adherence. Prior to implementation, completed guidelines included an entire package of deliverables: the care algorithm, an electronic order set, comprehensive discharge instructions and quality measures.

Implementation strategies
Initially, once the guideline package was ready, a semistructured process was followed for implementation. However, it soon became clear that provider awareness was lacking. To address this, a novel EBG Implementation Team was formed to ensure successful translation to practice. The team consisted of two physicians, two nurses, a QI expert, a data analyst, and an administrator, with the support of the clinical chief and individual EBG champions. To guide the choice of implementation strategies, the group reviewed the existing implementation science extensively and employed the strategies most likely to be effective given our local setting. Using the Pathman (awareness-agreement-adoption-adherence) model, we developed a structured process for implementation15 (figure 1). The implementation process proceeded in multiple iterations until compliance data showed that recommendations had been embedded into practice, as evidenced by a stable improvement over many months to years. Strategies used to improve awareness of and adherence to EBG recommendations included presentations at physician and/or nursing meetings, use of posters in high-traffic locations, development of binders with laminated copies of each guideline, one-on-one discussions with providers, web-based videos, pocket cards with algorithms and regular reports of performance to champions and to the entire division.



Figure 1 Process for implementation.

PROGRAMME EVALUATION
Provider survey
To assess the impact of the guideline programme, we developed a simple 14-question anonymous web-based survey (see online supplementary appendix A) covering the overall impact of the EBG programme, adoption of recommendations and the effectiveness of the implementation strategies, administered 2 years after the initiation of the guideline programme. The survey was created by the improvement team and piloted with five providers to check flow and ease of use. It was distributed 3 months after roll-out of the first eight EBGs, which was about a year after the first EBG was rolled out, to allow for a wash-in period of guideline use. Data were analysed using descriptive statistics for the 10 questions with either dichotomous or 4-point or 5-point Likert scale responses. Comments from the final four questions were collated and categorised into themes, and feedback was given to the QI team during a presentation and to providers via email. Detailed survey results were also sent to the EBG champions.

Development and use of quality measures
To assess the clinical impact of the guidelines, three to eight quality measures, including balancing measures, were developed for each guideline (see online supplementary appendix B). Candidate measures were suggested by the EBG champions and vetted for importance and feasibility by the EBG team. Efforts were made to balance quality measures across the framework of structure, process and outcome16 as well as the Institute of Medicine domains.17 Due to the episodic nature of ED care, we chose returns to the ED resulting in hospital admission within 72 h of discharge as our primary outcome measure for each guideline.18 Detailed measurement plans were developed and served as comprehensive guides for data extraction (online supplementary appendix C shows a sample plan). Every week, the clinical chief, QI expert and data analyst met to discuss the best method to extract the data and to define the measures. We involved the respective EBG champions as appropriate. The process of achieving the most accurate data took months for some measures, while others were relatively easy.
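To make the primary outcome measure concrete, the sketch below shows one way a monthly 72 h return-to-admission rate could be computed from visit-level records. It is an illustrative Python sketch only; the column names (patient_id, arrival_time, disposition, guideline_condition) are hypothetical and not the fields of the hospital's data warehouse.

```python
# Illustrative sketch (not the programme's extraction code): monthly proportion
# of discharged index visits for a guideline condition that were followed by an
# ED return ending in admission within 72 h. Column names are hypothetical;
# arrival_time is assumed to be a datetime column.
from datetime import timedelta
import pandas as pd

def monthly_return_to_admission_rate(visits: pd.DataFrame, condition: str) -> pd.Series:
    visits = visits.sort_values("arrival_time")
    index_visits = visits[(visits["guideline_condition"] == condition) &
                          (visits["disposition"] == "discharged")].copy()
    admissions = visits[visits["disposition"] == "admitted"]

    def returned_within_72h(row) -> bool:
        window_end = row["arrival_time"] + timedelta(hours=72)
        later = admissions[(admissions["patient_id"] == row["patient_id"]) &
                           (admissions["arrival_time"] > row["arrival_time"]) &
                           (admissions["arrival_time"] <= window_end)]
        return not later.empty

    index_visits["returned"] = index_visits.apply(returned_within_72h, axis=1)
    # Monthly proportion, suitable for plotting on a control chart.
    return index_visits.set_index("arrival_time")["returned"].resample("MS").mean()
```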


We monitored performance monthly using statistical process control charts, which display and analyse time series data.19 These charts provided a means of studying process variation and identifying when a process changed following an intervention.20–22 Unlike traditional statistical methods, control charts, which have been widely used to demonstrate healthcare improvement efforts,23–28 can reflect within a relatively short time the impact of an intervention or occurrence on a process and reduce the chance of spuriously attributing a change to an intervention. Control chart rules help differentiate variation due to a ‘special cause’, such as an improvement project, from that due to underlying random variation (‘common cause’).19 21 To assess the impact of a guideline on disparities in care, another measure of standardisation, we evaluated the proportion of patients with minor head injury from the different payer and racial/ethnic groups undergoing brain imaging before and after the Minor Head Trauma EBG was implemented. Whenever we observed a non-random or ‘special’ cause indicative of poorer performance, we gave booster doses of improvement strategies to reinvigorate the process, such as one-on-one discussions, emails to specific clinicians, addressing misconceptions at our weekly teaching meetings, awareness campaigns and fresh poster reminders.
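For readers less familiar with statistical process control, the arithmetic behind a p-chart of the kind described is sketched below: each month's proportion is compared with a centre line and 3-sigma binomial limits computed from the pooled data, and a point beyond the limits (or, by the usual run rules, a long run on one side of the centre line) is flagged as possible special-cause variation. This is a generic illustration, not the charting tool used in the programme.

```python
# Generic p-chart sketch: centre line and 3-sigma binomial control limits for a
# monthly proportion measure (e.g., share of bronchiolitis visits with a chest
# radiograph). Illustrative only.
import math

def p_chart_points(counts, totals):
    """counts[i] events out of totals[i] eligible patients in month i."""
    p_bar = sum(counts) / sum(totals)               # centre line from pooled data
    points = []
    for x, n in zip(counts, totals):
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)  # binomial SE at this month's n
        ucl = min(1.0, p_bar + 3 * sigma)
        lcl = max(0.0, p_bar - 3 * sigma)
        p = x / n
        points.append({"proportion": p, "lcl": lcl, "ucl": ucl,
                       "special_cause": p < lcl or p > ucl})
    return p_bar, points

# Usage with made-up monthly counts: flag months outside the control limits.
centre, pts = p_chart_points(counts=[38, 35, 30, 18, 15], totals=[90, 92, 88, 95, 91])
flagged_months = [i for i, pt in enumerate(pts) if pt["special_cause"]]
```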

Individual physician performance
Another crucial part of the EBG programme analysis was the addition of individual performance measures to the standing annual individual performance report shared confidentially with each attending by the division chief. We used funnel plots to display provider-to-provider variation for selected measures. Funnel plots are a particularly illustrative way to show whether true variation is present when numbers are low, since a single patient could significantly affect the event rate of a provider with a low patient volume.19 Using a variance comparison test, we tested for a difference in variation before and after guideline implementation. In these reports, individual physicians were able to visualise their performance in comparison with deidentified peers.
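The construction of such a funnel plot, and one possible form of the variance comparison, are sketched below. Each provider's rate is plotted against the number of eligible patients they saw, with approximate 95% and 99.8% binomial limits around the group rate that narrow as volume increases, so a low-volume provider is not flagged on the basis of one or two patients. The article does not name the specific variance comparison test used, so the F-test shown here is an assumption made for illustration.

```python
# Illustrative funnel plot and variance comparison (assumed methods, not the
# authors' exact analysis code).
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

def funnel_plot(events, volumes):
    events, volumes = np.asarray(events, float), np.asarray(volumes, float)
    p_overall = events.sum() / volumes.sum()
    grid = np.arange(1, volumes.max() + 1)
    se = np.sqrt(p_overall * (1 - p_overall) / grid)
    plt.scatter(volumes, events / volumes, label="individual providers")
    for z, style in [(1.96, "--"), (3.09, ":")]:          # ~95% and ~99.8% limits
        plt.plot(grid, np.clip(p_overall + z * se, 0, 1), style, color="grey")
        plt.plot(grid, np.clip(p_overall - z * se, 0, 1), style, color="grey")
    plt.axhline(p_overall, color="black")
    plt.xlabel("Eligible patients seen by provider")
    plt.ylabel("Provider event rate")
    plt.legend()
    plt.show()

def provider_variance_changed(rates_before, rates_after):
    """Two-sided F-test on the spread of provider rates before vs after
    guideline implementation (one possible 'variance comparison test')."""
    f = np.var(rates_before, ddof=1) / np.var(rates_after, ddof=1)
    df1, df2 = len(rates_before) - 1, len(rates_after) - 1
    p_value = 2 * min(stats.f.cdf(f, df1, df2), stats.f.sf(f, df1, df2))
    return f, p_value
```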

RESULTS
Based on the EBG selection criteria previously discussed, 11 EBGs were implemented over a 1-year period (box 1).

Results of provider survey
The survey used to evaluate the success of implementation had a response rate of 70% (n=60) for physicians and 50% (n=36) for nurses. Table 1 shows the results.

Box 1 List of guidelines developed
▸ Abscess
▸ Anaphylaxis
▸ Bronchiolitis
▸ Chest pain
▸ Croup
▸ Ectopic pregnancy
▸ Gastroenteritis
▸ Intussusception
▸ Minor head trauma
▸ Syncope
▸ Urolithiasis

There was good self-reported adherence, with 95% of physicians and 89% of nurses reporting using the EBG when caring for a patient with an EBG condition. Adherence was corroborated by data collected on the different measures.29 Atypical disease presentation, resident orders placed due to lack of awareness of guideline recommendations and requests by referring physicians were cited as common reasons for non-compliance (data available on request). Physicians found the online and posted algorithms very or somewhat helpful (78% and 79%, respectively) as effective implementation strategies, and nurses found the posted algorithms helpful (69%). In over 90% of responses, physicians and nurses indicated that the programme improved the quality and standardisation of care. Overall, the majority of providers rated the guideline programme at the most favourable level on a 4-point Likert scale.

Performance on individual physician and other quality measures
Our EBG programme led to changes in practice and/or significant reductions in resource utilisation and costs. Using time series methodology, we showed a decrease in resource utilisation (radiographs, viral testing and β-agonist therapy) for patients with bronchiolitis without a change in outcomes.29 In addition, we were able to decrease resource utilisation (blood testing and intravenous fluid) and increase rates of pregnancy testing in pubertal girls and of ECG testing for patients with syncope.30 In a third publication demonstrating the effectiveness of the programme, we were able to decrease rates of brain imaging for patients with minor head injury, at a rate faster than the national decline.31 Selected additional measures, where there were reductions in resource utilisation and/or improved practice, are shown in figures 2 and 3 (and see online supplementary appendices D and E). We also eliminated disparities in some areas. For instance, we found that prior to the roll-out of the minor head injury EBG, patients with minor head injury who were Caucasian and who had private insurance were more likely to have a brain CT scan in comparison with African Americans, Hispanics or children with public insurance. Through the EBG, we reduced the overall rate of head imaging and were able to eliminate these disparities (figure 3). The individual performance report was particularly helpful in ‘pulling’ outliers towards the group average, which is likely more reflective of the care appropriate for our patient cohort than of individual preferences (figure 2). As balancing measures, there were no adverse events and no increase in the rate of returns resulting in admission within 72 h for any of these conditions in the 3 years after initiation of the guideline programme. (The control chart for the croup EBG return rate appears in online supplementary appendix F.)
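The segmented (interrupted) time-series approach referred to above, described in detail in the cited bronchiolitis study,29 can be summarised as a regression of the monthly rate on a baseline trend, an immediate level change at guideline introduction and a change in trend thereafter. The sketch below is a minimal illustration of that model using ordinary least squares; it is not the published analysis code and the variable names are ours.

```python
# Minimal segmented-regression (interrupted time series) sketch for a monthly
# utilisation rate, with a level and slope change at guideline introduction.
# Illustrative only; the published analyses used their own specifications.
import numpy as np
import statsmodels.api as sm

def segmented_fit(monthly_rate, intervention_month):
    t = np.arange(len(monthly_rate))                      # months since series start
    post = (t >= intervention_month).astype(float)        # 1 after the guideline
    months_since = np.where(post == 1, t - intervention_month, 0.0)
    X = sm.add_constant(np.column_stack([t, post, months_since]))
    # Coefficients: intercept, baseline trend, immediate level change, trend change.
    return sm.OLS(np.asarray(monthly_rate, float), X).fit()
```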

Costs of the programme

The initial costs of developing and implementing the guidelines were relatively small and entirely in kind. Most of the direct cost was for creating posters displaying the algorithms, basic recommendations or performance for each EBG. Otherwise, costs were indirect, such as the clinicians’ time spent in the initial development and maintenance of the algorithms. Use of the guidelines resulted in decreases in resource utilisation (eg, syncope and minor head trauma), and quantitative analyses of cost savings are currently underway. However, it is important to note that while many of the guidelines recommended against testing or treatment, some advocated for certain evidence-based therapies, such as the use of corticosteroids in patients with croup and the monitoring of patients with anaphylaxis for at least 4 h prior to discharge from the ED, which might have led to increased resource utilisation.

Table 1 Provider survey

Survey question: Physicians (n=60) n (%); Nurses (n=36) n (%)

Aware of how to access EBG algorithms — Physicians: 58 (96); Nurses: 32 (89)
Use corresponding EBG when providing care to patient with the condition — Physicians: 57 (95); Nurses: 32 (89)

Effectiveness of implementation strategies (Very/somewhat | Neutral | Not very/at all)
  Email reminders — Physicians: 42 (70) | 9 (15) | 9 (15); Nurses: 26 (72) | 7 (19) | 3 (8)
  Online algorithms — Physicians: 47 (78) | 3 (5) | 10 (17); Nurses: 19 (53) | 6 (17) | 10 (28)
  Posted algorithms — Physicians: 47 (78) | 8 (13) | 5 (8); Nurses: 25 (69) | 4 (11) | 7 (19)
  Discussions at weekly conference* — Physicians: 46 (79) | 6 (10) | 6 (10); Nurses: –
  General awareness of EBG in ED — Physicians: 47 (80) | 5 (8) | 7 (12); Nurses: 24 (69) | 6 (17) | 5 (14)

Extent to which guidelines (Very/somewhat | Neutral | Not very/at all)
  Have improved quality of care — Physicians: 58 (97) | 0 (0) | 2 (3); Nurses: 33 (92) | 0 (0) | 3 (8)
  Helped to standardise care — Physicians: 59 (98) | 1 (2) | 0 (0); Nurses: 33 (97) | 1 (3) | 0 (0)
  Are useful teaching tools — Physicians: 56 (93) | 4 (7) | 0 (0); Nurses: 32 (91) | 3 (9) | 0 (0)

Have the EBGs changed/reinforced your practice? (Yes | No)
  Physicians: 56 (97) | 2 (3); Nurses: 30 (83) | 5 (14)

What are the major barriers to not following a guideline? (Yes | No)
  Time constraint/busy ED/competing demands — Physicians: 9 (15) | 51 (85); Nurses: 13 (39) | 22 (61)
  Atypical disease presentation — Physicians: 50 (83) | 10 (17); Nurses: 16 (44) | 20 (55)
  Resident proceeded with ‘non-EBG’ recommendation — Physicians: 12 (20) | 48 (80); Nurses: 15 (41) | 21 (58)
  Patient met exclusion criteria — Physicians: 22 (37) | 38 (63); Nurses: 14 (39) | 22 (61)
  Don’t believe in guidelines — Physicians: 0 (0) | 60 (100); Nurses: 0 (0) | 36 (100)
  Other — Physicians: 5 (8) | 55 (92); Nurses: 0 (0) | 36 (100)

How easy were the guidelines to adhere to? (Very easy | Somewhat easy | Neither easy nor difficult | Slightly difficult | Very difficult)
  Physicians: 31 (52) | 22 (37) | 2 (3) | 0 (0) | 0 (0)
  Nurses: 11 (31) | 16 (45) | 3 (8) | 1 (3) | 0 (0)

Overall impression of the EBG programme? (Great | Good | Fair | Poor)
  Physicians: 48 (80) | 11 (18) | 1 (2) | 0 (0)
  Nurses: 20 (57) | 15 (43) | 0 (0) | 0 (0)

Bolded items correspond directly to survey questions. *Strategy was directed at physicians only. EBGs, evidence-based guidelines; ED, emergency department.



Figure 2 Funnel plots showing individual physician rates, before and after guideline initiation, for: (A) ordering chest radiographs in patients with bronchiolitis; and (B) giving intravenous fluid to patients with gastroenteritis.


LESSONS
We present here a practical process for developing an EBG programme in a paediatric ED. The major lessons from this project were:
1. A local guideline programme is helpful in improving the value of care.
2. Developing guidelines is never, in the strict sense, complete, since new evidence will usually emerge. Guidelines will need to be updated as appropriate.
3. Implementation could be considered the most important and challenging phase of a guideline programme. Adequate support in terms of personnel and materials, as well as a focus on the strategies most appropriate to the local setting, is crucial to successful implementation.
4. Newer initiatives will not necessarily jeopardise the success of other ongoing initiatives, if well synchronised.
We created a robust yet streamlined process for guideline development. By using local experts to either adapt a national guideline or develop a new one, we were able to leverage peer influence in driving adoption.2 32 Through this local effort, we were able to avoid barriers to guideline implementation such as incompatibility of recommendations with local values, lack of credibility of the guideline developer(s) with end users and recommendations not well integrated into the workflow.7 8 33 34

Local experts were able to develop guidelines that were end-user sensitive yet evidence-based. Peers were also able to offer their expert opinions on the recommendations to the local developers. This was helpful in improving the value of, buy-in to and compliance with the EBGs. There were many challenges in algorithm development. Although we capitalised on existing resources to develop EBGs, considerable time was expended: one guideline leader estimated almost 40 h over a 6-month period for the development of the algorithm alone, and the substantial time invested by the EBG team, EMR liaison and others was not measured. Furthermore, we learned from the survey that some nurse champions would have liked involvement earlier in the process and more guidance in understanding their role. Earlier and more directed nursing involvement and more generous use of wall displays might have improved awareness and adoption by nurses. Since EBGs are dynamic, with the possibility of new evidence emerging after guideline introduction,3 it is important to have a plan for keeping guidelines current. We have instituted an annual review process for the guideline owners. Implementation can be the most challenging aspect of the use of guidelines.7–12 We quickly learned that substantial effort was required for successful implementation in a complex environment with hundreds of providers. Strategies proven to be effective in guideline implementation were chosen.7 9 10 12 34–39 For instance, as much as possible we avoided passive strategies such as didactic sessions and instead used active strategies such as interactive sessions and one-on-one discussions.11 12



Figure 3 Rates of CT scanning for patients with minor head injury, (A) by insurance type; and (B) by racial/ethnic group.

Different strategies were also effective at different stages of implementation: awareness, agreement, adoption and adherence.15 For example, the interactive sessions at the physicians’ weekly conference were successful in raising awareness of EBG recommendations as well as in increasing the likelihood of agreement with individual recommendations. However, to ensure adoption and adherence, supportive systems such as computerised order entry, electronic prompts and feedback were very important.2 As shown by previous studies,40 41 the survey revealed that the effectiveness of the implementation strategies varied between and within groups. For instance, most nurses found the posted algorithms very effective while most physicians thought the online algorithms were very effective. It is therefore important to tailor strategies to those most effective for a given group.

The use of multifaceted interventions is also important, so that at least one strategy will appeal to each provider.42 It was, however, particularly challenging to improve nursing engagement; this challenge extends across the spectrum of activities in the ED and is not limited to the EBG programme per se. Reasons for the low nursing response rate include, but are not limited to, a large proportion of part-time nurses, a change in hospital policy that eliminated reimbursement for meeting time, and the preference of most nurses for print or face-to-face contact as their primary mode of communication, unlike most physicians in the department, who used email as their primary mode of communication. Furthermore, physicians have a weekly 4-h meeting that provides an opportunity to discuss important topics like EBGs, whereas there is no equivalent nursing meeting at which a significant proportion of nurses are present.


We continue to work with nursing leadership on how to improve protected time for nurses to attend meetings. We have also employed some innovative methods such as the ‘Education Tree’, whereby specific nurses who are well engaged with the EBG programme are each assigned three to four nurses for whom they are responsible for ensuring that they are up to date with EBG recommendations. The high number of rotating residents also presents a challenge to guideline adherence; ongoing training is underway. Interestingly, although potentially risky, rolling out the guidelines in quick succession had a synergistic effect. It is possible that the introduction of each successive guideline kept awareness of the EBG programme fresh for providers. The implementation of the guidelines did not extinguish improvements seen in concurrent projects in the department, such as improving care for patients with sepsis;43 44 rather, we believe the EBG programme may have further reinforced the department’s improvement culture. It may be helpful to introduce guidelines that have similar recommendations and can benefit from such a synergistic effect, such as reduction of the utilisation of ionising radiation. The health information system of the hospital was crucial. We have the benefit of access to a robust data warehouse and were able to measure many components easily. Others were more challenging to measure and required chart review or measure revision to improve feasibility. One challenge unique to an ED is identifying meaningful outcomes given the episodic nature of the care we provide. However, there were significant opportunities to improve value by improving efficiency, equity and costs. Limitations of this effort include the possibility that guidelines were created in which local conventions trump evidence. However, we relied on the adaptation of national guidelines when available, complete and thorough peer review, the academic strength of the physician and nurse leaders, and oversight by division leaders to mitigate this possibility.

CONCLUSION
We have shown that it is possible to develop a successful and valuable guideline programme in a setting as complex as an ED. The keys to the success of the programme were strong leadership support and the local presence of guidelines, selection of motivated champions, development of a practical process for guideline development and implementation, peer consensus among a highly academic group, and rigorous performance monitoring with frequent feedback to stakeholders. We believe that the fundamentals of this EBG programme may be adapted to other clinical settings.

Acknowledgements The authors thank the Chief of the Division, Dr Richard Bachur, the EBG Champions, the Implementation Team, and all the emergency doctors and nurses for their dedication to the EBG project. The authors also thank Jonathan Finkelstein for his helpful comments on previous drafts of this manuscript and Stephanie Parver for assistance with manuscript preparation.

Contributors ATA contributed to the conception and design, analysis and interpretation of data, drafted the initial manuscript, and approved the final manuscript submitted. AMS contributed to the conception and design, analysis and interpretation of data, reviewed and revised the manuscript, and approved the final manuscript submitted.

Competing interests None declared.

Provenance and peer review Not commissioned; externally peer reviewed.

Data sharing statement There are no unpublished data for development and implementation.

REFERENCES
1 Richardson WS, Wilson MC, Nishikawa J, et al. The well-built clinical question: a key to evidence-based decisions. ACP Journal Club 1995;123:A12–13.
2 Chumpitazi CE, Barrera P, Macias CG. Diagnostic accuracy and therapeutic reliability in pediatric emergency medicine: the role of evidence-based guidelines. Clin Pediatr Emerg Med 2011;12:113–20.
3 Gidding SS, Daniels SR, Kavey RE. Developing the 2011 integrated pediatric guidelines for cardiovascular risk reduction. Pediatrics 2012;129:e1311–19.
4 Grol R, Zwaard A, Mokkink H, et al. Dissemination of guidelines: which sources do physicians use in order to be informed? Int J Qual Health Care 1998;10:135–40.
5 Eccles M, Mason J. How to develop cost-conscious guidelines. Health Technol Assess 2001;5:1–69.
6 Bergman DA. Evidence-based guidelines and critical pathways for quality improvement. Pediatrics 1999;103(1 Suppl E):225–32.
7 Bero LA, Grilli R, Grimshaw JM, et al. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. The Cochrane Effective Practice and Organization of Care Review Group. BMJ 1998;317:465–8.
8 Cabana MD, Rand CS, Powe NR, et al. Why don’t physicians follow clinical practice guidelines? A framework for improvement. JAMA 1999;282:1458–65.
9 Curran JA, Grimshaw JM, Hayden JA, et al. Knowledge translation research: the science of moving research into policy and practice. J Contin Educ Health Prof 2011;31:174–80.
10 Gagliardi AR, Brouwers MC, Palda VA, et al. How can we improve guideline use? A conceptual framework of implementability. Implement Sci 2011;6:26.
11 Grimshaw J, Eccles M, Thomas R, et al. Toward evidence-based quality improvement. Evidence (and its limitations) of the effectiveness of guideline dissemination and implementation strategies 1966–1998. J Gen Intern Med 2006;21(Suppl 2):S14–20.
12 Grimshaw JM, Thomas RE, MacLennan G, et al. Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technol Assess 2004;8:iii–iv.
13 Brouwers MC, Kho ME, Browman GP, et al. AGREE II: advancing guideline development, reporting and evaluation in health care. J Clin Epidemiol 2010;63:1308–11.
14 American Academy of Pediatrics Steering Committee on Quality Improvement and Management. Classifying recommendations for clinical practice guidelines. Pediatrics 2004;114:874–7.
15 Pathman DE, Konrad TR, Freed GL, et al. The awareness-to-adherence model of the steps to clinical guideline compliance. The case of pediatric vaccine recommendations. Med Care 1996;34:873–89.
16 Donabedian A. The quality of medical care. Science 1978;200:856–64.
17 Institute of Medicine Committee on Quality of Health Care in America. Crossing the quality chasm: a new health system for the 21st century. Washington DC: National Academy Press, 2001.
18 Adekoya N. Patients seen in emergency departments who had a prior visit within the previous 72 h: National Hospital Ambulatory Medical Care Survey, 2002. Public Health 2005;119:914–18.
19 Benneyan JC, Lloyd RC, Plsek PE. Statistical process control as a tool for research and healthcare improvement. Qual Saf Health Care 2003;12:458–64.
20 Benneyan JC. Use and interpretation of statistical quality control charts. Int J Qual Health Care 1998;10:69–73.
21 Benneyan JC. Statistical quality control methods in infection control and hospital epidemiology, part I: introduction and basic theory. Infect Control Hosp Epidemiol 1998;19:194–214.
22 Benneyan JC. Statistical quality control methods in infection control and hospital epidemiology, part II: chart use, statistical properties, and research issues. Infect Control Hosp Epidemiol 1998;19:265–83.
23 Benneyan JC, Villapiano A, Katz N, et al. Illustration of a statistical process control approach to regional prescription opioid abuse surveillance. J Addict Med 2011;5:99–109.
24 Boggs PB, Hayati F, Washburne WF, et al. Using statistical process control charts for the continual improvement of asthma care. Jt Comm J Qual Improv 1999;25:163–81.
25 Curran ET, Benneyan JC, Hood J. Controlling methicillin-resistant Staphylococcus aureus: a feedback approach using annotated statistical process control charts. Infect Control Hosp Epidemiol 2002;23:13–18.
26 Finison LJ, Finison KS. Applying control charts to quality improvement. J Healthc Qual 1996;18:32–41.
27 Laffel G, Luttman R, Zimmerman S. Using control charts to analyze serial patient-related data. Qual Manag Health Care 1994;3:70–7.
28 Sellick JA Jr. The use of statistical process control charts in hospital epidemiology. Infect Control Hosp Epidemiol 1993;14:649–56.
29 Akenroye AT, Baskin MN, Samnaliev M, et al. Impact of a bronchiolitis guideline on ED resource use and cost: a segmented time-series analysis. Pediatrics 2014;133:e227–34.



30 Guse SE, Neuman MI, O’Brien M, et al. Implementing a guideline to improve management of syncope in the emergency department. Pediatrics 2014;134:e1413–21.
31 Nigrovic LE, Stack AM, Mannix RC, et al. Quality improvement effort to reduce cranial CTs for children
