
Article

Partnering With VA Stakeholders to Develop a Comprehensive Patient Safety Data Display: Lessons Learned From the Field

American Journal of Medical Quality 1-9. © The Author(s) 2014. Reprints and permissions: sagepub.com/journalsPermissions.nav. DOI: 10.1177/1062860614560214. ajmq.sagepub.com

Qi Chen, MD, PhD1, Marlena H. Shin, JD, MPH1, Jeffrey A. Chan, BS1, Jennifer L. Sullivan, PhD1, Ann M. Borzecki, MD, MPH2,3,4, Michael Shwartz, PhD1,5, Peter E. Rivard, PhD1,6, Jonathan Hatoun, MD, MPH7*, and Amy K. Rosen, PhD1,4

Abstract

Health care systems are increasingly burdened by the large number of safety measures currently being reported. Within the Veterans Administration (VA), most safety reporting occurs within organizational silos, with little involvement by the frontline users of these measures. To provide a more integrated picture of patient safety, the study team partnered with multiple VA stakeholders and engaged potential frontline users at 2 hospitals to develop a Guiding Patient Safety (GPS) tool. The GPS is currently in its fourth generation; once approval is obtained from senior leadership, implementation will begin. Stakeholders were enthusiastic about the GPS's user-friendly format, comprehensive content, and potential utility for improving safety. These findings suggest that stakeholder engagement is a critical first step in the development of tools that are more likely to be used by frontline users. Policy makers and researchers may consider adopting this innovative partnered-research model when developing future national initiatives to deliver meaningful programs to frontline users.

Keywords: engage frontline stakeholders, patient safety, quality improvement, tool development

Patient safety continues to be an important national concern. Since the seminal 1999 Institute of Medicine report, To Err Is Human,1 numerous initiatives have been implemented to measure, report, and reduce medical errors. The Centers for Medicare & Medicaid Services currently reports selected patient safety measures on the Medicare Hospital Compare Web site for the Hospital Inpatient Quality Reporting Program.2 These include complication rates after hip/knee replacement, selected Agency for Healthcare Research and Quality (AHRQ) Patient Safety Indicators (PSIs), and hospital-acquired infection (HAI) rates collected by the Centers for Disease Control and Prevention's National Healthcare Safety Network.

Patient safety also is a top priority within the Veterans Health Administration (VA). The National Center for Patient Safety (NCPS) assigns patient safety managers to lead local safety activities at each of its 151 VA hospitals.3 These patient safety managers track safety events through collection and analysis of voluntary incident reports, which are used to guide quality improvement activities at individual hospitals.4

In addition, other types of safety measures and reports are disseminated widely throughout the VA, including selected PSIs,5 HAIs, and hospital-acquired pressure ulcers (HAPUs).5-11 The VA Surgical Quality Improvement Program (VASQIP), formerly "NSQIP," distributes rates of postoperative complications (ie, morbidity) and mortality to each VA hospital (Table 1).12

1 VA Boston Healthcare System, Boston, MA
2 Bedford VAMC, Bedford, MA
3 Boston University School of Public Health, Boston, MA
4 Boston University School of Medicine, Boston, MA
5 Boston University School of Management, Boston, MA
6 Suffolk University Sawyer Business School, Boston, MA
7 Boston Medical Center, Boston, MA
*The author's contribution to this work was made within the scope of his employment with the VA.

Corresponding Author: Qi Chen, MD, PhD, VA Boston Healthcare System, Center for Healthcare Organization and Implementation Research (CHOIR), 150 S Huntington Avenue, Boston, MA 02130. Email: [email protected]


Table 1. Descriptions of Hospital Patient Safety Measures.

Incident Reports (IRs)
• Definition: Voluntary reports on patient safety events and "near misses"
• Data collection: Any hospital staff may file a report through their hospital-specific IR system
• Data reporting: Patient safety managers generate reports on a routine basis within their hospitals (frequency varied between study sites)
• Data use: Patient safety managers oversee the data and use them to guide quality improvement (QI)
• Data source for the GPS tool: Hospital patient safety managers and/or quality managers

Hospital-Acquired Infections (HAIs)
• Definition: Specific types of infections (ie, catheter-associated urinary tract infection, central line–associated bloodstream infection, ventilator-associated pneumonia, methicillin-resistant Staphylococcus aureus) acquired in hospital
• Data collection: Hospital infection control departments collect data
• Data reporting: Hospital infection control departments report data to the VA Inpatient Evaluation Center (IPEC); IPEC generates reports to compare HAI rates among nationwide VA hospitals
• Data use: Hospital infection control departments use the data to guide QI
• Data source for the GPS tool: IPEC

Agency for Healthcare Research and Quality Patient Safety Indicators (PSIs)
• Definition: A set of 13 computerized algorithms to detect potentially preventable patient safety events
• Data collection: IPEC calculates the rates of PSIs using administrative data
• Data reporting: IPEC generates reports to compare PSI rates among nationwide VA hospitals
• Data use: Hospital patient safety managers, quality managers, and senior leadership review the IPEC reports but usually do not use them to guide QI
• Data source for the GPS tool: IPEC

VA Surgical Quality Improvement Program (VASQIP) complications
• Definition: A set of postoperative complications occurring within 30 days after selected high-volume, complex surgeries
• Data collection: VASQIP nurses at each VA hospital collect data based on chart review
• Data reporting: VASQIP nurses report data to the VA National Surgery Office (NSO); NSO calculates hospital risk-adjusted mortality and morbidity rates and disseminates reports that allow comparison of surgical quality across hospitals
• Data use: Surgical departments use VASQIP data to guide QI in surgery
• Data source for the GPS tool: Hospital VASQIP nurses

Hospital-Acquired Pressure Ulcers (HAPUs)
• Definition: Pressure ulcers stage ≥2 acquired in hospital
• Data collection: Hospital nursing departments collect data
• Data reporting: Nursing departments report data to the VA Nursing Outcomes Database (VANOD); VANOD generates reports that allow comparison of HAPU rates across hospitals
• Data use: Nursing departments use VANOD data to guide QI
• Data source for the GPS tool: VANOD


Despite the "top-down" nature of these initiatives, which originate from VA management directives and policies, patient safety reporting in the VA is not well coordinated. Numerous reports are distributed by various VA offices, while patient safety initiatives often occur within organizational silos. For example, patient safety managers use incident reports to guide quality improvement activities, while chiefs of surgery rely on VASQIP reports, because each group is most familiar with its own measures. The lack of a unified overview of patient safety may result in inaccurate estimates of safety events or missed hospital-wide safety problems.13,14 Furthermore, because most safety initiatives and reports emanate from the top, buy-in from potential users of these reports is frequently missing. As a result, some reported patient safety measures, such as PSIs, are not used for quality improvement.

To bridge these organizational silos in patient safety measurement, the study team partnered with national, regional, and local stakeholders at 2 VA hospitals to develop a comprehensive patient safety data display, the Guiding Patient Safety (GPS) tool, that could provide a more meaningful and complete picture of safety events at individual hospitals. Based on a literature review15-18 and the study team's prior work,19,20 the team hypothesized that an alternative ("bottom-up") approach to safety reporting (ie, incorporating the perspectives of local stakeholders into GPS report development) would promote strong engagement of VA stakeholders in the development process, lead to better understanding of a broad spectrum of safety measures, and enhance buy-in of the GPS tool from potential users.

The purpose of this study is to describe the iterative development of the GPS tool and, in particular, to highlight the engagement of VA stakeholders in this process, which the study team believes is a critical first step toward implementing the GPS and, ultimately, using it to improve patient safety in the VA. The specific aims are to (1) assess the types of safety measures used by each hospital for measurement, tracking, and quality improvement; (2) understand facilitators and barriers to current safety measurement; (3) develop, customize, and refine the GPS for each hospital based on local stakeholder feedback; and (4) evaluate the utility of the GPS for safety and quality improvement activities from the perspective of potential users. The innovative model of partnered research described in this study may provide useful information to researchers who are developing new quality improvement initiatives both in and outside the VA.

Methods

Overview

To develop and customize the GPS, the study team partnered with multiple stakeholders through a series of steps

at each hospital: (1) baseline interviews with local stakeholders to inform initial GPS development; (2) collection of patient safety data for incorporation into the GPS; (3) development and dissemination of an initial version of the GPS; (4) follow-up interviews with selected local stakeholders to obtain feedback on the GPS; (5) revision and customization of the GPS based on follow-up interview data; and (6) dissemination of revised versions of the GPS to each site. Figure 1 demonstrates the partnerships involved in these steps. Approval was obtained from the institutional review board at VA Boston Healthcare System.

Partnership With Key VA Stakeholders

The local partners included 2 acute care VA hospitals in the Northeast that provide a range of surgical and medical care: an urban, highly complex, tertiary care teaching hospital, and a rural, intermediate-complex, tertiary care teaching hospital. Senior leadership at both hospitals was supportive of the study. The data necessary for the GPS were obtained directly from the patient safety or quality managers and VASQIP nurses at each site. At the national and regional levels, the partners provided approvals and support for the study. Specifically, NCPS funded this study through the Patient Safety Center of Inquiry on Measurement to Advance Patient Safety. NCPS also facilitated dissemination of draft versions of the GPS nationwide through teleconferences and distribution to other partners. The regional patient safety officer, quality management officer, and director also endorsed this study. Other VA national offices (eg, the VA's Inpatient Evaluation Center [IPEC]) provided additional data needed for the GPS.

Checklist and Categorization of Patient Safety Measures

To better understand the patient safety measures commonly used by VA hospitals, the study team conducted an inventory of the currently used measures. The team obtained several measures directly from 2 national VA reports: the VA Strategic Analytics for Improvement and Learning (SAIL)10 report and the Linking Knowledge and Systems (LinKS) report.11 Additional safety-related measures were found on the IPEC21 and the VA Support Service Center22 Web sites. To guide initial GPS development, the study team created a checklist of the most commonly reported measures and disseminated this to both hospitals prior to baseline interviews (online Appendix 1, available at http://ajmq.sagepub.com/supplemental). Using 2 classification systems (ie, the AHRQ Common Formats23 and the World Health Organization's International Classification for Patient Safety [ICPS]24), the study team grouped these measures into 13 categories: 8 from the AHRQ Common Formats, 2 from the ICPS, and 3 miscellaneous (those not fitting into either classification system) (online Appendix 2, available at http://ajmq.sagepub.com/supplemental).
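As an illustration of how such a mapping can be operationalized, the sketch below (Python; not the study's code) routes each checklist measure to a standardized category, with anything fitting neither classification system falling into a miscellaneous bucket. The measure names and category assignments shown are hypothetical stand-ins; the study's actual 13-category mapping appears in online Appendix 2.

```python
# Illustrative sketch of measure categorization (not from the study).
# Keys and category names below are hypothetical examples only.
MEASURE_TO_CATEGORY = {
    "fall with injury": "Fall",
    "catheter-associated uti": "Healthcare-Associated Infection",
    "hapu stage >=2": "Pressure Ulcer",
    "psi 12 perioperative pe/dvt": "Surgery or Anesthesia",
}

def categorize(measure: str) -> str:
    # Measures fitting neither classification system fall into a
    # miscellaneous bucket, mirroring the article's "3 miscellaneous."
    return MEASURE_TO_CATEGORY.get(measure.strip().lower(), "Miscellaneous")

assert categorize("Fall with injury") == "Fall"
assert categorize("unmapped event type") == "Miscellaneous"
```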


Interview Protocol Development

Separate interview protocols were developed for (1) baseline assessment of current patient safety measurement activities and (2) follow-up interviews soliciting feedback to guide revisions in subsequent GPS iterations. Open-ended interview questions in specific content areas were derived from the literature on diffusion of innovation within organizations21 or adapted from prior studies.19,20

Baseline Interviews

Baseline interviews involved more stakeholders than follow-up interviews because the goal was to inform the initial GPS tool development from multiple perspectives. The study team conducted 1-hour, semistructured telephone interviews with 2 patient safety managers, the quality manager, a nurse manager on the hospital’s quality management team, the VASQIP nurse, and the chief of staff at one hospital. At the other hospital, the team interviewed the patient safety manager, the quality manager, the VASQIP nurse, an analyst on the hospital’s quality and safety management team, and the nurse executive. During the interviews, the team confirmed which measures on the checklist were currently used and whether these were used to guide quality improvement. Local stakeholders also were asked to describe any facilitators or barriers to safety measurement. Finally, the study team reviewed each hospital’s current method for disseminating patient safety data and obtained their input on what an ideal data display would include.


Figure 1. Partnerships with national, regional, and local stakeholders throughout the study phases.
[Flowchart summary: the study team developed the measure checklist, the standardized event classification, and the interview protocols; conducted baseline interviews; and then, across three cycles, collected quarterly safety data (FY12Q4 through FY14Q1), developed and disseminated four successive versions of the GPS to the facilities, and conducted follow-up interviews, with quarterly progress reports to NCPS throughout. The GPS was also presented to nationwide regional patient safety officers and hospital patient safety managers.]

Abbreviations: NCPS, VA National Center for Patient Safety; FY12Q4, Fiscal Year 2012 Quarter 4, July 2012-September 2012; FY13Q1, Fiscal Year 2013 Quarter 1, October 2012-December 2012; FY13Q2, Fiscal Year 2013 Quarter 2, January 2013-March 2013; FY13Q3, Fiscal Year 2013 Quarter 3, April 2013-June 2013; FY13Q4, Fiscal Year 2013 Quarter 4, July 2013-September 2013; FY14Q1, Fiscal Year 2014 Quarter 1, October 2013-December 2013.
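For readers unaccustomed to federal fiscal-year notation, the quarters above follow the convention that the fiscal year begins in October. A small illustrative helper (not part of the study) makes the mapping explicit:

```python
# Illustrative helper (not from the study): map a VA fiscal quarter to its
# calendar months. The federal fiscal year starts in October, so FY13Q1
# covers October-December 2012.
MONTHS = ["January", "February", "March", "April", "May", "June", "July",
          "August", "September", "October", "November", "December"]

def fiscal_quarter_months(fy: int, q: int) -> str:
    start = {1: 10, 2: 1, 3: 4, 4: 7}[q]   # first calendar month of the quarter
    year = fy - 1 if q == 1 else fy        # Q1 falls in the prior calendar year
    return f"{MONTHS[start - 1]} {year}-{MONTHS[start + 1]} {year}"

print(fiscal_quarter_months(2013, 1))  # October 2012-December 2012
print(fiscal_quarter_months(2014, 1))  # October 2013-December 2013
```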

Follow-Up Interviews

A total of 3 rounds of 1-hour, semistructured, telephone follow-up interviews were conducted with selected local stakeholders over a 1-year period. The study team interviewed each hospital's patient safety and quality managers because they would be most likely to use the GPS for routine safety/quality monitoring and improvement. Each follow-up interview was designed to obtain stakeholders' suggestions on how to improve the GPS. In the second round of follow-up interviews, additional questions were asked regarding how the GPS facilitates performance measurement and safety/quality improvement activities. In the third round of follow-up interviews, based on patient safety and quality managers' suggestions, additional feedback was obtained on the design and the utility of the GPS from other local stakeholders and potential users of the GPS. At one hospital, these stakeholders included a program analyst in quality management and the chief of infection control; at the other hospital, feedback was obtained from the VASQIP nurse, the associate chief nurse for quality and performance, and the chief of acute nursing services.


GPS Development and Revisions

Quarterly patient safety data were obtained for the GPS from hospital patient safety managers (incident reports), hospital VASQIP nurses (VASQIP complications), and IPEC (PSIs and HAIs) beginning in July 2012 for the first version of the GPS, while quarterly HAPU data were obtained from the VHA Support Service Center Web site beginning in October 2012 for the GPS after its first revision. The first GPS was developed based on available patient safety data, stakeholders' suggestions from the baseline assessment, and the classification systems described previously. After each follow-up interview, the study team compiled new quarterly data and revised the GPS according to the feedback received. The team incorporated as much stakeholder feedback as possible when the necessary data were available. To date, the team has completed the fourth generation of the GPS, which displays data from FY13Q1 through FY14Q1 (October 2012-December 2013).
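As a rough sketch of this compilation step, the quarterly feeds might be pooled as below (Python/pandas). The file names and the quarter/measure/count layout are assumptions for illustration; the study does not describe its data handling at this level of detail.

```python
# Hedged sketch of pooling quarterly safety data from the study's four kinds
# of sources. File names and column layout are assumptions for illustration.
import pandas as pd

SOURCES = {
    "incident_reports": "ir_counts.csv",   # from hospital patient safety managers
    "vasqip": "vasqip_complications.csv",  # from hospital VASQIP nurses
    "psi_hai": "ipec_psi_hai.csv",         # PSIs and HAIs from IPEC
    "hapu": "vssc_hapu.csv",               # HAPUs from the VHA Support Service Center
}

frames = []
for name, path in SOURCES.items():
    df = pd.read_csv(path)                 # expected columns: quarter, measure, count
    df["source"] = name                    # keep provenance for the sources tab
    frames.append(df)

gps = pd.concat(frames, ignore_index=True)
# Quarter-by-quarter counts per measure: the shape behind the GPS trend lines.
trend = gps.pivot_table(index=["source", "measure"], columns="quarter",
                        values="count", aggfunc="sum", fill_value=0)
```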

Data Analysis

The study team reviewed the detailed notes taken during all interviews and identified major themes using content analysis. The team analyzed the evidence for a priori specified content areas19-21 to help guide the continuous development and revision of the GPS (eg, development of an ideal data display, leadership involvement in patient safety). Because this study aimed to describe the process used to develop the GPS tool, specific outcomes were selected that could measure the success of this development process. These included buy-in from potential users, engagement of stakeholders throughout the development process, and perceptions of the GPS tool.

Results

Aim 1: Assess the Types of Safety Measures Used by Each Hospital for Measurement, Tracking, and Quality Improvement

Stakeholders reported that the checklist was comprehensive and accurately reflected the measures available for tracking and quality improvement purposes. Despite the plethora of measures reported and available to hospitals, incident reports were by far the most widely used method to detect safety events at each hospital. Patient safety managers and quality managers at both sites noted routine use of incident reports for monitoring safety and targeting areas for quality improvement.

Stakeholders at both hospitals reported that administrative data-based measures, such as the PSIs, were rarely used, despite awareness that these measures were available in the LinKS and SAIL reports. Although routinely reviewed by VASQIP nurses, VASQIP outcomes were not frequently used by patient safety managers and quality managers at either site because these were considered to be under the purview of the surgery department. However, some collaboration occurred when needed between the VASQIP nurses, patient safety managers, and quality managers. See Table 1 for additional descriptions of each hospital's patient safety measures.

Aim 2: Understand Facilitators and Barriers to Current Patient Safety Measurement

Stakeholders at both hospitals reported several facilitators to patient safety measurement. For example, patient safety and quality management departments were well integrated, and most employees in these areas were very involved in patient safety measurement activities. Stakeholders felt hospital leadership was invested in patient safety, quality management, and VASQIP quality improvement activities. However, there also were several barriers noted to safety measurement. Manual data collection and reporting processes were often seen as burdensome, and stakeholders reported being overwhelmed by the number of reports they had to review. Additional reported barriers included difficulties with report dissemination processes, limited data accessibility to relevant users, poor presentation of data displays, and lack of trust in the validity of the data, particularly the PSIs.

Aim 3: Develop, Customize, and Refine the GPS for Each Hospital

Stakeholders provided many initial recommendations for the type of content and format they wanted to see depicted in an ideal patient safety data display (Table 2). Suggestions focused on simple graphic displays (eg, bar charts, pie charts), trends in event counts, use of color schemes to portray different measures, and customization to meet users' needs. Based on this feedback, the study team developed the GPS as an Excel workbook consisting of 7 tabs with a wide range of safety measures derived from multiple data sources. This format provided stakeholders with a tool that could easily be disseminated to others. Although the measures reported in the GPS were consistent across both hospitals, the incident report section of the GPS was customized for each hospital because of differences in how they categorized these events. Table 3 provides a description of the information included in each of the sections in the GPS.
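The workbook format itself is straightforward to reproduce. The following is a minimal sketch, assuming pandas is used to write the sheets (the article describes the Excel format but not how it was produced); the tab names follow Table 3, and the tiny data frames are placeholders for the real quarterly data.

```python
# Minimal sketch: write GPS-style tabs to one Excel workbook. Tab names
# follow Table 3; the small DataFrames are placeholders, not study data.
import pandas as pd

tabs = {
    "1 Introduction": pd.DataFrame({"Section": ["Purpose", "How to navigate"]}),
    "2 AHRQ Common Formats": pd.DataFrame(
        {"Category": ["Fall", "Other"], "FY13Q1": [12, 4], "FY13Q2": [9, 6]}),
    "4 PSIs": pd.DataFrame(
        {"PSI": ["PSI 12"], "Events": [3], "Eligible": [410],
         "Observed rate": [3 / 410]}),
}

# Excel limits sheet names to 31 characters, hence the slice.
with pd.ExcelWriter("gps_example.xlsx") as writer:
    for name, df in tabs.items():
        df.to_excel(writer, sheet_name=name[:31], index=False)
```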


Table 2. Recommendations for Ideal Data Display Based on Baseline Interviews.

Tailor the report for specific audience
• Respond to needs of customers by groups
• Define what data sets are important

Show a complete picture using data from multiple sources in one display
• Departments (usually) generate reports in silos
• Be able to readily click on individual types/sources of events to see the details
• Have Web links to all measures in one display

Present data visually in a simple and graphic format to increase use
• Bar graphs
• Tables with trend lines
• Color-code the data

Allow for "drill down"
• Type of incidents, units, wards, specialty, date, location, and individual patient information

Allow for comparisons within and across hospitals
• Show comparative data with the ability to drill down to multiple levels (eg, national, VISN, hospital, and unit levels)

Present data display in a format that can be disseminated easily
• Helpful to present data in electronic format

Abbreviation: VISN, Veterans Integrated Service Network.

As for content, the study team categorized safety events using the AHRQ Common Formats and ICPS, included definitions of each measure set, and examined duplicate events captured by different data sources to avoid potential double-counting. For the first version of the GPS, the team incorporated safety measures that were used frequently (eg, incident reports) as well as measures used less often (eg, PSI counts), in order to familiarize selected local stakeholders (ie, patient safety and quality managers) with standardized measures used in national performance measurement.

Overall, the patient safety and quality managers were enthusiastic about the GPS, noting that the format was "user-friendly" and "easy to navigate," while the content was "comprehensive" and "understandable." They reported that the GPS represented a "high-level synopsis of patient safety" at their hospital, pulling together multiple data sources with "critical information in one place." The GPS gave them more knowledge about different measures and a more comprehensive picture of their hospital's patient safety than they had previously. Although some of the changes requested by selected local stakeholders during the follow-up interviews were stylistic (eg, font size, colors), other suggestions resulted in major changes to the GPS (Table 3). New data were added for comparative purposes, including measures of mortality and HAPUs, observed PSI rates (as opposed to counts), and national/regional data on PSIs, HAIs, and HAPUs, making it easier to compare a given hospital's performance with that of similar hospitals. Additional graphics, including trend lines for specific measures, also were integrated.
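The double-counting check described above can be viewed as record-level deduplication. In the hedged sketch below, the records are invented and the match rule (same patient, date, and category equals same event) is an assumption; the article does not specify how duplicates were identified.

```python
# Hedged sketch of the double-counting check. Records are invented, and the
# match rule is an assumption, not the study's documented method.
import pandas as pd

events = pd.DataFrame({
    "patient_id": [101, 101, 102],
    "event_date": ["2013-02-03", "2013-02-03", "2013-02-10"],
    "category": ["Pressure Ulcer", "Pressure Ulcer", "Fall"],
    "source": ["incident_report", "vanod", "incident_report"],
})

# The same pressure ulcer appears in both the incident-report and VANOD
# feeds; counting rows naively would inflate the total.
unique_events = events.drop_duplicates(
    subset=["patient_id", "event_date", "category"])
print(len(events), "records ->", len(unique_events), "unique events")  # 3 -> 2
```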

However, some of the stakeholders' suggestions (eg, comparing with similar peer groups) could not be incorporated because the necessary data were unavailable. Local stakeholders reported that the changes made to the GPS helped improve their understanding of the content and graphic displays. For example, they initially requested that a bar graph displaying the same data as the spider graph be added to Tab 2 of the GPS because they were not familiar with interpreting spider graphs. However, by the second follow-up interview, they felt more comfortable interpreting the spider graph, based on the instructions that had been added to the revised GPS. In the third round of interviews, all stakeholders confirmed that they felt comfortable explaining the content of the GPS (including the aforementioned "difficult" spider graph) to others.
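For readers unfamiliar with the format, a spider (radar) graph plots each event category on its own axis arranged around a circle. The sketch below (matplotlib, hypothetical counts) draws the same data both ways, echoing the bar graph that was added to complement the Tab 2 spider graph.

```python
# Illustrative sketch: the same hypothetical category counts drawn as a
# spider (radar) graph and as the complementary bar graph added to Tab 2.
import numpy as np
import matplotlib.pyplot as plt

categories = ["Falls", "Medication", "Pressure ulcers", "HAIs", "Surgical"]
counts = [12, 8, 5, 3, 6]

angles = np.linspace(0, 2 * np.pi, len(categories), endpoint=False).tolist()
closed_counts = counts + counts[:1]   # repeat the first point to close the loop
closed_angles = angles + angles[:1]

fig = plt.figure(figsize=(10, 4))
spider = fig.add_subplot(1, 2, 1, polar=True)
spider.plot(closed_angles, closed_counts)
spider.fill(closed_angles, closed_counts, alpha=0.25)
spider.set_xticks(angles)
spider.set_xticklabels(categories)
spider.set_title("Spider graph")

bars = fig.add_subplot(1, 2, 2)
bars.bar(categories, counts)
bars.set_ylabel("Event count")
bars.set_title("Same data as a bar graph")

fig.tight_layout()
fig.savefig("tab2_graphs.png")
```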

Aim 4: Evaluate the Utility of the GPS for Safety and Quality Improvement Activities

The patient safety managers and quality managers acknowledged the GPS's potential utility for safety and quality improvement activities. Specifically, they felt that the GPS could be useful for benchmarking, examining trends in safety events, and pinpointing areas in which further "drill down" was necessary. By "drill down," stakeholders meant recognizing a problem in a larger category and exploring it in more depth with the additional data available in the GPS. The GPS also could be useful in targeting quality improvement activities because it provided easy access to information from multiple data sources that otherwise may not have been aggregated or even seen at all (eg, the VASQIP complications).
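In data terms, "drill down" amounts to re-aggregating the same event records at a finer level. A small sketch with invented unit-level records:

```python
# Small sketch of "drill down" on invented unit-level incident records:
# first the hospital-level trend by category, then the same falls broken
# out by unit to locate where an increase comes from.
import pandas as pd

incidents = pd.DataFrame({
    "quarter": ["FY13Q1", "FY13Q1", "FY13Q2", "FY13Q2", "FY13Q2"],
    "category": ["Fall", "Fall", "Fall", "Fall", "Medication"],
    "unit": ["4 East", "ICU", "4 East", "4 East", "ICU"],
})

print(incidents.groupby(["quarter", "category"]).size())   # hospital-level view
falls = incidents[incidents["category"] == "Fall"]
print(falls.groupby(["quarter", "unit"]).size())           # unit-level drill down
```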


Table 3. Descriptions of the Content and Format of the First, Second, and Third Versions of the GPS Tool.

Tab 1: Introduction
• First version: Introduction to the GPS; instructions on how to use the tool (eg, how to print the GPS, how to navigate between tabs, how to read comments)
• Major changes in second and third versions: Added external links to Web sites explaining various measures: AHRQ homepage, AHRQ Common Formats, National Center for Patient Safety (NCPS), VA Inpatient Evaluation Center (IPEC), VASQIP home page, VHA Support Service Center (VSSC), and VA Nursing Outcomes Database (VANOD)

Tab 2: Agency for Healthcare Research and Quality (AHRQ) Common Formats
• First version: An overview of the hospital's patient safety events (ie, the overall count of safety events, presented as a table and a trend line); the counts, with percentages, of specific types of events in standardized categories, presented as tables, trend lines, and a spider graph
• Major changes: Elaborated on the definitions of the categories in the classification system; added subcategories under "Other" to elaborate the events grouped into this category; added "Death" captured in VASQIP mortality data and incident reports; elaborated instructions on how to read the spider graph; added a bar graph to complement the spider graph; added a separate trend line graph for low-frequency events

Tab 3: Sources of Event Detected
• First version: The overall count of safety events by data source, presented as a table and a trend line; the count and percentages of events captured by a specific data source, presented as tables and trend lines
• Major changes: No major changes

Tab 4: Patient Safety Indicators (PSIs)
• First version: The overall count and the specific counts of 13 selected PSI events, presented as a table
• Major changes: Added the counts of eligible patients and the observed rates of the individual PSIs; added national and regional comparative data; added figures to present PSI comparative data

Tab 5: VA Surgical Quality Improvement Program (VASQIP)
• First version: The overall count and the specific counts of VASQIP postoperative complications, presented as a table and a trend line
• Major changes: Added VASQIP mortality data for comparative and benchmarking purposes

Tab 6: Incident Reports
• First version: The overall count and the specific counts of incident reports, presented as a table and a trend line
• Major changes: Added a table and trend lines for selected incident reports in the GPS for hospital A; added a "drill-down" function to unit-level incident report data in the GPS for hospital B

Tab 7: Hospital-Acquired Infections (HAIs)
• First version: The overall hospital count of HAIs, presented as a table; the specific counts of catheter-associated urinary tract infection, central line–associated bloodstream infection, ventilator-associated pneumonia, and methicillin-resistant Staphylococcus aureus infections, presented as a table
• Major changes: Added a "drill-down" function to unit-level HAI data; added national and regional comparative HAI data; added trend lines for total and specific types of HAIs

Tab 8: Hospital-Acquired Pressure Ulcers stage 2 and higher (HAPUs)
• First version: Not in the first GPS
• Major changes: Added a new Tab 8 for HAPUs; added a "drill-down" function to unit-level HAPU data; added national and regional comparative HAPU data; added a trend line for total HAPUs

Abbreviations: AHRQ, Agency for Healthcare Research and Quality; GPS, Guiding Patient Safety; VA, Veterans Affairs; VASQIP, VA Surgical Quality Improvement Program; VHA, Veterans Health Administration.

Patient safety managers also became increasingly interested in other safety measures with each iteration of the GPS tool, including some of the administrative data-based measures used for national and regional benchmarking (ie, PSIs). Other local stakeholders who were interviewed similarly agreed that the GPS could be useful in various ways.

For example, one stakeholder mentioned that the GPS could be used to improve the transparency of their performance and guide quality improvement activities; another stakeholder suggested that the GPS could be used to track performance, especially when evaluating the effect of an intervention on quality and safety performance over time.


Discussion

This study describes the development of the GPS tool through an iterative process that engaged a range of VA stakeholders at 2 VA hospitals. This innovative model of partnered research supported the successful development of the GPS tool, as evidenced by strong buy-in from potential users, continued engagement of stakeholders throughout the development process, and their positive feedback on the GPS.

The GPS was developed through an iterative process, allowing potential users to actively engage in tool development at each step along the way, an element critical for tool development.17,18 Specifically, the study team encouraged ongoing stakeholder feedback in order to capture new and more refined suggestions after each revised version of the GPS was disseminated to the local audience. Presenting data in different ways gave users a better understanding of graphic displays they were previously unfamiliar with (eg, spider graphs) and of numerical presentations (eg, PSI rates). It also increased their interest in using measures that often facilitate benchmarking, such as administrative data-based measures like the PSIs.

The GPS was purposely developed using a "bottom-up" approach based on partnership with local patient safety and quality managers. This enabled the study team to address some of the safety measurement barriers mentioned in the baseline interviews, such as stakeholders feeling overwhelmed by too many reports. It also allowed the unique needs of individual hospitals to be incorporated and the GPS to be customized in order to maximize the tool's usefulness at each hospital. Frequently, "top-down" initiatives do not incorporate the perspectives of local stakeholders, and although they may lead to widespread dissemination, actual implementation is often much less successful.22-25

By conducting 4 rounds of interviews, the study team was able to obtain enough feedback to finalize the GPS tool. However, prior to actual implementation of the tool, the team plans to obtain additional feedback and buy-in from senior leadership at each of the participating hospitals. This feedback and support is necessary if hospitals are to successfully transfer production of the GPS from its development phase to routine use in operations. Strong support from senior leadership will be required, particularly for smaller hospitals that may lack the resources needed for the transition, such as the staff needed to maintain the GPS tool. Once the GPS is implemented, its success in improving patient safety will be assessed.

Although the findings of this study are limited to experiences at 2 VA hospitals and interviews with a relatively small number of stakeholders, they are consistent with the literature on the importance of engaging stakeholders in any health care setting.17,18 Furthermore, although some of the specific findings pertain only to the VA, the lessons learned, such as the importance of applying a "bottom-up" approach to obtain buy-in from potential users in tool development, should be generalizable to other health care settings as well.

Conclusion

The experiences thus far support the underlying hypothesis of this study: engaging with stakeholders can lead to the development of tools that are more likely to be understood, supported, and ultimately implemented by frontline users. Policy makers and researchers who are developing national initiatives should actively engage both local and national stakeholders in order to deliver meaningful programs that are likely to be implemented in real-world practices.

Acknowledgments

Special thanks to Kamal Itani, MD, and Steven Simon, MD, MPH, for leadership on the Patient Safety Center of Inquiry; Enzo Yaksic, BS, for administrative support and literature review; and Mary Greenan, MPH, for literature review.

Declaration of Conflicting Interests

The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: Department of Veterans Affairs, Veterans Health Administration, National Center for Patient Safety, the Patient Safety Center of Inquiry on Measurement to Advance Patient Safety, #XVA 68-023.

References

1. Institute of Medicine. To Err Is Human: Building a Safer Health System. Washington, DC: National Academies Press; 1999.
2. Centers for Medicare & Medicaid Services. Medicare Hospital Compare. http://www.medicare.gov/hospitalcompare/search.html?AspxAutoDetectCookieSupport=1. Accessed October 28, 2014.
3. US Department of Veterans Affairs. VA National Center for Patient Safety. http://www.patientsafety.va.gov/. Accessed March 2012.
4. Veterans Health Administration. VHA national patient safety improvement handbook (VHA HANDBOOK 1050.01). http://www1.va.gov/vhapublications/ViewPublication.asp?pub_ID=2389. Published March 4, 2011. Accessed October 28, 2014.
5. Agency for Healthcare Research and Quality. Patient Safety Indicators overview. http://www.qualityindicators.ahrq.gov/modules/psi_overview.aspx. Accessed March 2014.
6. US Department of Veterans Affairs. Public health surveillance and research. http://www.publichealth.va.gov/about/pubhealth/surveillance-research.asp. Accessed March 2014.
7. Veterans Health Administration. Prevention of pressure ulcers (VHA HANDBOOK 1180.02). http://www1.va.gov/vhapublications/ViewPublication.asp?pub_ID=2422. Published July 1, 2011. Accessed October 28, 2014.
8. US Department of Veterans Affairs. VA Hospital Compare. http://www.hospitalcompare.va.gov/. Accessed December 15, 2010.
9. Department of Veterans Affairs, Veterans Health Administration. 2010 VHA facility quality and safety report. http://www.va.gov/health/docs/HospitalReportCard2010.pdf. Accessed December 3, 2011.
10. Veterans Health Administration. Strategic analytics for improvement and learning (SAIL). http://www.hospitalcompare.va.gov/docs/SAILData.pdf. Accessed October 28, 2014.
11. Veterans Health Administration. LinKS: Linking Knowledge and Systems. http://www.hospitalcompare.va.gov/reports/VISN22_LinKS.pdf. Accessed October 28, 2014.
12. Khuri SF, Daley J, Henderson W, et al. The Department of Veterans Affairs' NSQIP: the first national, validated, outcome-based, risk-adjusted, and peer-controlled program for the measurement and enhancement of the quality of surgical care. National VA Surgical Quality Improvement Program. Ann Surg. 1998;228:491-507.
13. Bates DW, Evans RS, Murff H, Stetson PD, Pizziferri L, Hripcsak G. Detecting adverse events using information technology. J Am Med Inform Assoc. 2003;10:115-128.
14. Murff HJ, Patel VL, Hripcsak G, Bates DW. Detecting adverse events for patient safety research: a review of current methodologies. J Biomed Inform. 2003;36:131-143.
15. Boxwala AA, Dierks M, Keenan M, et al. Organization and representation of patient safety data: current status and issues around generalizability and scalability. J Am Med Inform Assoc. 2004;11:468-478.
16. Levtzion-Korach O, Frankel A, Alcalai H, et al. Integrating incident data from five reporting systems to assess patient safety: making sense of the elephant. Jt Comm J Qual Patient Saf. 2010;36:402-410.
17. Guise JM, O'Haire C, McPheeters M, et al. A practice-based tool for engaging stakeholders in future research: a synthesis of current practices. J Clin Epidemiol. 2013;66:666-674.
18. O'Haire C, McPheeters M, Nakamoto E, et al. Engaging Stakeholders to Identify and Prioritize Future Research Needs (Report No. 11-EHC044-EF). Rockville, MD: Agency for Healthcare Research and Quality; 2011.
19. Rosen A. Validating the patient safety indicators in the VA: a multifaceted approach. http://www.hsrd.research.va.gov/research/abstracts.cfm?Project_ID=2141698259. Accessed October 28, 2014.
20. Shin M. How to interpret and use patient safety indicator (PSI) reports. http://www.hsrd.research.va.gov/research/abstracts.cfm?Project_ID=2141701531. Accessed October 28, 2014.
21. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82:581-629.
22. Blevins D, Farmer MS, Edlund C, Sullivan G, Kirchner JE. Collaborative research between clinicians and researchers: a multiple case study of implementation. Implement Sci. 2010;5:76.
23. Chaudoir SR, Dugan AG, Barr CH. Measuring factors affecting implementation of health innovations: a systematic review of structural, organizational, provider, patient, and innovation level measures. Implement Sci. 2013;8:22.
24. Klein KJ, Sorra JS. The challenge of innovation implementation. Acad Manage Rev. 1996;21:1055-1080.
25. Nilsen P, Stahl C, Roback K, Cairney P. Never the twain shall meet? A comparison of implementation science and policy implementation research. Implement Sci. 2013;8:63.

