
Article

Use of Quality Indicators in Nursing Homes in Victoria, Australia: A Cross-Sectional Descriptive Survey

Journal of Aging and Health, 2014, Vol. 26(5), 824-840. © The Author(s) 2014. Reprints and permissions: sagepub.com/journalsPermissions.nav. DOI: 10.1177/0898264314531619. jah.sagepub.com

Joseph E. Ibrahim, PhD, FRACP1, Liam Chadwick, BEng, MSc, PhD2, Aleece MacPhail, BA, BSc3, Linda McAuliffe, BBSc (Hons), MPsych4, Susan Koch, BA, PhD5, and Yvonne Wells, MPsych, PhD4

Abstract

Objective: This study aimed to characterize the use of mandated quality indicators (QIs) in public sector nursing homes by describing their adherence to established principles of measurement and whether nursing homes respond to QI data to improve care. Method: Data were collected in a descriptive, cross-sectional quantitative study using a confidential survey questionnaire distributed electronically to senior staff in all public sector nursing homes in Victoria, Australia. Results: Staff from 113 of 196 facilities completed the survey (58%). Adherence to principles of measurement was suboptimal, with variation in applying QI definitions and infrequent random audits of data (n = 54, 48%). QI data triggered reviews of individual residents (62%-79%), staff practice (44%-65%), and systems of care (45%-55%). Most facilities (58%-75%) reported that beneficial changes in care occurred as a result of using QIs. Discussion: QI performance data are positively received and used to improve care. Standardization of data collection, analysis, and reporting should strengthen the program's utility.

Keywords: quality indicators, data use, improving care, residential aged care, nursing homes

1Monash University, Melbourne, Victoria, Australia
2National University of Ireland, Galway, Ireland
3Ballarat Health Services, Victoria, Australia
4La Trobe University, Melbourne, Victoria, Australia
5RDNS Research Institute, St Kilda, Victoria, Australia

Corresponding Author: Joseph E. Ibrahim, PhD, FRACP, Professor, Department of Forensic Medicine, Victorian Institute of Forensic Medicine, Monash University, 57-83 Kavanagh Street, Southbank, Victoria 3006, Australia. Email: [email protected]

Introduction

Like other Western nations, Australia has an aging population. Demographic shift and the increased prevalence of chronic diseases pose significant challenges to health care, including the delivery of high-quality long-term care (Access Economics, 2010; Borowski & MacDonald, 2007; Boyd et al., 2005; British Geriatrics Society, 2011). In 2009, 158,885 older Australians were living in long-term care at 2,783 nursing homes,1 at an annual government expenditure exceeding Aus$7.3 billion (Australian Productivity Commission, 2011). Services vary widely in terms of funding, clinical populations, assessment processes, and quality assurance systems (Bruen, 2005). Given the cost, organizational complexity, and potential for adverse events, the utmost attention must be paid to the quality of care of this older population (British Geriatrics Society, 2011; Courtney, O'Reilly, Edwards, & Hassall, 2009).

Optimal health and aged care service delivery requires objective measurement of the processes and outcomes of care (British Geriatrics Society, 2011; Organisation for Economic Co-operation and Development [OECD]/European Commission, 2013; Panzer et al., 2013). The use of quality indicators (QIs) for this purpose is well established in the acute care setting (Panzer et al., 2013). QIs are measurable elements of the process or outcomes of care that are used to bring attention to issues needing further investigation or to alert staff to possible opportunities for improvement; they are not direct measures of performance (Giuffrida, Gravelle, & Roland, 1999; Sheldon, 2005).

Globally, aged care is only just beginning to consider QI use (OECD/European Commission, 2013). The United States leads internationally, with system-wide use of a standard QI set in nursing homes (the Minimum Data Set) that has been implemented nationally over the past decade (Lin & Kramer, 2013).
Early data suggest that the program has improved patient care (Tsan, Davis, Langberg, & Pierce, 2007); however, there is a lack of information on exactly how the Minimum Data Set is used to influence change (Lin & Kramer, 2013). Other OECD and EU countries are increasingly adopting standardized assessment tools; however, quality and availability of data remain limited, and systemic reporting is still developing in most countries (OECD/European Commission, 2013).

Box 1. Public Sector Residential Aged Care Services (PSRACS) Quality Indicators Project.

The Public Sector Residential Aged Care Services (PSRACS) quality indicators (QIs) project is intended to develop and introduce practical, useful measures for aged care in Victoria, Australia (Victorian Government Department of Human Services, 2007). These measures are to be used to assist services to monitor and improve care, with a wide range of potential uses and benefits, by ensuring that
•• targets or standards are being met,
•• services to residents are improving,
•• processes are working well,
•• improvements are occurring as planned,
•• improvements are sustained over time,
•• any changes that need to be made to policies and procedures are occurring, and
•• any additional improvements are made.

Unlike most other Australian state governments, Victoria remains a major provider of residential aged care services. PSRACS focus on small rural communities whose specialist care needs are not being met by other providers. PSRACS are operated under the regulation and funding mechanisms of the Commonwealth government. Since July 2006, PSRACS have been participating in a program that collects, reports, and benchmarks data on a set of QIs that sought to go beyond the minimum Commonwealth aged care accreditation standards. The indicators used are as follows:

QI 1—Prevalence of stage 1 to 4 pressure ulcers
QI 2—Prevalence of falls and fall-related fractures
QI 3—Incidence of use of physical restraint
QI 4—Incidence of residents using nine or more different medicines
QI 5—Prevalence of unplanned weight loss

All PSRACS are required to collect, record, and report QI data to the Victorian Government Department of Human Services on a quarterly basis, with reference to the Resource Manual for Quality Indicators in Public Sector Residential Aged Care Services (Victorian Government Department of Human Services, 2007). The "Accreditation Agency" is an Australian company appointed by the Department of Health and Ageing as the accreditation body under the Aged Care Act 1997 for 3 years, reappointed until June 2013. The Resource Manual provides guidelines for the definition, collection, measurement, and reporting of the quality indicators.
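Quarterly rates behind indicators like these reduce to a simple event-per-exposure calculation. The following is a minimal illustrative sketch only; the denominator of occupied bed days and the per-1,000 scaling are assumptions for illustration, and the Resource Manual defines the official formulas and denominators:

```python
# Illustrative quarterly quality-indicator rate calculation.
# NOTE: the denominator (occupied bed days) and per-1,000 scaling are
# assumed for illustration; they are not taken from the Resource Manual.

def qi_rate(event_count: int, occupied_bed_days: int, per: int = 1000) -> float:
    """Events per `per` occupied bed days for one reporting quarter."""
    if occupied_bed_days <= 0:
        raise ValueError("occupied_bed_days must be positive")
    return event_count / occupied_bed_days * per

# Hypothetical example: 6 falls in a 60-bed facility fully occupied
# for a 91-day quarter.
rate = qi_rate(event_count=6, occupied_bed_days=60 * 91)
print(round(rate, 2))  # 1.1 falls per 1,000 occupied bed days
```

Expressing each indicator against a common exposure denominator is what makes quarterly benchmarking across facilities of different sizes possible.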


In Victoria, Australia, the Public Sector Residential Aged Care Services (PSRACS) QI Project was developed by the Department of Human Services (DHS) to provide practical measures to monitor and improve the quality of care in nursing homes (Victorian Government DHS, 2007; Box 1). The Commonwealth Department of Health and Ageing is also developing a suite of QIs for nursing homes, with a plan to publish these on a public website from mid-2014 (National Aged Care Alliance, 2013). There are limited data on the validity and efficacy of QIs as quality improvement measures in nursing homes (Courtney, O'Reilly, Edwards, & Hassall, 2010; O'Reilly, Courtney, & Edwards, 2007). Debate also exists over QI data collection and data management, the nature and extent of performance reporting, and the extent of response or change to practice as a result of the QI information (Hughes et al., 2004; Mainz, 2003; Mor, Angelelli, et al., 2003). A better understanding of the implementation of the PSRACS QI project and its effect on clinical practice in nursing homes could offer insight into possibilities for improving quality of care in this population (O'Reilly et al., 2007).

Aim

To characterize the use of QIs in nursing homes by describing (a) the level of adherence to principles of measurement and (b) whether nursing homes are responding to QI data to improve quality of care.

Method

Study Design and Setting

A cross-sectional, confidential electronic questionnaire survey was distributed to all public sector–funded nursing home providers operating in metropolitan, regional, and rural areas of Victoria, Australia, as of July 2008. Approval for the project was granted by the La Trobe University, Faculty of Health Sciences, Human Ethics Committee in accordance with the ethical rules stated in the Declaration of Helsinki (World Medical Association, 2008).

Study Instrument

Because no validated measures were available, we developed our own survey tool to collect information on the procedures that nursing homes use to collect data for QI reporting. The tool was designed in consultation with experts in each of the five QI domains and with reference to the QI definitions and instructions for data collection outlined in the Resource Manual (Victorian Government DHS, 2007). The survey questions were developed drawing on guidelines developed by Monash University (McNeill, Evans, Crammond, & Cameron, 2008). Due to logistic constraints, we did not assess the test–retest reliability of the survey items.

Survey

The survey consisted of 51 closed-ended questions in seven sections: respondent details (3 questions), facility characteristics (3 questions), data gathering and collection (8 questions), data analysis (4 questions), data reporting and interpretation (2 questions), response to reporting of indicator data (6 questions), and details of data collection methods for the five specific QIs (25 questions). The items on methods focused on sources of information (e.g., incident reporting or nursing records), whether staff had difficulty with the definitions given in the Resource Manual, and whether assessments in each domain complied with instructions in the Resource Manual (e.g., in defining various actions as restraint, and the inclusion criteria used to count residents' medications). Questions were multiple choice, categorical, or Likert-type (using five-point rating scales ranging from "1 = always" to "5 = never" or "1 = definitely yes" to "5 = definitely not"). The survey was pilot tested by the project team for face validity to ensure relevance and comprehension prior to the development of the final electronic version.

Survey Distribution and Analysis

A link to the survey was emailed to all 196 Victorian nursing homes using a government email list in the 2-month period January to February 2009. The nurse unit manager (NUM) of each nursing home was targeted as the most appropriate recipient. Where an email address was unavailable for the NUM, the email was sent to the director of nursing (DON) or chief executive officer (CEO) with the request that they forward the survey to the NUM if possible. For instances of invalid email addresses, attempts were made to obtain a correct email address, and the survey link was re-sent on the same day as the initial distribution. The email described the project and included an information sheet. Participants were assured of confidentiality. Nursing home staff members voluntarily completed the confidential online survey by following the link provided in the email, with submission considered to imply consent to participate.


A modified Dillman protocol was followed. Reminder emails were sent to all nursing homes at 10 and 11 days following the initial email. The survey closed 14 days after the initial email. Survey data were entered into a spreadsheet and analyzed using the statistical software package SPSS version 14. Descriptive statistics were used to analyze questions with pre-coded response options.
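The descriptive analysis of pre-coded five-point items (tabulating response frequencies and reporting the proportion answering "always" or "most of the time," as in the Results tables) can be sketched as follows. This is an illustrative Python fragment with invented response data; the study itself used SPSS:

```python
# Minimal sketch of descriptive statistics for pre-coded five-point items
# (1 = always ... 5 = never). The response values below are invented,
# purely to illustrate the tabulation reported in the Results tables.
from collections import Counter

responses = [1, 1, 2, 3, 1, 2, 5, 4, 1, 2]  # hypothetical item responses

counts = Counter(responses)
n = len(responses)
# Proportion answering "always" (1) or "most of the time" (2).
always_or_most = sum(counts[code] for code in (1, 2)) / n * 100

print(dict(sorted(counts.items())))  # {1: 4, 2: 3, 3: 1, 4: 1, 5: 1}
print(f"{always_or_most:.0f}%")      # 70%
```

Collapsing the top two scale points into a single "always or most of the time" category is the convention used throughout the tables that follow.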

Results

Response Rate and Respondent Characteristics

Of the 196 nursing homes invited to participate, 113 (58%) completed the survey. Respondents (Table 1) were predominantly facility managers, site managers, or NUMs (70/112; 63%), and over one third of the respondents (41/111; 37%) had been working in public sector nursing homes for more than 10 years. Approximately half of the nursing homes were moderate-sized organizations housing between 31 and 60 residents (55/112; 49%). Just over one half were located in a rural setting (60/111; 54%), and a similar proportion provided high-level care (57/112; 51%).

Personnel Characteristics and Education

The person responsible for collecting indicator data (Table 2) was usually the NUM (39% overall). Division 1 registered nurses (RN Div1s) were the second most common group of data collectors (24% overall), followed by quality managers (16%) and Division 2 registered nurses (RN Div2s; 14%). However, there was some variation across indicators; pressure ulcer data were often collected by RN Div2s (31/112; 28%), and falls data were often collected by quality managers (31/112; 28%). Just over half of the facilities (61/110; 56%) reported that the same person had collected the indicator data for the previous two audits, while 45/110 (41%) indicated that various staff had collected the data and 4/110 (4%) were unsure.

Across all indicators, 91% of respondents reported that education, training, and support were available "always" or "most of the time" for staff recording data, and this proportion differed little across the five indicators. Assessment of familiarity with the QI definitions was less consistent: across all indicators, 83% of respondents reported that QI collectors were assessed "always" (61%) or "most of the time" (22%) on their familiarity with the indicator definitions in the QI reference manual. Of concern, 9% "never" and 4% "rarely" assessed the QI collectors.


Table 1. Characteristics of Survey Respondents (N = 113). Values are n (%).

Position (n = 112)
  Executive director of nursing/manager: 20 (17.9)
  Safety or quality improvement manager: 18 (16.1)
  Facility/site/nurse unit manager: 70 (62.5)
  Other: 4 (3.6)
Years in public sector nursing homes (n = 111)
  2 years or less: 24 (21.6)
  3-5 years: 25 (22.5)
  6-10 years: 21 (18.9)
  More than 10 years: 41 (36.9)
Involvement in quality indicators data process (n = 113)
  Gathering data for at least one indicator: 84 (74.3)
  Analyzing data for at least one indicator: 76 (67.2)
  Interpreting results for at least one indicator: 81 (71.7)
  Writing reports for senior management and others about the indicators: 68 (60.2)
  Responding to questions about reports and/or about the indicators: 91 (80.5)
Facility (n = 111)
  Public sector nursing home, rural: 60 (54.1)
  Public sector nursing home, regional: 14 (12.6)
  Public sector nursing home, metropolitan: 13 (11.7)
  Health service (rural or regional): 20 (18.0)
  Health service (metropolitan): 4 (3.6)
Type of care provided (n = 112)
  High level care beds only: 57 (50.9)
  Low level care beds only: 3 (2.7)
  Both high and low level care beds: 52 (46.4)
Size of facility (n = 112)
  30 residents or less: 36 (32.1)
  31-60 residents: 55 (49.1)
  61 residents or more: 21 (18.8)

Data Quality

Data sources and tools. Data sources for QI data collection varied greatly (Table 2), particularly for pressure ulcers and physical restraint. Pressure ulcer data were sourced from assessment charts (58%), incident reports (17%), and "other" (17%). Physical restraint data were sourced from assessment charts (24%), care plans (22%), medical notes (14%), and "other" (32%). Data tools for QI data collection also varied. Across all indicators, the most commonly used data collection tools were in-house reporting systems, followed by the Resource Manual. The exception to this was for falls, for which incident reporting systems were most often used (84/112; 75%).

Table 2. Quality Indicator: Data Collectors, Sources of Data, and Tools. Values are n (%), in the order QI 1 pressure ulcers, QI 2 falls, QI 3 physical restraint, QI 4 medication, QI 5 weight loss, and average across all indicators.

QI collector (n = 112)
  Nurse unit manager: 38 (33.4), 44 (39.3), 50 (44.6), 40 (35.7), 47 (42), 44 (39)
  Quality manager: 14 (12.5), 31 (27.7), 13 (11.6), 17 (15.2), 15 (13.4), 18 (16)
  RN 1: 21 (18.8), 16 (14.3), 30 (26.8), 43 (38.4), 23 (20.5), 27 (24)
  RN 2: 31 (27.7), 11 (9.8), 13 (11.6), 4 (3.6), 21 (18.8), 16 (14)
  Personal care attendant: 4 (3.6), 1 (0.9), 1 (0.9), 3 (2.7), 2 (1.8), 2 (2)
  Other: 4 (3.6), 9 (8), 5 (4.5), 5 (4.5), 4 (3.6), 6 (5)
QI source (n = 111-112)
  Care plan: 4 (3.6), 1 (0.9), 24 (21.6), 0 (0), 5 (4.5), 7 (6)
  Assessment charts: 65 (58.0), 1 (0.9), 27 (24.3), 5 (4.5), 94 (83.9), 38 (34)
  Medical record/progress notes: 6 (5.4), 4 (3.6), 16 (14.4), 8 (7.1), 8 (7.1), 9 (8)
  Medication charts: 0 (0), 0 (0), 0 (0), 97 (86.6), 1 (0.9), 20 (18)
  Incident reports: 17 (15.2), 105 (93.8), 2 (1.8), 0 (0), 1 (0.9), 25 (22)
  Administration record: 0 (0), 1 (0.9), 7 (6.3), 2 (1.8), 2 (1.8), 2 (2)
  Other: 20 (17.9), 0 (0), 35 (31.5), 0 (0), 1 (0.9), 11 (10)
QI tool (n = 111-112)
  DHS QI manual: 39 (35.1), 14 (12.5), 46 (41.4), 40 (36), 35 (31.5), 35 (31)
  Incident reporting systems (e.g., proprietary incident reporting system): 23 (20.7), 84 (75), 3 (2.7), 4 (3.6), 1 (0.9), 24 (21)
  In-house system: 49 (44.1), 14 (12.5), 62 (55.9), 67 (60.4), 75 (67.6), 54 (48)

Note. QI = quality indicator; RN = registered nurse; DHS = Department of Human Services.

QI definitions. Most survey respondents reported using the same indicator definitions as for the previous two statewide audits (86/112; 77%). Of concern, 10/112 (9%) indicated that different definitions had been used, and 16/112 (14%) were unsure which definition had been used. Significant numbers of respondents reported difficulty "always," "most of the time," or "sometimes" in defining physical restraint (34%; 37/110), falls (17%; 19/110), and medications (17%; 15/108). Ambiguity in definitions was also reflected in data collection practices. For example, many forms of restraint defined in the Resource Manual were frequently excluded from the audit. There was also considerable variation between facilities in the systems used to grade pressure ulcers and the processes for weighing residents.

Data checking. Techniques applied "always" or "most of the time" to determine data reliability were as follows: range checks in 74/105 (71%) facilities, random checks in 51/106 (48%), and checking against previous data in 97/109 (89%).

Analysis and Reporting

A majority of the facilities (68/110; 61%) always analyzed the data to examine practice in their facility, and an identical proportion said that they always made the data available for analysis by interested staff members. Over two fifths (45/100; 41%) said that they always relied on the data analysis supplied by DHS.

Almost all respondents (93%-100%, depending on the QI domain) reported data "always" or "most of the time" to DHS, executive staff, and safety and quality management (Table 3). More than 80% reported to the NUM (97/105) and the risk management committee (77/89). Reports were most commonly distributed quarterly. Reports were infrequently provided to frontline staff (RN Div2s and personal care attendants), general practitioners (GPs), residents, and their families. Just over half of respondents (60/102; 53%) replied that residents would only receive a QI report on request. Very few of the survey respondents received responses to their QI reports from residents, families, or GPs. As expected, nursing home managers responded to QI reports most often (67/108; 62%). However, this rate is lower than expected, with another 15/108 (14%) facilities reporting that nursing home managers "rarely or never" responded to the QI report.

Table 3. Distribution and Level of Responses to Quality Indicator Reports. Values are n (%).

QI results (report) are distributed to the following, "always or most of the time," "sometimes," or "rarely or never":
  Department of Human Services (n = 99): 95 (96), 1 (1), 3 (3)
  Nursing home manager (n = 103): 103 (100), 0 (0), 0 (0)
  Nursing home or health service executive (n = 97): 90 (92.8), 5 (5.2), 2 (2.1)
  Nursing home or health service board (n = 90): 72 (80), 11 (12.2), 7 (7.8)
  Safety or quality improvement manager (n = 106): 104 (98.1), 0 (0), 2 (1.9)
  Health service quality/risk management committee (n = 96): 78 (81.3), 3 (3.1), 5 (5.2)
  Nursing home quality/risk management committee (n = 89): 77 (86.5), 4 (4.5), 8 (9)
  Registered nurse Division 1 (nurse unit manager) (n = 105): 97 (92.4), 4 (3.8), 4 (3.8)
  Registered nurse Division 2 (n = 113): 77 (68.1), 18 (15.9), 8 (7.1)
  Personal care attendant (n = 88): 47 (53.4), 11 (12.5), 30 (34.1)
  Residents and families (n = 98): 31 (31.6), 35 (35.7), 32 (32.7)
  General practitioner or pharmacist (n = 100): 37 (37), 31 (31), 32 (32)
  Other (n = 57): 19 (33.3), 7 (12.3), 40 (70.2)

Interval at which reports are distributed, "3 months," "6 or 12 months," or "only on request":
  Staff (n = 108): 82 (72.6), 16 (14.2), 10 (8.8)
  Management (n = 110): 102 (90.3), 5 (4.4), 3 (2.7)
  Residents (n = 102): 23 (20.4), 19 (16.8), 60 (53.1)
  Nursing home quality/risk management committee (n = 105): 93 (82.3), 6 (5.3), 6 (5.3)

Responses (either verbal or written) to the quality indicator report are received from the following, "always or most of the time," "sometimes," or "rarely or never":
  Residents and family (n = 109): 3 (2.8), 8 (7.3), 98 (89.9)
  Staff (n = 110): 23 (20.9), 40 (36.4), 47 (42.7)
  Facility managers (n = 108): 67 (62), 26 (24.1), 15 (13.8)
  General practitioners (n = 108): 11 (10.2), 23 (21.3), 74 (68.5)

Response and Actions Prompted by QIs

Across all indicators, audit results triggered individual resident reviews "always" or "most of the time" in 62% to 79% of nursing homes, staff practice reviews in 44% to 65%, and systems reviews in 45% to 55% (Table 4). Following review, beneficial changes in care for residents occurred "always" or "most of the time" in 58% to 75% of facilities.

The most useful QI was "unplanned weight loss," with 94/111 (85%) respondents rating it as useful "always" or "most of the time." The least useful was physical restraint (63/111; 57%). The "unplanned weight loss" QI was also the most likely to result in beneficial changes in residents' care (79/100; 79%).

Table 4. Self-Reported Changes in Clinical Care Triggered by the Five Quality Indicators. Values are n (%), in the order QI 1 pressure ulcers, QI 2 falls, QI 3 physical restraint, QI 4 medications, QI 5 weight loss.

Activity
  Review of resident (n = 110)
    "Always" and "most of the time": 75 (68.2), 77 (70), 68 (61.8), 72 (65.5), 87 (79.1)
    "Sometimes": 20 (18.2), 13 (11.8), 14 (12.7), 26 (23.6), 14 (12.7)
    "Rarely" and "never": 15 (13.6), 20 (18.2), 28 (25.5), 12 (10.9), 9 (8.2)
  Review of staff practice (n = 109)
    "Always" and "most of the time": 67 (61.5), 65 (59.6), 59 (54.1), 48 (44), 71 (65.1)
    "Sometimes": 25 (22.9), 31 (28.4), 26 (23.9), 31 (28.4), 27 (24.8)
    "Rarely" and "never": 17 (15.6), 13 (11.9), 24 (22), 29 (26.6), 11 (10.1)
  Review of systems within the facility (n = 106)
    "Always" and "most of the time": 52 (49.1), 53 (50), 54 (50.9), 48 (45.3), 59 (55.7)
    "Sometimes": 34 (32.1), 34 (32.1), 23 (21.7), 35 (33), 33 (31.1)
    "Rarely" and "never": 21 (19.8), 20 (18.9), 30 (28.3), 23 (21.7), 15 (14.2)
Change in care
  Reviews lead to beneficial changes in the care for residents (n = 106)
    "Always" and "most of the time": 72 (67.9), 75 (70.8), 61 (57.5), 63 (59.4), 79 (74.5)
    "Sometimes": 19 (17.9), 20 (18.9), 17 (16), 26 (24.5), 17 (16.0)
    "Rarely" and "never": 7 (6.6), 6 (5.7), 9 (8.5), 10 (9.4), 4 (3.8)
    Not applicable: 8 (7.5), 6 (5.7), 19 (17.9), 6 (5.7), 5 (4.7)

Discussion

This cross-sectional survey found that the majority of nursing homes in Victoria do respond to QI data about their performance. Their response was most often reviewing the individual resident, followed by changing staff practice. Reviewing systems of care was also reported by half of the respondents. This level of review was much greater than expected. Most survey participants (58%-75%, depending on the QI domain) indicated these reviews led to beneficial change in care for the resident. Survey participants considered the "unplanned weight loss" QI the most useful, and the "physical restraint" QI the least useful, of the five quality indicators for generating improvements in care. Our study did not explore the reasons for this difference.

This study also identified suboptimal adherence to principles of measurement with QI data in nursing homes. We noted considerable heterogeneity in the personnel, sources, and tools used to gather the QI data (Table 2). Inconsistent definitions of the QIs were also used, and some staff responsible for collecting data (14%) were not assessed on their familiarity with definitions. Data collection methods for pressure ulcers and physical restraint were particularly variable. Given that these events were among the first recognized internationally in aged care as being associated with suboptimal care, nursing homes may be using established systems for tracking these data that pre-date the implementation of the QI program.

Data checking using the standard techniques of range and random checks was lower than expected, with most nursing homes relying on comparison against previous data collection cycles. This approach is susceptible to systematic errors (Powell, Davies, & Thomson, 2003), is potentially misleading, and perpetuates the use of inaccurate and incorrect data.

Distribution of the nursing homes' QI performance reports was extremely limited: the QI data remained confined to senior managerial and executive staff, and less than a third of nursing homes provided these reports to residents and their families. Future research on the reporting of QIs should also consider the nature (summary or comprehensive), content (narrative and/or statistical), and format (soft or hard copy) of reports, as these may influence how the information is received and acted on (Ivers et al., 2012).

Implications

Our results are counterintuitive. First, accurate and reliable measurement of data is considered a prerequisite for QIs to be useful (Bradley et al., 2004). Second, public reporting of QIs is an advocated mechanism to prompt health professionals and organizations to improve care (Fung, Lim, Mattke, Damberg, & Shekelle, 2008; Hibbard, Stockard, & Tusler, 2003; Hutchinson et al., 2009). Neither reliable measurement nor public reporting was present, yet substantial change was reported. Our study supports the supposition that perfect data are not essential for changing practice.

Our findings reflect the experience of other programs. Challenges in data collection, interpretation, and response are common to all quality improvement initiatives, particularly in long-term aged care (Castle & Ferguson, 2010; Hanys Quality Institute, 2013; OECD/European Commission, 2013; Panzer et al., 2013). Evaluating the validity of indicators as markers for quality, benchmarking, or competition remains problematic (Lilford, Mohammed, Spiegelhalter, & Thomson, 2004; Mor, Berg, et al., 2003). However, previous studies have demonstrated that health care organizations act on quality of care data even when they have concerns about its validity (Hibbard et al., 2003). Our findings indicate that QI data have value from a "quality improvement" perspective, which judges the success of an indicator by whether it prompts reflection and action within an organization toward enhanced quality of care (Freeman, 2002). The importance of using quality improvement data for the development of a "quality culture" is increasingly emphasized (British Geriatrics Society, 2011; OECD/European Commission, 2013). Evaluation of the Minimum Data Set in the United States highlights the use of system-wide performance data to make provider-led improvements at the local level (Lin & Kramer, 2013; Mor, Berg, et al., 2003; Panzer et al., 2013).

Our study is the first to investigate the use of a standard set of QIs in nursing homes in Victoria, Australia. Survey respondents were senior staff in their organizations and knowledgeable about the circumstances of QI data gathering, reporting, and response processes. We also examined practice in both metropolitan nursing homes and the smaller organizations in regional and rural settings where resources are often limited.

Study Limitations

The response rate of 58% for this survey is consistent with contemporary survey research (Baruch & Holtom, 2008). While not optimal, it is better than expected, given the decline in response rates for surveys (Tourangeau, 2004). The logistical constraints of the project shortened the amount of time available for participants to respond. Ideally, full adherence to the modified Dillman (2007) protocol would have increased opportunities to participate. This reinforces the need for researchers to better balance research method with resource allocation.

Another limitation of the study is its reliance on self-reported change in practice, which generally overestimates the degree of positive change (Delgado-Rodríguez & Llorca, 2004). It remains difficult to objectively quantify the impact of quality improvement programs (Øvretveit & Gustafson, 2002), and the study was not designed to gather specific information substantiating the nature and extent of these changes to resident outcomes.

Our results should reliably reflect circumstances in public sector nursing homes in Victoria, as our study included 113 of the 196 public sector–funded services. However, it is arguable whether our results are applicable throughout Australia, as Victoria has more public sector–funded nursing homes than any of the other seven Australian states and territories. Nationally, only about one tenth (11%) of the 2,830 approved nursing homes are funded through the public sector (Access Economics, 2010).


Also, while the not-for-profit nongovernment and privately owned and operated nursing homes do not collect these QIs and were not included in the study, the licensing, funding, and accreditation arrangements are regulated nationally, and the nature of residents and care provided is similar across facility types.

This study provides the important insight that QIs in nursing homes are useful for improving resident care. We also identified gaps in the handling and reporting of QI data. Addressing these gaps would increase the accuracy of the data and improve both the comparative statistical analyses across the state and the overall utility of the QIs. Efforts should be directed to ensuring consistent application of indicator definitions, more robust checking of data reliability, and broader distribution of reports to residents and families. Greater distribution of, and fuller responses to, the QI data reports may prompt greater changes to existing practices and resident care. A better understanding of how QI data are acted on to improve quality of care is required to improve the efficacy of the QI process. In particular, we need more information about the use of these data to address the systems of care, which are essential to the success of quality improvement initiatives (Ferlie & Shortell, 2001).

Conclusion

This study found that public sector nursing homes used QI data to improve resident care. Improving the process of measuring quality indicators through consistent application of definitions and standardization of data collection, analysis, and reporting would strengthen the program's utility.

Authors' Note

The views and conclusions in this document are those of the authors and do not necessarily represent those of the Department of Human Services, La Trobe University, the Victorian Institute of Forensic Medicine, or Monash University.

Acknowledgment

The authors wish to thank the individuals who participated in the survey, as well as the members of the project team who contributed to the establishment and completion of the project.

Declaration of Conflicting Interests

The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This research project was funded by the State of Victoria through its Department of Human Services, now known as the Department of Health.

Note

1. Here, "nursing home" refers to services and accommodation that provide supervision or assistance to older persons with activities of daily living; such services are also described as residential aged care services (RACS), convalescent homes, skilled nursing facilities, care homes, or rest homes in other countries.

References

Access Economics. (2010). The future of aged care in Australia. Canberra: National Seniors Australia, Access Economics.

Australian Productivity Commission. (2011). Caring for older Australians (Productivity Commission Inquiry Report series, p. xcii). Canberra: Australian Productivity Commission.

Baruch, Y., & Holtom, B. C. (2008). Survey response rate levels and trends in organizational research. Human Relations, 61, 1139-1160.

Borowski, A., & MacDonald, P. (2007). The dimensions and implications of Australian population ageing. In A. Borowski, S. Encel, & E. Ozanne (Eds.), Longevity and social change in Australia (pp. 25-39). Sydney, Australia: University of New South Wales Press.

Boyd, C. M., Darer, J., Boult, C., Fried, L. P., Boult, L., & Wu, A. W. (2005). Clinical practice guidelines and quality of care for older patients with multiple comorbid diseases: Implications for pay for performance. Journal of the American Medical Association, 294, 716-724. doi:10.1001/jama.294.6.716

Bradley, E. H., Holmboe, E. S., Mattera, J. A., Roumanis, S. A., Radford, M. J., & Krumholz, H. M. (2004). Data feedback efforts in quality improvement: Lessons learned from US hospitals. Quality & Safety in Health Care, 13, 26-31. doi:10.1136/qshc.2002.4408

British Geriatrics Society. (2011). Quest for quality: British Geriatrics Society Joint Working Party Inquiry into the quality of healthcare support for older people in care homes—A call for leadership, partnership and quality improvement. London, England: Author.

Bruen, W. (2005). Aged care in Australia: Past, present and future. Australasian Journal on Ageing, 24, 130-133.

Castle, N. G., & Ferguson, J. C. (2010). What is nursing home quality and how is it measured? The Gerontologist, 50, 426-442. doi:10.1093/geront/gnq052

Courtney, M., O'Reilly, M. T., Edwards, H., & Hassall, S. (2009). The relationship between clinical outcomes and quality of life for residents of aged care facilities. Australian Journal of Advanced Nursing, 26(4), 49-57.

Courtney, M., O'Reilly, M. T., Edwards, H., & Hassall, S. (2010). Benchmarking clinical indicators of quality for Australian residential aged care facilities. Australian Health Review, 34, 93-100.

Delgado-Rodríguez, M., & Llorca, J. (2004). Bias. Journal of Epidemiology & Community Health, 58, 635-641.

Dillman, D. A. (2007). Mail and internet surveys: The tailored design method. Hoboken, NJ: Wiley.

Ferlie, E. B., & Shortell, S. M. (2001). Improving the quality of health care in the United Kingdom and the United States: A framework for change. Milbank Quarterly, 79, 281-315. doi:10.1111/1468-0009.00206

Freeman, T. (2002). Using performance indicators to improve health care quality in the public sector: A review of the literature. Health Services Management Research, 15, 126-137. doi:10.1258/0951484021912897

Fung, C. H., Lim, Y., Mattke, S., Damberg, C., & Shekelle, P. G. (2008). Systematic review: The evidence that publishing patient care performance data improves quality of care. Annals of Internal Medicine, 148, 111-123.

Giuffrida, A., Gravelle, H., & Roland, M. (1999). Measuring quality of care with routine data: Avoiding confusion between performance indicators and health outcomes. British Medical Journal, 319, Article 94. doi:10.2307/25185209

HANYS Quality Institute. (2013). HANYS report card on hospital report cards: Understanding publicly reported hospital quality measures. New York: Healthcare Association of New York State.

Hibbard, J. H., Stockard, J., & Tusler, M. (2003). Does publicizing hospital performance stimulate quality improvement efforts? Health Affairs, 22, 84-94. doi:10.1377/hlthaff.22.2.84

Hughes, R., Aspinal, F., Addington-Hall, J. M., Dunckley, M., Faull, C., & Higginson, I. (2004). It just didn't work: The realities of quality assessment in the English health care context. International Journal of Nursing Studies, 41, 705-712. doi:10.1016/j.ijnurstu.2004.02.005

Hutchinson, A., Young, T. A., Cooper, K. L., McIntosh, A., Karnon, J. D., Scobie, S., & Thomson, R. G. (2009). Trends in healthcare incident reporting and relationship to safety and quality data in acute hospitals: Results from the National Reporting and Learning System. Quality & Safety in Health Care, 18, 5-10. doi:10.1136/qshc.2007.022400

Ivers, N., Jamtvedt, G., Flottorp, S., Young, J. M., Odgaard-Jensen, J., French, S. D., . . . Oxman, A. D. (2012). Audit and feedback: Effects on professional practice and healthcare outcomes. Cochrane Database of Systematic Reviews, 6, CD000259.

Lilford, R., Mohammed, M. A., Spiegelhalter, D., & Thomson, R. (2004). Use and misuse of process and outcome data in managing performance of acute medical care: Avoiding institutional stigma. The Lancet, 363, 1147-1154. doi:10.1016/S0140-6736(04)15901-1

Lin, M. K., & Kramer, A. M. (2013). The Quality Indicator Survey: Background, implementation, and widespread change. Journal of Aging & Social Policy, 25, 10-29. doi:10.1080/08959420.2012.705721

Mainz, J. (2003). Defining and classifying clinical indicators for quality improvement. International Journal for Quality in Health Care, 15, 523-530. doi:10.1093/intqhc/mzg081

McNeill, J., Evans, S., Crammond, B., & Cameron, P. (2008). Guidelines for the establishment and management of clinical registries (version 2). Prahran, Australia: Monash University.

Mor, V., Angelelli, J., Jones, R., Roy, J., Moore, T., & Morris, J. (2003). Inter-rater reliability of nursing home quality indicators in the U.S. BMC Health Services Research, 3(1), Article 20.

Mor, V., Berg, K., Angelelli, J., Gifford, D., Morris, J., & Moore, T. (2003). The quality of quality measurement in U.S. nursing homes. The Gerontologist, 43(Suppl. 2), 37-46. doi:10.1093/geront/43.suppl_2.37

National Aged Care Alliance. (2013). Quality indicators reference group terms of reference. Retrieved from http://www.naca.asn.au/pdf/ToR/QualityI.pdf

O'Reilly, M., Courtney, M., & Edwards, H. (2007). How is quality being monitored in Australian residential aged care facilities? A narrative review. International Journal for Quality in Health Care, 19, 177-182.

Organisation for Economic Co-operation and Development/European Commission. (2013). A good life in old age? Monitoring and improving quality in long-term care. Paris: OECD Publishing.

Øvretveit, J., & Gustafson, D. (2002). Evaluation of quality improvement programmes. Quality & Safety in Health Care, 11, 270-275. doi:10.1136/qhc.11.3.270

Panzer, R. J., Gitomer, R. S., Greene, W. H., Webster, P., Landry, K. R., & Riccobono, C. A. (2013). Increasing demands for quality measurement. Journal of the American Medical Association, 310, 1971-1980. doi:10.1001/jama.2013.282047

Powell, A. E., Davies, H. T. O., & Thomson, R. G. (2003). Using routine comparative data to assess the quality of health care: Understanding and avoiding common pitfalls. Quality & Safety in Health Care, 12, 122-128.

Sheldon, T. A. (2005). The healthcare quality measurement industry: Time to slow the juggernaut? Quality & Safety in Health Care, 14, 3-4. doi:10.1136/qshc.2004.013185

Tourangeau, R. (2004). Survey research and societal change. Annual Review of Psychology, 55, 775-801. doi:10.1146/annurev.psych.55.090902.142040

Tsan, L., Davis, C., Langberg, R., & Pierce, J. R. (2007). Quality indicators in the Department of Veterans Affairs nursing home care units: A preliminary assessment. American Journal of Medical Quality, 22, 344-350.

Victorian Government Department of Human Services. (2007). Quality indicators in public sector aged care services—Resource manual 2007-2008, version 1. Melbourne, Victoria, Australia: Rural and Regional Health and Aged Care Services Division.

World Medical Association. (2008). World Medical Association Declaration of Helsinki: Ethical principles for medical research involving human subjects. Retrieved from http://www.wma.net/en/30publications/10policies/b3/17c.pdf
