

Improving Service Quality in Primary Care

American Journal of Medical Quality 2015, Vol. 30(1) 45-51
© 2014 by the American College of Medical Quality
Reprints and permissions: sagepub.com/journalsPermissions.nav
DOI: 10.1177/1062860613518093
ajmq.sagepub.com

Denise M. Kennedy, MBA1, Jon T. Nordrum, DSc1, Frederick D. Edwards, MD1, Richard J. Caselli, MD1, and Leonard L. Berry, PhD2

Abstract

A framework for improving health care service quality was implemented at a 12-provider family medicine practice in 2010. A national patient satisfaction research vendor conducted weekly telephone surveys of 840 patients served by that practice: 280 patients served in 2009, and 560 served during 2010 and 2011. After the framework was implemented, the proportion of "excellent" ratings of provider service (the highest rating on a 5-point scale) increased by 5% to 9%, most notably thoroughness (P = .04), listening (P = .04), and explaining (P = .04). Other improvements included prompt test result notification and telephone staff courtesy (each by 10%, P = .02), as well as teamwork (by 8%, P = .04). Overall quality increased by 10% (P = .01), moving the practice from the 68th to the 91st percentile of medical practices in the research vendor's database. Improvements in patient satisfaction suggest that this framework may be useful in value-based payment models.

Keywords
service quality improvement, continuous improvement model, value-based purchasing, patient satisfaction

The Centers for Medicare and Medicaid Services' inclusion of data on patient perceptions in its value-based purchasing formula has stimulated debate about using nonclinical measures to determine payments.1 Citing regional and organizational differences that might influence patient perceptions arbitrarily, many health care leaders have argued that value-based purchasing should focus on clinical process metrics, not patient experience measures.2 Existing value-based payment models have emphasized quality of care, clinical outcomes, patient safety, service quality, and costs.3 Value-based purchasing has begun to spread to outpatient clinics (eg, Clinician and Group Consumer Assessment of Healthcare Providers and Systems [CG-CAHPS]) and other care settings, and a patient satisfaction component is now being included in many commercial contracts4 and physician compensation packages.5,6 Performing successfully in this payment structure requires understanding how patients perceive value and how to deliver it to them. Most patients lack the expertise to judge health care quality and so typically focus on doctors' communication skills, the compassion of nurses, the courtesy of administrative staff, and the cleanliness of facilities.7

This article describes a comprehensive service quality improvement framework,8 implemented in 2010 at a Mayo Clinic Arizona (MCA) family medicine practice in Scottsdale. It reports preimplementation (2009) patient survey data from 280 randomly selected MCA patients and postimplementation patient survey data from 560 patients (280 each year in 2010 and 2011). The practice employs 12 providers and 40 allied health staff (nurses, medical assistants, and nonclinical staff) who serve approximately 8000 patients annually. The practice's payer mix is 40% government and 60% commercial; 4 large payers account for the majority of commercial revenue.

Methods

Weekly telephone surveys, statistical analysis, and reports were provided by Professional Research Consultants (PRC), a national vendor with more than 30 years of experience in measuring patient satisfaction. Weekly data files of MCA's patient population were securely

1Mayo Clinic Arizona, Scottsdale, AZ
2Texas A&M University, College Station, TX


Corresponding Author: Denise M. Kennedy, MBA, Mayo Clinic College of Medicine, Mayo Clinic Arizona, 13400 E. Shea Boulevard, Scottsdale, AZ 85259. Email: [email protected]

Downloaded from ajm.sagepub.com at NATIONAL TAIWAN UNIV LIB on January 15, 2015



transmitted to PRC. Randomly selected patients who chose to participate were stratified by department and surveyed once annually. The average annual response rate was 70%.

Measurement and Analysis

MCA's 50-question telephone survey, modeled on Medicare's adult CG-CAHPS survey,9 asked respondents to rate a variety of service quality attributes on a 5-point scale (excellent, very good, good, fair, and poor). These attributes included provider and allied health staff service behaviors (eg, listening, respect, courtesy, responsiveness), process efficiency, facility cleanliness, perception of overall quality, and likelihood to recommend the practice. At the end of the survey, patients were asked to describe services they perceived as being of outstanding quality, as well as any negative experiences. Telephone interviewers probed vague negative responses to identify opportunities to improve performance. MCA's internal management consultants, who directly observed the work of physicians and allied health staff, generated qualitative data on service performance, practice operations, and facility attributes.

Regression analyses of the 2009 survey data identified the strongest contributors to patients' perceptions of overall quality: efficiency, provider courtesy and caring, and access to medical appointments. Because the meaning of "efficiency" can differ from patient to patient, further regression analyses were performed to identify the service attributes associated with efficiency: teamwork among doctors, nurses, and staff; courtesy and friendliness of nurses; courtesy and helpfulness of telephone staff; and promptness in informing patients of test results.

Improvements in patients' perception of service quality were analyzed for statistical significance. Z tests were performed at the .05 significance level. Data were analyzed with Statistical Package for the Social Sciences version 19 (IBM SPSS, Chicago, IL).
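The z tests used here are standard two-proportion tests. The sketch below, using only the Python standard library, applies one to the overall-quality result reported later in the article (69% "excellent" of 278 respondents in 2009 vs 79% of 279 in 2011). The raw counts are back-calculated from the published percentages, so the result is an approximation of the authors' analysis, not a reproduction of it.

```python
from statistics import NormalDist

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z test for the difference between two proportions (pooled SE)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                  # pooled proportion
    se = (pooled * (1 - pooled) * (1 / n1 + 1 / n2)) ** 0.5
    z = (p2 - p1) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))    # two-sided p value
    return z, p_value

# Overall quality of care: ~69% of n = 278 (2009) vs ~79% of n = 279 (2011).
# Counts are reconstructed from the published percentages (an approximation).
z, p = two_proportion_z_test(round(0.69 * 278), 278, round(0.79 * 279), 279)
print(f"z = {z:.2f}, p = {p:.3f}")  # p comes out near the reported .01
```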

The Service-Quality Improvement Framework

The framework implemented at MCA included 7 basic principles, described below.

Use Multiple Data Sources. First, MCA identified broad metrics of service quality, primarily patient satisfaction and complaint data, telephone operations data, and each patient's self-identified reason for scheduling an appointment. Patient satisfaction data were monitored frequently using PRC's online reporting tool. Data were refreshed as telephone surveys were completed, allowing practice leaders (chair, operations administrator, nurse manager, and call center supervisor) to readily monitor perceptions of service quality relative to the goal (the 90th percentile of practices in PRC's database). Custom satisfaction reports were developed for specific groups (providers, nurses, and call center/reception desk staff) and e-mailed automatically to the managers of these areas. A dashboard with key call center goals—less than 20 seconds to answer calls and a call abandonment rate of less than 5%—was reviewed monthly with staff. Stoplight color coding identified improvement opportunities: at or above goal (green), intermediate negative variance from goal (yellow), and greatest negative variance from goal (red). On the Reason for Appointment form,10 patients were asked why they had scheduled a visit (eg, to refill a prescription, to ask a provider to complete an employment or sports physical health document). The provider addressed all patient-noted priorities during the visit.

Create a Culture of Accountability. Accountability for service quality was created at several points in MCA's organizational structure. The chief executive officer reviewed service quality metrics and plans for improvement (where indicated) in monthly meetings with department chairs. MCA's service committee also monitored the results and reported them quarterly to the governing board and to the clinical quality oversight group (Clinical Practice Committee). The Clinical Practice Committee requested action plans and 90-day progress reports from chairs and administrators of departments that were performing below goal. This closed-loop reporting process encouraged department leaders to act promptly on service deficiencies.

Provide Service Consultation and Improvement Tools. The practice consulted with MCA's service administrator for several months to perform objective audits of service quality and analysis of deficiencies. A high-level process map showing prestudy (2009) satisfaction ratings at key touchpoints (eg, appointment office, reception desk, waiting lobby, exam room) rendered opportunities for improvement visually (Figure 1). A final report of the service audit findings,11 improvement opportunities, recommendations, and tools and resources (Figure 2) helped practice leaders develop their improvement plan. Like most departments, this practice used service consultation more intensely early in the process, when the improvement plan was being developed and service standards were being implemented.

Communicate the Service Standards. MCA's service values and behaviors ("SERVE") were developed to support improvement of the patient experience. These 5 service standards, applicable to any role, were incorporated into the practice's service education and training:




Figure 1.  A high-level process map, showing prestudy (2009) satisfaction ratings at key touchpoints, visually rendered key opportunities for improvement of service quality. Abbreviation: Exc, Percentage of patients in 2009 rating quality as “excellent.”

• Solutions focused (Solve problems when and where they occur)
• Empathetic (Treat everyone as you wish you or your family to be treated)
• Reliable (Own the work; if you don't have the answer, find it)
• Valuing others (Protect patient and employee confidentiality)
• Exceed patient expectations (Contribute to an unparalleled experience)

Provide Regular Education and Training. On the basis of the patient satisfaction ratings, providers and allied health staff were educated and trained separately to address each group's role in the patient experience. Education focused on service delivery challenges, such as variations in performance among employees, touchpoints, and face-to-face interactions12; patient satisfaction surveys and the benefits of achieving the highest survey ratings13; common service complaints14; restoring a patient's confidence when a service problem occurs15; and the benefits of excellent service performance (market differentiation, loyalty, likelihood to recommend, competitive advantage, and downstream revenue). Service standards, performance monitoring, and appraisal processes were reviewed. An organization-level gap analysis identified service gaps16 and opportunities for improvement. Practice leaders themselves participated in the educational sessions, which were conducted quarterly. Evaluation of the education and training sessions indicated that participants understood what they were taught. In their own educational sessions, providers learned about sampling methodology and margin of error, so that they could better appreciate the patient perception data. To support ongoing professional practice evaluation,17 all providers attended a day-long, doctor-facilitated communication workshop. In addition, 1 provider was assigned a shadow coach, who observed personal interactions with patients and helped the physician with communication skills.

Monitor Department Progress Continually. Practice leaders developed their own closed loop for continuously reviewing, reporting, and acting on service quality data.




• Telephone service dashboard: Time-to-answer and call abandonment rates relative to benchmarks are evaluated regularly in staff meetings and displayed so that staff can monitor their own improvement.
• Satisfaction survey vendor's online reporting tool: Custom reports are e-mailed to providers and practice managers to increase awareness of patient satisfaction and promote continuous improvement.
• Provider communication skills building: Resources include a day-long, interactive group workshop, mentoring, or more intensive one-on-one coaching services. Providers may self-refer or be referred by their department chairs.
• Service consultation: The service administrator, an internal consultant to management, provides objective analysis and expertise, and serves solely as an improvement resource. Accountability and oversight are achieved through other means.
• Service education and training: These efforts give staff a better understanding of the basic principles and practices of service quality and of how employee behaviors influence customers' perceptions of quality. Content and interactive exercises are customized by department.
• Service auditing: Auditing includes objective analysis of a department's service environment, patient flow, and staff performance. A final report of observations and recommendations is provided.
• Process map of satisfaction at key experience touchpoints: By plotting data on patient satisfaction, this high-level process map visually renders opportunities to improve service quality.
• Performance monitoring checklists: The checklists standardize expectations and performance, and they encourage process control to sustain improvements.
• Action plan template: This template identifies issues, potential solutions, accountable people, and completion dates.
• Closed-loop service quality oversight structure: This systematic process for evaluating, disseminating, and acting on service quality data enhances accountability for improvements in performance.

Figure 2.  Ten representative service improvement resources and tools.
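The stoplight coding used on the telephone service dashboard (less than 20 seconds to answer, a call abandonment rate below 5%) amounts to a simple threshold rule. The sketch below is illustrative only: the article does not specify the cutoff for the intermediate (yellow) band, so the 10% band here is a hypothetical choice.

```python
def stoplight(value: float, goal: float, higher_is_better: bool = True,
              yellow_band: float = 0.10) -> str:
    """Classify a metric against its goal with stoplight color coding:
    green = at or above goal, yellow = intermediate negative variance,
    red = greatest negative variance. The 10% yellow band is a
    hypothetical cutoff; the article does not define one."""
    if not higher_is_better:            # e.g., time to answer, abandonment rate
        value, goal = -value, -goal
    if value >= goal:
        return "green"
    if value >= goal - yellow_band * abs(goal):
        return "yellow"
    return "red"

# Call center goals from the article: <20 seconds to answer, <5% abandonment.
print(stoplight(18, 20, higher_is_better=False))       # met the goal: green
print(stoplight(0.052, 0.05, higher_is_better=False))  # just over goal: yellow
print(stoplight(0.07, 0.05, higher_is_better=False))   # well over goal: red
```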

Providers, nurses, and call center/reception desk staff reviewed data in their respective meetings. Improvements were made, data were evaluated, and trends were posted in work areas, so that staff could monitor their own progress. To sustain progress, staff reviewed data frequently, provided suggestions for improvement, and helped design and monitor tests of change. Nurses were asked to describe the service behaviors that patients are most likely to expect from them. With their input, nursing service standards and a performance monitoring form were developed. The nurse manager periodically conducted unannounced direct observation of service performance against the standards.

Recognize and Reward Achievement. In 2011, MCA's family medicine practice surpassed the 90th percentile goal for patients' perception of overall quality and received its first 5-star award at MCA's annual Service Excellence Day celebration. The chair invited a few frontline staff members to accept this award on behalf of the department. Other forms of recognition included "thank-you grams," in which staff members expressed gratitude to one another, and movie passes for employees who had achieved outstanding service performance. Positive patient comments were shared via group e-mails, and employees identified by patients in a "Mention My Name" campaign were recognized for their excellent service.

Results

The focus of this initiative was to improve service quality and patients' perception of excellence. The majority of patients sampled in 2009 (n = 191, 69%) already rated their overall experience as "excellent" before the improvement framework was implemented. Therefore, improving behaviors and processes substantially enough to "move the metrics" required a comprehensive approach.

Table 1 compares the percentages of patients who gave an "excellent" rating to providers and to allied health staff during the prestudy year (2009) and the last study year (2011). Statistically significant improvements are noted, and representative prestudy and poststudy comments from patients are included.

From 2009 to 2011, patient ratings of overall quality of care as "excellent" increased by 10% (P = .01), moving the practice from the 68th to the 91st percentile of medical practices in PRC's database (Figure 3). During that same period, ratings of specific provider services as "excellent" increased by 5% to 9% across 8 attributes, with statistically significant improvements in



Table 1.  Percentages of Patients Who Gave an "Excellent" Rating to Providers and to Allied Health Staff During the Prestudy Year (2009) and the Last Study Year (2011).

                                                        2009           2011
Survey Question                                        n   % Exc     n   % Exc   P Value
Provider
  Thoroughness of medical exam                        273    62     274    70     .04
  Spending enough time(a)                             278    69     277    74     NS
  Listening to patient concerns(a)                    280    67     274    75     .04
  Using understandable words and terms(a)             280    70     279    77     NS
  Explaining medical condition(a)                     273    65     271    73     .04
  Involving patient in decisions about care(a)        273    65     273    72     NS
  Showing courtesy and caring                         280    74     279    80     NS
  Being on time for the appointment                   279    54     277    59     NS
  Overall quality of care from provider               280    71     279    78     NS
Allied health staff
  Teamwork between doctors, nurses, and staff         267    56     270    64     .04
  Promptly informing patient of test results(a)       244    52     258    62     .02
  Courtesy/friendliness of nurses and medical
    assistants                                        277    66     275    70     NS
  Courtesy/helpfulness of telephone staff(a)          269    51     271    61     .02
  Access to appointments when needed(a)               276    48     277    53     NS
  Ease of access by phone(a)                          239    38     242    45     NS
  Perception of efficiency                            279    60     279    66     NS
  Overall quality of care(a)                          278    69     279    79     .01

Representative patient comments:
  Provider, 2009: "I liked the interaction with the physician."
  Provider, 2011: "He seems like he has nothing else to do besides talk to me and answer questions."
  Allied health staff, 2009: "I would have liked better customer service from the staff."
  Allied health staff, 2011: "I felt the whole thing was an excellent experience and very well organized."

Abbreviation: NS, nonsignificant.
(a) Service attributes likely to be measured with CG-CAHPS (Clinician and Group Consumer Assessment of Healthcare Providers and Systems).


thoroughness of exams (P = .04), listening to patient concerns (P = .04), and explaining medical conditions (P = .04). Ratings of the courtesy and friendliness of nursing staff increased by 4% (not significant). Prompt notification about test results, typically a call-back function of nursing, increased by 10% (P = .02). The coordinated activity to increase use of the patient portal is likely to have contributed to an increase in the perception of teamwork by 8% (P = .04). After structural changes were made to frontline operations, in the wake of implementing a new electronic medical record in 2010, ratings of the courtesy and helpfulness of the telephone staff as "excellent" increased by 10% (P = .02), ease of telephone access by 7% (not significant), and efficiency by 6% (not significant). Across all service attributes, the percentage of excellent ratings (top box on the rating scale) increased, and the percentage of poor (lowest) ratings either remained constant or decreased.

Figure 3.  From 2009 to 2011, patient ratings of overall quality of care as "excellent" improved significantly—an absolute increase of 10% (P = .01). Each data point is an annual year-end percentage excellent rating. The arrow indicates implementation of the service quality improvement model in 2010 (n = 280 patients per year).

Discussion

This study describes a multifaceted service quality improvement model, implemented in MCA’s family medicine practice, and its impact on patient satisfaction. Results are consistent with previously published data on the model after it was implemented in other MCA units.8 For the first time, allied health staff service performance was measured after implementation of the model. Overall, this study offers several lessons for providers and administrators. First, the model should be implemented in its entirety because each component plays its own mutually reinforcing role in improving service quality. Successful implementation requires a culture of accountability, the department chair’s visible championship, consultation with an internal service administrator, and standard processes to sustain departmental improvements.




Second, the survey questions should be shared with providers to increase their awareness of the service attributes being evaluated by patients. Satisfaction ratings also should be shared to promote self-improvement. For example, perception data of overall provider quality and relative rankings were shown in a blinded bar graph during training, quarterly medical staff meetings, and performance reviews. Providers who requested more frequent feedback were e-mailed their satisfaction data and positive patient comments monthly. Negative survey comments and patient complaints were discussed with providers individually. In the case of the physician who received shadow coaching, feedback from the coach increased the physician’s awareness of dissatisfying behaviors, such as poor eye contact, lack of therapeutic touch, and use of medical jargon. Another physician, when given downward-trending data on respecting the patient’s personal values and involving the patient in care-related decisions, gained insight into patient requests to change doctors. These anecdotal examples suggest a benefit in providing patient perception data to all physicians, not just those who request them. Third, administrators should distinguish between operational issues that are not entirely within the provider’s control and behavioral issues that are entirely within the provider’s control. For example, learning to use a new electronic medical record in 2010 made some providers less efficient and late for patient appointments. Making this distinction builds trust between providers and administrators, increases awareness of service behavior deficiencies, holds all providers to the same service standards, and preserves the quality of provider–patient face time. Fourth, information technology affords opportunities to improve service quality and patient satisfaction. MCA’s use of its survey vendor’s online reporting tool enabled patient-satisfaction data to be monitored frequently. 
Basic statistical tools that complement direct observation of staff can help managers identify which service gaps require the most attention. Real-time tracking of actual performance relative to goals, rendering data graphically, and automatically e-mailing findings at regular intervals all help foster improvements. Patient portal technology can help improve satisfaction by making test results more readily available. Although notification about test results correlates with patient satisfaction,18 failure to notify outpatients of results remains all too common.19 Providers can encourage their patients to access routine test results via the portal. Managers should identify which patients are most likely to use the portal, enable patients to retrieve test results at their convenience, provide take-home guidelines for creating and accessing patient accounts, and assist interested patients on-site.

The service improvement lessons from MCA can be adapted for medical practices of any size. Smaller practices without a service administrator can enlist enthusiastic, service-minded staff to spearhead improvement efforts. Patient perceptions can be measured with short, point-of-service surveys. Interns recruited from local universities can be a cost-effective resource for survey administration, data analysis, and timely improvement planning. Finally, patients evaluate health care service quality relative to their other customer service experiences, and numerous publications offer practical advice on how to improve service.20-22 Although perceptions of service quality improved during the study, appointment access and nursing service still require attention. Given the educational role of the nurse in reform-related health care delivery models, development of communication skills will be especially important. Patients also must be prepared for increasing involvement of nurse practitioners and physician assistants and must be assured of receiving high-quality care from all providers. Physicians can ease patients into these new care models by introducing them to other members of the care team. Midlevel practitioners can reassure patients by emphasizing their relationship with the patient’s physician.

Limitations

This study was conducted in 1 family medicine practice with 12 providers. Stratified random sampling of the patient population at the department level yielded relatively small sample sizes at the individual provider level. Analysis of aggregated provider-level data suggests that the results may be more generalizable and more useful in larger samples. Also, because randomly selected MCA patients are first notified by letter and may opt out of survey participation, there is a delay between when patients receive their care and when the satisfaction data are collected. Surveying patients at the point of service or soon thereafter may improve data quality.
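The sample-size limitation above can be quantified with a simple margin-of-error calculation. The sketch below (standard library only) shows how the 95% confidence interval for a satisfaction proportion widens as the sample shrinks from the department level to the per-provider level; the per-provider n is an illustrative estimate (280 surveys spread across 12 providers), not a figure from the article.

```python
from statistics import NormalDist

def margin_of_error(p: float, n: int, confidence: float = 0.95) -> float:
    """Half-width of the normal-approximation confidence interval
    for a proportion p observed in a sample of size n."""
    z = NormalDist().inv_cdf(0.5 + confidence / 2)   # 1.96 for 95%
    return z * (p * (1 - p) / n) ** 0.5

# A 70% "excellent" rating at the department level (n = 280) vs roughly
# 23 surveys per provider (280 / 12, an illustrative assumption).
print(f"n = 280: +/- {margin_of_error(0.70, 280):.1%}")
print(f"n = 23:  +/- {margin_of_error(0.70, 23):.1%}")
```

The interval roughly triples at the provider level, which is why the authors report aggregated, department-level results.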

Conclusion

The Affordable Care Act has changed the dynamics of the health care marketplace. More people are expected to access the system. Some will make the transition from relatively low-cost, employer-sponsored insurance to buying their own insurance. Patients may have higher expectations, shop for services more judiciously, and more quickly leave providers who do not satisfy their needs. Increased consumerism in health care requires a culture of accountability and a framework for collecting and acting on patient perception data.



MCA's comprehensive data- and accountability-driven improvement model is a long-term approach to creating value by improving patients' service experiences. Value-based purchasing, medical homes, and accountable care organizations are designed to enhance not only the technical quality of health care but also patients' experiences. Improving service is the right thing to do for the patient and, in a value-based payment model, helps sustain an organization for the future.

Declaration of Conflicting Interests

The authors declared no conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The authors received no financial support for the research, authorship, and/or publication of this article.

References

1. McKinney M. Sneak preview: hospitals grateful for advanced look at how value-based purchasing will work but grumble about its complexity. Mod Healthc. 2012;42:67.
2. McKinney M, Robeznieks A, Daly R. Unsatisfactory: HHS new payment system for hospitals rewards quality, but not everyone likes the final rules. Mod Healthc. 2011;41:6-7.
3. Smoldt RK, Cortese DA. Pay for performance or pay for value? Mayo Clin Proc. 2007;82:210-213.
4. United Healthcare Online. Primary care and ER care management. https://www.unitedhealthcareonline.com/b2c/CmaAction.do?channelId=c208e744d43b4110VgnVCM1000007740dc0a. Accessed December 3, 2013.
5. American Medical Association News. Patient satisfaction: when a doctor's judgment risks a poor rating. http://www.amednews.com/article/20121126/profession/311269934/4/. Accessed December 3, 2013.
6. McKinney M. Following in CMS footsteps: NY public system tries its own pay-for-performance. Mod Healthc. 2013;43:10-11.
7. Berry LL, Bendapudi NM. Clueing in customers. Harv Bus Rev. 2003;81:100-106, 126.
8. Kennedy DM, Caselli RJ, Berry LL. A roadmap for improving healthcare service quality. J Healthc Manag. 2011;56:385-402.
9. CAHPS. CAHPS clinician and group surveys. https://cahps.ahrq.gov/surveys-guidance/docs/1351a_Adult12mo_Eng_20.pdf. Accessed December 3, 2013.
10. Beeson SC. Practicing Excellence: A Physician's Manual to Exceptional Healthcare. Gulf Breeze, FL: Fire Starter Publishing; 2006.
11. Kennedy DM, Caselli RJ, Berry LL. Healthy returns: an inside look at Mayo Clinic Arizona's 7-step approach to improve service and delight customers. Qual Prog. 2012;October:32-39.
12. Normann RA. Service Management: Strategy and Leadership in Service Businesses. Chichester, UK: John Wiley; 2001.
13. Otani K. Patient satisfaction: focusing on excellent. J Healthc Manag. 2009;54:93-103.
14. Berry LL. Discovering the Soul of Service: The Nine Drivers of Sustainable Business Success. New York, NY: Free Press; 1999.
15. Berry LL, Leighton JL. Restoring customer confidence. Mark Health Serv. 2004;24:15-19.
16. Parasuraman A, Zeithaml VA, Berry LL. A conceptual model of service quality and its implications for future research. J Mark. 1985;49:41-50.
17. The Joint Commission. Standards BoosterPak for focused professional practice evaluation/ongoing professional practice evaluation (FPPE/OPPE). http://www.mc.vanderbilt.edu/documents/CAPNAH/files/Forms/Competency%20Evaluation%20Forms/TJC%20Booster%20Pack%20FPPE-OPPE.pdf. Accessed March 25, 2013.
18. Meza JP, Webster DS. Patient preferences for laboratory test results notification. Am J Manag Care. 2000;6:1297-1300.
19. Casalino LP, Dunham D, Chin MH, et al. Frequency of failure to inform patients of clinically significant outpatient test results. Arch Intern Med. 2009;169:1123-1129.
20. Bendapudi NM, Berry LL, Frey KA, Parish JT, Rayburn WL. Patient perspectives on ideal physician behaviors. Mayo Clin Proc. 2006;81:338-344.
21. Berry LL. On Great Service: A Framework for Action. New York, NY: Free Press; 1995.
22. Zimmerman A. The Service Payoff: How Customer Service Champions Outserve and Outlast the Competition. London, UK: Peak Performance Publishers; 2011.

