

The role of medical group practice administrators in the adoption and implementation of Medicare's physician quality reporting system

Robert Coulam, John Kralewski, Bryan Dowd, David Gans

Background: Although there are numerous studies of the factors influencing the adoption of quality assurance (QA) programs by medical group practices, few have focused on the role of group practice administrators.

Purpose: To gain insights into the role these administrators play in QA programs, we analyzed how medical practices adopted and implemented the Medicare Physician Quality Reporting System (PQRS), the largest physician quality reporting system in the United States.

Methodology: We conducted focus group interviews in 2011 with a national convenience sample of 76 medical group practice administrators. Responses were organized and analyzed using the innovation decision framework of Van de Ven and colleagues.

Findings: Administrators conducted due diligence on PQRS, influenced how the issue was presented to physicians for adoption, and managed implementation thereafter. Administrators' recommendations were heavily influenced by practice characteristics, financial incentives, and practice commitments to early adoption of quality improvement innovations. Virtually all who attempted it agreed that PQRS was straightforward to implement. However, the complexities of Medicare's PQRS reports impeded use of the data by administrators to support quality management.

Discussion: Group practice administrators are playing a prominent role in activities related to the quality of patient care; they are not limited to the business side of the practice. Especially as PQRS becomes more nearly universal after 2014, Medicare should take account of the role that administrators play by more actively engaging them in shaping these programs and making it easier for them to use the results.

Practice Implications: More research is needed on the rapidly evolving role of nonphysician administration in medical group practices. Practice administrators have a larger role than commonly understood in how quality reporting initiatives are adopted and used and are in an exceptional position to influence the more appropriate use of these resources if supported by more useful forms of quality reporting.

Key words: group practice administrators, Medicare, PQRS, quality improvement

Robert Coulam, PhD, JD, is Professor of Practice in Health Policy and Management, Simmons College, Boston, Massachusetts. E-mail: [email protected]. John Kralewski, PhD, is Senior Research Fellow, Medica Research Institute, Minnetonka, Minnesota, and Professor Emeritus, University of Minnesota, Minneapolis, Minnesota. Bryan Dowd, PhD, is Professor, University of Minnesota, Minneapolis, Minnesota. David Gans, MS, ACMPE, is Senior Fellow, Industry Affairs, Medical Group Management Association, Englewood, Colorado. The authors have disclosed that they have no significant relationship with, or financial interest in, any commercial companies pertaining to this article.

DOI: 10.1097/HMR.0000000000000061
Health Care Manage Rev, 2016, 41(2), 145-154
Copyright © 2016 Wolters Kluwer Health, Inc. All rights reserved.

Although there are numerous studies of the factors influencing the adoption of quality assurance (QA) programs by medical group practices, few have focused on the role of the group practice administrator. This is true notwithstanding over a decade of more active research focused on the quality of health care and the shortcomings of the health care system in the wake of the Institute of Medicine's (IOM) sentinel publication, "Crossing the Quality Chasm" (IOM, 2001). Studies have focused on a broad range of issues, including the development of quality measures (Dowd et al., 2013; IOM, 2001; Scholle et al., 2009), the influence of medical practice organizational structures on quality performance (Friedberg et al., 2009; Kralewski, Dowd, Heaton, & Kaissi, 2005; Kralewski, Dowd, Knutson, Savage, & Tong, 2013; Shortell et al., 2005; Solberg et al., 2009), and factors influencing the adoption of QA programs by hospitals and physicians in medical group practices (Kaissi, Kralewski, Curoe, Dowd, & Silversmith, 2004; Kocher & Sahni, 2010; Solberg et al., 2013). However, few studies have focused on the role of health care administrators in the adoption and implementation of QA programs, and to our knowledge, none has focused on the role of medical group practice administrators. Because nearly half of all U.S. physicians now practice in medical groups (Welch, Cuellar, Bindman, & Stearns, 2013), the lack of better information about the role of these administrators in QA activities is a serious void in the body of knowledge important to the national quality improvement effort, especially for smaller medical groups, where administrators are more closely involved in day-to-day decisions related to patient care (Kralewski, Pitt, & Shatin, 1985; Stearns, 1999).

Practice managers sit at the intersection of external reporting requirements, payer "value" concerns, and the concerns of clinicians viewing the practice from the perspective of the private practice of medicine. How administrators balance these perspectives and effectively manage QA initiatives is the focus of this article. In this analysis, we use the implementation of Medicare's Physician Quality Reporting System (PQRS) to gain insights into the role of practice managers. We review the methods used by medical group practice administrators as they worked with their physicians to determine whether to join the program and how they then implemented the system if they chose to join it.

Background: The PQRS System

The PQRS is the largest and most comprehensive outpatient quality reporting system in the United States. PQRS creates a system for "eligible professionals" (primarily, but not exclusively, physicians) to report patient-level data on health care quality beyond the measures required to bill Medicare for services. (For simplicity, we term reporting professionals "physicians" throughout this article.) There are financial incentives for physicians to accept the PQRS reporting requirements. For the period 2009–2010, physicians received bonuses as high as 2.0% of their Part B allowed charges during the reporting period. The Patient Protection and Affordable Care Act of 2010 (P.L. 111-148) reduces incentive bonuses from 2011 through 2014 and then substitutes penalties for bonuses in 2015. (In addition, starting in 2015, quality measures for the computation of the "value-based modifier" for Medicare Part B payments will be based on PQRS reports.) PQRS represents the first time Medicare has created a system to generate practice-specific quality reports from outpatient treatment of Medicare patients. The PQRS measures themselves (287 measures in 2014) are generally developed and approved in the first instance by organizations such as the National Committee for Quality Assurance and medical specialty societies. Centers for Medicare & Medicaid Services (CMS) rarely develops measures itself, although it does go through a formal and relatively public process to select them. Some of the measures are purely administrative (e.g., use of an electronic health record [EHR] or e-prescribing), but the quality of care measures address areas such as preventive care, chronic and acute care management, procedure-related care, and care coordination. Some measures represent undesirable outcomes (e.g., poor hemoglobin control in patients with diabetes), whereas others represent desirable outcomes (e.g., percentage of patients with coronary artery disease who were prescribed aspirin or clopidogrel). Originally, individual physicians were required to report at least three quality measures for at least 80% of the beneficiaries who were eligible for each measure. CMS later relaxed that requirement. By 2011, the year of our study, CMS had increased the range of acceptable reporting options to make it easier for different types of practices with different capabilities to participate.
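To make the incentive arithmetic above concrete, the short sketch below is an illustration only, not part of the original study: the practice's charge figure is hypothetical, and the penalty rates are those described later in this article for 2015 and 2016. It simply expresses what participation is worth as a share of a practice's Medicare Part B allowed charges.

```python
def pqrs_payment_adjustment(part_b_allowed_charges, rate):
    """Return the dollar bonus (positive rate) or penalty (negative rate)
    applied to Medicare Part B allowed charges for a reporting period."""
    return part_b_allowed_charges * rate


# Hypothetical practice with $400,000 in annual Part B allowed charges.
charges = 400_000.0

# Early-period bonus of up to 2.0% (2009-2010 reporting periods).
bonus_2010 = pqrs_payment_adjustment(charges, 0.02)       # +$8,000

# Penalties replace bonuses: 1.5% in 2015, 2.0% in 2016 and later.
penalty_2015 = pqrs_payment_adjustment(charges, -0.015)   # -$6,000
penalty_2016 = pqrs_payment_adjustment(charges, -0.02)    # -$8,000

print(f"2010 bonus for reporting:        ${bonus_2010:,.0f}")
print(f"2015 penalty for not reporting:  ${penalty_2015:,.0f}")
print(f"2016+ penalty for not reporting: ${penalty_2016:,.0f}")
```

As the focus group findings below suggest, whether amounts of this size justify the reporting effort depends largely on the practice's Medicare share and on how easily its billing or EHR systems can generate the required measures.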

Conceptual Framework

Our analysis is based on a conceptual model of the adoption of innovations that builds on the work of Van de Ven, Polley, Garud, and Venkataraman (1999) and our own work on the adoption and use of EHRs and e-prescribing technologies by medical group practices (Gans, Kralewski, Hammons, & Dowd, 2005; Kralewski et al., 2007). Van de Ven et al. identified several phases that are common to the innovation process in industrial organizations. Each phase presents unique challenges that must be resolved to adopt and implement a new product or procedure. These challenges vary by type of organization and type of innovation. However, in the Van de Ven scheme, there are four overarching phases that are common to most innovation efforts: (a) information gathering and becoming knowledgeable about the innovation, the "due diligence" phase; (b) decision-making regarding whether to adopt; (c) the implementation process following a decision to adopt; and (d) the lessons-learned phase that is part of the learning organization paradigm.

In our study, a decision to adopt PQRS promised practices some additional revenue and the opportunity to develop new quality reporting capacities but, in principle, threatened to burden physicians with clerical and other new administrative tasks. We found that administrators typically took the lead in due diligence efforts and adoption decisions to avoid placing that burden on physicians. The use of the information that comes out of PQRS for active QI purposes can be viewed as a clinical/technological issue, likely more influenced in the end by physicians and clinicians than by administrators, but still potentially influenced by some administrators seeking ways to develop and use quality measures to support internal QI efforts by clinicians. As suggested by Kimberly and Evanisko (1981), administrative innovations require a different strategy than technological innovations. The PQRS program involves both an administrative dimension and the technological development of clinical data. But in the early years, the administrative dimensions were predominant. That could change, as we discuss below. How administrators accomplished the due diligence, adoption, implementation, and follow-up tasks of PQRS provides insights into how they balance the commitments of physician-driven organizations with the need to introduce innovations that may assure continuing success in the future. The PQRS program presented both of these challenges.

Data and Methods

Data for our study were obtained from focus group interviews with a national convenience sample of medical group practice administrators who preregistered for two national meetings sponsored by the Medical Group Management Association (MGMA) during 2011. MGMA randomly selected 90 administrators and then recruited them by e-mail or phone to take part in the sessions. If a candidate declined to join the sessions, he or she was randomly replaced from the preregistrant pool to maintain the initial sample size of 90. Fourteen of the 90 practice members who agreed to participate dropped out for various reasons (they did not attend the meeting, schedule conflicts developed at the meeting, etc.). Data on the remaining 76 practice administrators are described in the Findings section below.

Each of the eight roundtable focus group interviews was led by a senior health services researcher with substantial research experience, joined by a member of MGMA's research staff. The discussion groups themselves ranged in size from 8 to 11 practice managers and were scheduled to last approximately one hour, but they usually lasted about a half hour longer because the administrators wanted to expand or clarify some of the responses. The interviews focused on the innovation phases identified by Van de Ven et al. (1999) and were guided by how the challenges of due diligence, adoption, implementation, and follow-up were resolved. The discussion guide for the focus groups contained general questions to prompt discussion, with opportunities for follow-up and probes by the researchers (copies of the participant and nonparticipant protocols are available online at http://sph.umn.edu/site/docs/hpm/degrees/hsrpa%20phd/PQRS%20Interview%20Guide%20-%20Participant.pdf and http://sph.umn.edu/site/docs/hpm/degrees/hsrpa%20phd/PQRS%20Interview%20Guide%20-%20NonParticipant.pdf, respectively). When a focus group was completed, the two researchers separately wrote up their notes and then discussed their findings. The main themes from the interviews were identified by each of the attending researchers and were then discussed to reconcile interpretation of comments and differences in emphasis. A final document with an agreed-upon set of takeaway points was then created for our analysis of findings. From preliminary data collection after the sample was selected, we knew the basic characteristics of each practice on a set of size, ownership, geographic location, PQRS participation, and other dimensions. Given the open-ended nature of the discussions, it was possible to pause and take counts of particular responses for some especially important issues, but we could not do so repeatedly without disrupting the conversation. Thus, our notes of these sessions quote specific respondents and characterize the sentiments of the group discussion but only selectively offer discrete counts.

Findings

The overall distribution of the practices in our sample is summarized in Table 1. Seventy-nine percent of the practices in our sample are physician owned, and 54% have 10 or fewer full-time equivalent (FTE) physicians. Sixty-two percent are PQRS participants. This is a higher rate of PQRS participation than reported nationally by Medicare for physicians generally, possibly because (a) MGMA members are likely to be from practices more actively engaged in quality improvement discussions, (b) MGMA provided extensive information about PQRS to its members, and (c) administrators who are actively involved in PQRS are likely to be more interested in joining a focus group to talk about it with their peers. Because much of the information we sought was about choices to participate, we ran separate focus groups for PQRS participants and nonparticipants and generally analyzed the results separately, as will be clear below.

Table 1
Focus groups: characteristics of practices and administrators (N = 76)

Characteristic                                      n     Percent
Ownership
  Hospital                                          16    21%
  Physician/other                                   60    79%
Size (FTE physicians)
  ≤10                                               41    54%
  11–50                                             26    34%
  >50                                               9     12%
Participation in the physician quality reporting system
  Yes                                               47    62%
  No                                                29    38%
Location type
  Urban                                             33    43%
  Rural                                             16    21%
  Mixed                                             27    36%

The differences between PQRS participants and nonparticipants are summarized in Table 2. In our sample, rates of participation in PQRS were much higher for hospital/integrated delivery system practices (81%) compared to physician-owned practices (57%); for larger practices (over 80% of practices with 11 or more physicians participated) compared to smaller practices with 10 or fewer physicians (only 44%); for multispecialty practices (81%) versus single specialty practices (44%); and for practices with EHRs (67%) versus those without an EHR (48%), a result consistent with other studies that find quality reporting efforts generally to be more practical in the presence of an EHR (Berman et al., 2013). There was little difference in the urban/rural/other mix of participating and nonparticipating practices in our sample. Of all these characteristics, the differences in proportion for size, ownership, and practice type are statistically significant at conventional levels (p = .05); the differences for EHR and location are not.

Table 2
Rates of participation in the physician quality reporting system, by practice characteristics (N = 76)

Characteristic                              Participants   Nonparticipants   Total   n
Practice ownership
  Physician/other                           57%            43%               100%    60
  Hospital/IDS                              81%            19%               100%    16
  Total                                                                              76
Practice size (FTE physicians)
  ≤10                                       44%            56%               100%    41
  11–50                                     81%            19%               100%    26
  >50                                       89%            11%               100%    9
  Total                                                                              76
Practice type
  Single specialty                          44%            56%               100%    48
  Multispecialty                            81%            19%               100%    28
  Total                                                                              76
Availability of electronic health record in the practice
  No electronic health record              48%            52%               100%    20
  Have electronic health record            67%            33%               100%    56
  Total                                                                              76
Practice location
  Urban                                     62%            38%               100%    42
  Rural                                     59%            41%               100%    15
  Mixed urban–rural                         63%            37%               100%    19
  Total                                                                              76

Before participation in PQRS, QA reporting was not extensive among our focus group practices. Only 20 of the 76 practices had processes in place to regularly report defined measures to an external agency. Practices were generally conducting patient satisfaction surveys and collecting data for accreditation, but few (e.g., only 9 of the 47 participating practices) described "extensive" efforts to collect and review quality measures internally (e.g., A1Cs, patient registries, and the like). Thus, although some practices in our sample had extensive QI programs, it appeared that most did not.
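As a rough illustration of the kind of comparison of proportions reported above (not the authors' actual analysis), the sketch below runs a chi-square test of independence on approximate participant counts by practice size, derived by applying the rounded percentages in Table 2 to the row totals.

```python
# Illustrative only: counts are approximated from the rounded percentages
# in Table 2 (about 44% of 41, 81% of 26, and 89% of 9 practices
# participated), not from the study's raw data.
from scipy.stats import chi2_contingency

# Rows: practice size (<=10, 11-50, >50 FTE physicians)
# Columns: PQRS participants, nonparticipants
size_table = [
    [18, 23],  # <=10 physicians
    [21, 5],   # 11-50 physicians
    [8, 1],    # >50 physicians
]

chi2, p, dof, expected = chi2_contingency(size_table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")
# With these approximate counts, p falls well below .05, consistent with
# the reported significance of the size difference at conventional levels.
```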

The Four-Phase Innovation Process

1. Due diligence phase. Practice administrators reported having two main sources of information about PQRS. First, CMS provided detailed information about the program, and in general, the administrators found the information to be useful. However, some noted receiving conflicting messages about the reporting options. Second, the MGMA has had a well-organized program to help administrators understand new programs such as PQRS, and the respondents in our roundtable discussions gave MGMA high marks for its efforts. Although the administrators appeared to be reasonably well informed about the program, few knew about the registry method of reporting, which substantially reduces the reporting workload. (Under the registry option, a group practice submits claims data electronically to a private contractor qualified by CMS, who in turn submits individual measures to CMS.) Only one practice administrator took a physician to visit a participating practice to see how PQRS worked.

For all practices represented in our focus groups, those that ultimately chose to participate and those that did not, the process of gathering information and deciding whether or not to participate was driven by practice administrators, not physicians. In part, practices viewed PQRS as an administrative and cost issue, not a clinical issue, because PQRS rewards practices for the act of reporting, not for their clinical performance on the measures reported. Moreover, the measures themselves tended to be process oriented rather than outcome oriented. This alone did not mean that PQRS measures could not be used for internal QA purposes, only that the early stages of due diligence and adoption were largely in administrators' hands and not part of a larger quality improvement discussion.

The extent of the administrators' investigation of the PQRS program and of its potential for their practices was driven largely by three main considerations. First, administrators evaluated the potential return on the investment of time and effort in gross terms. For example, Medicare patients accounted for less than 5% of the patient population in some practices. These administrators did not devote much time to learning about the program and simply informed their physicians that PQRS was another CMS program that did not have much immediate relevance to their practices. Second, those with a more significant proportion of Medicare patients went further and evaluated the reporting costs and the potential payment. The third issue managers investigated was how well PQRS would likely fit with administrative systems and practice structures. If PQRS fit well with EHRs, for example, or if the practice found that it had fewer measures to report (typical of specialist practices, but not multispecialty practices), then reporting was not going to be a major undertaking and would not require any substantial modifications to their data systems. This was useful information to obtain in the due diligence phase.

2. Adoption decision. In participating practices, the decision process was generally not complex or prolonged. In each case, PQRS was championed by administrative staff and agreed to at the board or medical director level. Participating administrators sought to make the decision as easy as possible for their doctors. Decisions to participate appeared to be influenced by three major factors: (a) practice characteristics, (b) the financial incentives PQRS offered, and (c) the commitments of practices to new quality initiatives.

Practice characteristics. At the time of our study in 2011 (less than 4 years after PQRS was initiated), less than 20% of physicians serving Medicare beneficiaries participated in PQRS (CMS, 2013). So in 2011, PQRS participants can be viewed as early adopters among all practices. In our convenience sample, certain organizational characteristics were more consistently present among practices that had chosen to participate in PQRS. As noted earlier, practices that were larger, more sophisticated in their administrative infrastructures, and more advanced in having EHR systems were more likely to participate than other practices. This result is consistent with studies that have found that PQRS and quality reporting efforts are generally more practical when a more substantial administrative infrastructure is available (Halladay et al., 2009; Leas, Goldfarb, Browne, Keroack, & Nash, 2009), including an EHR (Berman et al., 2013). Administrators in smaller, disproportionately physician-owned practices also appeared to have a more cautious, wait-and-see attitude. (Eighty-one percent of practices with 11–50 physicians and 89% of practices with over 50 physicians chose to participate, but only 44% of practices with 10 or fewer physicians, as shown in Table 2.) These smaller-practice administrators also tended to frame the adoption issue in financial terms, with an emphasis on potential obstacles, whereas those in larger and hospital-owned practices included QA along with financial considerations in their decision-making processes.

Incentives to participate. Few practices would have gone to the trouble of PQRS reporting without some reward (or avoidance of a penalty). As of 2011, there were rewards: a bonus that varied from 1.0% to 2.0% of the practice's Part B allowed charges over the 2007–2011 period. In part, practices that chose to participate were influenced by the financial bonus; practice managers convinced physicians that they should not "leave money on the table." For practices that felt reporting would not impose an additional burden on physicians, this would indeed have been money left on the table. For example, in practices for which the measures were easy to generate out of a centralized billing system, the money seemed easy to obtain, and the bonus encouraged participation.

Nonparticipants tended to view the incentives differently. Many such practices thought that it would not be so difficult to implement PQRS in their practices, but even so, as noted by one nonparticipating practice manager, the payment was not sufficient to cover the costs of participation. Three practices said that the payment was not enough to convince most practices to participate, although one practice said that neither the money incentives nor the professional incentives were sufficient: "It just isn't a large enough part of a practice budget. Participation in quality programs is more often based on professional motives and is local. CMS is too far away from what is happening." Still others seemed to assume that the payment was not sufficient, although they were generally unable to provide estimates of how much they would be paid.

Administrators who chose not to participate were emphatic that they are more likely to participate when nonparticipation carries a financial penalty, although some have few Medicare patients. The penalty begins at 1.5% in 2015 and increases to 2.0% in 2016 and subsequent years. Penalties apparently will prompt more practices to participate than comparably sized bonuses have done, although the penalties and bonuses are economically equivalent at the margin. Nonparticipating administrators also indicated that, when they get EHRs with the functionality to make this reporting easy, they will be more likely to join. One nonparticipant noted that "this will always be a low priority until the software makes it simple," given the press of other priorities.

Commitments of practices to new quality initiatives. Although incentives (and whether they come in the form of bonuses or penalties) apparently are important to all practices, participating practices also seemed more consistently to embrace the language of "innovation." Participating practices depicted the reporting of performance as "the future." As one participant put it, "the writing is on the wall" for quality reporting, and these practices wanted to participate to prepare for that future. "Eventually we're going to have to do this anyway; we might as well do it now when there's an incentive payment rather than a penalty for nonparticipation." This is consistent with summary observations in the literature about decisions to adopt (e.g., Pallarito, 2012; Stulberg, 2008). By contrast, nonparticipating practices discounted the innovation PQRS represented, depicting it instead as a burden.

3. Implementation phase. The third phase of the innovation process is implementation. Administrators who chose to adopt PQRS described the implementation process as fairly straightforward, especially those who used a registry (although one practice had to change registries to make it work).

The implementation experience appeared to vary by the type and complexity of the organization as well as by fortuitous fits or misfits between practice billing/information systems and PQRS requirements and processes (a point also emphasized by Halladay et al., 2009). However, none of these administrators viewed PQRS as a really difficult program to implement in other respects. Implementation was easier for larger practices with central billing offices. An administrator from a large practice with multiple clinic sites (including primary care and multiple specialties) indicated that there was nothing technically difficult about getting the program implemented across the system. His organization had the capacity to effectively implement technical innovations. However, others noted that the problems were simply the ordinary difficulties of implementing something new: in the words of one, "... this was something different, [it was] 'change', and so it was hard." In contrast, a small practice said that anyone considering participation should be prepared to hire additional staff: "The process is complex and [our administrative staff are] already overworked and therefore participation is probably not 'worth it' financially for small practices." But this latter point of view was rare, and it again underscores the greater challenge faced by administrators who lack substantial administrative capacity for the new reporting demands. The lack of administrative capacity of course affects much more than PQRS participation. These small-practice administrators face similar difficulties in addressing the new reporting and other administrative demands being imposed by payers and agencies of all kinds.

The problem of "adapting" to PQRS was more complex for multispecialty practices than for single specialties. Single specialties needed to report only a few measures. By comparison, multispecialty practices often needed to report "20 to 30 measures or more." In such cases, administrators had to enlist doctors to sort through the measures and choose those that made the most sense for the practice (in one case leading to a physician backlash, as some doctors complained "this is reporting, not quality ...," a recurrent theme in our interviews). Although problems were greater for the large multispecialty practices, those practices typically had more administrative and training resources to deal with them.

Most administrators who had an EHR available viewed PQRS as a simple data processing adaptation. For example, in one roundtable interview, nine of the 10 practices had an EHR, and all of those agreed that the EHR makes PQRS easier. But there were variations in just how easy. In some cases, a minor administrative tweaking of the system was all that was required. In others, in the words of one administrator, the "technology wasn't there" in his EHR to make the connection to PQRS easy. In another, PQRS could be implemented immediately at practice sites with an EHR, but the "paper practices" could only be brought into the system gradually over the following 2 years. One geriatric practice with nursing home providers found it especially difficult to bring them into the reporting system, as nursing home records were often incomplete.

This suggests that, for all the complexity of the program, the actual reporting process was quite straightforward once administrators understood the PQRS requirements and once needed systems adjustments were made, although administrators faced more difficult adjustments depending on (a) the degree of fit between their existing data systems and PQRS, (b) the type of provider (nursing homes could be a particular problem), and (c) how well the PQRS measures fit the practice (multispecialty groups faced a more complex process of selection). These problems did not seem to arise from any particular flaws in the design of the PQRS process as such (they would be true of almost any reporting system of this type) but provided cautionary notes to other practice administrators to explore how their billing and other systems meshed with the particulars of PQRS before attempting to implement the program.

4. Learning process and feedback phase. The fourth phase of the innovation process is learning and feedback from the new process. Much of the learning was unexceptional. In general, administrators found that PQRS is a simple system to operate once up and running. The program seems to work well and does not require much effort to keep it going, once measures are understood, selected, and implemented. Administrators had a variety of incidental comments and complaints, as might be common to any reporting system. For example, various practice administrators noted that it is often hard to determine which beneficiaries were eligible for a particular measure and hence to determine the denominators for some measures. There were also complaints that when a physician fails to complete all of the information, the billing agency sends the bill back for correction, creating more workload. However, only one practice in our sample dropped out of PQRS after initially joining.

Thus, PQRS succeeded in establishing a process of selective reporting of quality measures by a large number of physicians, indeed the largest such reporting effort in the country, albeit limited to a minority of Medicare physicians through 2011. But to what end? Will PQRS actually help to improve the quality of care that Medicare patients receive? One national survey of physicians suggests that physicians generally believe PQRS has had no effect on quality (e.g., Federman & Keyhani, 2011). Duszak and Saunders (2010) find similar results from a small survey of radiologists. A large quantitative analysis in our study reaches a somewhat different result, finding some statistically significant effects for measures of avoidable utilization of health care services (Dowd et al., 2012). Practice managers provide a third perspective on the effect of PQRS on quality improvement.

In our study, administrators thought PQRS did not give them adequate support for internal QI discussions. That does not mean PQRS had no effect, only that the administrators were aware of the limits of PQRS for QI purposes and were frustrated to some degree by how PQRS failed to facilitate internal quality discussions, as discussed more fully below. Some practices viewed PQRS measures as process based and not really getting at the issues physicians cared about, thus limiting how administrators could use the results. But this point of view was not unanimous: the administrator for one radiology practice indicated that his physicians found some measures useful and some meaningless. PQRS measures often were thought to lag behind what is happening in the private sector. (Note that, although virtually all PQRS measures are rooted in consensus decisions from professional societies and formal public procedures, CMS' formal procedures for selection tend to delay their incorporation into PQRS.)

But the problem goes beyond the measures. CMS feedback to the practices typically was not user-friendly for improving the quality of patient care, making it especially difficult for administrators or physicians to use the results to engage clinical staff for internal QI purposes. First, at the time of our interviews, CMS did not provide practices any benchmarks for how their performance compared to others in their region or nationally. A number of administrators expressed considerable interest in seeing how their practice compared on reported measures against norms for their community or comparable practice types nationally. Administrators felt that, if they could "keep score" with PQRS reports, they could use PQRS more effectively to animate clinician behavior. (CMS is now beginning to do this.) Second, a few administrators suggested that their physicians think they already deliver high quality, even if it is not measured. That attitude raises the question of whether some physicians need to increase their demand for the information that CMS (and others) are seeking to supply, however imperfectly. Third, PQRS reporting is at the individual physician level, but payment is at the practice or organization level, making it difficult to track payments to individual physician performance. One administrator "stopped trying to work back [from the reports] to individual physicians." In some large multispecialty practices with 20–30 measures, it was hard to trace performance to specific physicians. For all these reasons, it was difficult for administrators to take actionable steps in response to the feedback. Berman et al. (2013) reported a similar finding.

Finally, there were serious time lags in reporting, especially for practices reporting PQRS measures through individual claims (almost 70% of all reporting practices in 2011). For these claims-reporting practices, CMS feedback reports were issued in the fall for the year ending the prior December. Such delays made feedback reports "ancient history," almost impossible for practice managers to use to engage physicians in ongoing, internal quality improvement efforts. More electronic-based reporting will speed this up in the future, once most practices are using registries and other electronic methods; according to our interviews at CMS, the agency's goal is to get to near live-type feedback.

Discussion

The main findings from our study are summarized in Table 3. The most important of these findings is simply that, in implementing PQRS, group practice managers not only mediated but took the lead. They performed or oversaw most of the administrative tasks of inquiry, adoption, implementation, and follow-up, with physician review but without substantial physician activity. Meanwhile, practices that chose not to participate did so largely because of practice managers' assessments of burdens (sometimes, but not always, seen as large) and benefits (which, for these nonparticipants, were uniformly viewed as small). Physicians of course were a critical audience, but here the practice managers were much more important to the adoption, implementation, and use of PQRS.

Table 3
Summary of major findings of the PQRS focus groups

1. Practice administrators' role: Group practice administrators are playing a more prominent role in quality-related activities than is commonly understood; they are not limited to the business side of the practice, especially in quality-related decisions that involve financial incentives. Quality assurance design should take account of this important role.
2. Need for research: Notwithstanding administrators' importance in the adoption and implementation of quality assurance programs, little research now exists to understand their role. There accordingly is a need for more research in this area to guide quality assurance program design and management.
3. Need for improved quality reports: Practice managers will be in an exceptional position to influence the quality assurance discussions of the practices if quality reports are more useful; in particular, if they are provided quickly after the care that is reported, if results are benchmarked in relation to other practices, and if results can be traced back to individual physician performance.
4. Practice characteristics and early adoption: Practices in our sample that were larger, more sophisticated in their administrative infrastructures, and more advanced in their electronic health record systems were more disposed to be early adopters of physician quality reporting.
5. The ease of implementing quality reporting: The actual process of generating reports was more or less difficult depending on (a) the degree of fit between existing data systems and PQRS (a fortuitously good fit with electronic systems made it easier), (b) the type of provider (nursing homes could be a particular problem), and (c) how well the PQRS measures fit the practice (multispecialty groups faced a more complex process of selection).
6. Penalties will have a greater effect than bonuses: Practices that are not early adopters of physician reporting will be more motivated by penalties for nonreporting than they have been by comparably sized bonuses. This behavioral response suggests a guideline for quality assurance program design, though initial years of bonuses may have been needed to gradually gain acceptance of the effort.

Our past research has shown how difficult it is to gain consensus among group practice physicians about the adoption of new technologies and the use of those technologies once adopted. For example, in one of our earlier studies (Kralewski et al., 2007), 40% of the physicians in some practices were still not using electronic prescribing of drugs 4 years after their practices had adopted the technology. Although physicians are often reluctant to change habits and have a great deal of professional as well as organizational inertia that must be overcome before fully endorsing the PQRS program, the practice administrators appear to be very practical in their approach and very bottom-line oriented. Moreover, some of them sought to find ways to use the PQRS results internally, although the PQRS system frustrated them in that effort. If administrators see value in PQRS, they will be key to widespread adoption and will have an important, if not dominant, role in how the results are used. The respondents in our interviews were clearly playing leadership roles in their group practices and making some attempt to make use of the results. As more physicians consolidate into group practices, the role of the administrator may expand accordingly to meet the environmental challenges. Medicare and PQRS could profitably build on this trend by devising ways to more actively engage administrators and assist them in what they are trying to accomplish.

A final point: our findings are from 2011. There have been changes in PQRS since then, including a change in incentives (there will soon be a penalty for nonparticipation, and participation will make practices eligible for value-based payment bonuses), a continuing evolution in the PQRS measures, and some changes in how CMS reports the data, with more substantial changes in prospect according to our interviews at CMS. But even with these changes, we expect PQRS reporting to remain largely an administrative issue in the practices, with practice administrators taking the lead.


If CMS provides reports with timely, benchmarked results (a goal that participating practice administrators urged in our focus groups and that CMS supported in interviews), administrators may be able to employ more user-friendly PQRS results to support the QA discussions of the practices.

Practice Implications and Conclusion

PQRS will become widely implemented after 2014, when penalties are assessed for nonparticipation. Medical group practice administrators will be key to widespread adoption and will have an important role in how the results are used. Consequently, it is important for programs such as PQRS to be shaped in ways that enhance administrators' roles and allow them to employ the results for internal QA purposes. Practices should take advantage of any opportunities CMS provides to that end. A doctor in one practice commented to his practice manager to the effect that "we should have been doing this a long time ago." PQRS brought systematic quality reporting to a minority of all Medicare outpatient practices and promises to become more nearly universal in the years ahead, as PQRS becomes an increasingly important platform for quality reporting (and soon, for value-based payment) for a critical payer with whom virtually all administrators must deal. The influence of PQRS within practices may grow as participation spreads, as CMS refines the value of the reports it generates, and as CMS begins to leverage payment bonuses on some of the results.

Our findings provide insights into the influence of nonphysician administrators on the quality of patient care in medical group practices. We found that these administrators had a more direct role in PQRS decisions than expected, given the physician-centric traditions in these practice organizations. This has implications for several stakeholders, including CMS and other agencies attempting to improve the quality of health care. Although focusing on physicians and physician leadership is still central, and even if physicians become more engaged in the program, our data indicate that they will rely on their administrators to integrate PQRS into their organizations without undue burden. The role of practice administrators should be included in QA design along with that of physicians. Our findings also have implications for administrators' educational training programs: those that focus exclusively on the business aspects of practice administration may find their graduates unprepared to work collegially with physicians at the boundaries of clinical issues to improve patient care.

Finally, the findings from this small study indicate a need for more research focused on the rapidly evolving role of nonphysician administration in medical group practices. Most of the resources devoted to health care are controlled by physicians in medical group practices. But health care administrators may have a larger role than commonly understood and are in an exceptional position to influence the more appropriate use of these resources if supported by more useful forms of quality reporting.

Acknowledgments

The research on which this article is based was completed under Grant 1R01HS019964-01 to the University of Minnesota Division of Health Policy and Management from the Agency for Healthcare Research and Quality (AHRQ), with a subcontract to Simmons College. The authors would like to acknowledge the helpful assistance of Michael Harrison, PhD, and William Encinosa, PhD, of AHRQ; the Medical Group Management Association; the Centers for Medicare and Medicaid Services; and the editors and anonymous reviewers at Health Care Management Review. The opinions expressed herein are solely those of the authors, who also are responsible for any errors.

IRB approval, University of Minnesota: IRB Study Number 1009E88817; Principal Investigator: Bryan E. Dowd; Type: (E) Exempt; Subtype: General; Approval Date: 10/25/2010.

References

Berman, B., Pracilio, V. P., Crawford, A., Behm, W. R., Jacoby, R., Nash, D. B., & Goldfarb, N. I. (2013). Implementing the physician quality reporting system in an academic multispecialty group practice: Lessons learned and policy implications. American Journal of Medical Quality, 28(6), 464–471.
Centers for Medicare & Medicaid Services. (2013). 2011 Reporting experience including trends (2008–2012): Physician quality reporting system and electronic prescribing (eRx) incentive program. Retrieved from http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/PQRS/index.html?redirect=/PQRI
Dowd, B., Karmarker, M., Swenson, T., Parashuram, S., Kane, R., Coulam, R., & Jeffrey, M. (2013). Emergency department utilization as a measure of physician performance. American Journal of Medical Quality, 29, 135–143.
Dowd, B., Jeffery, M., Kane, R., Knutson, D., Parashuram, S., Swenson, T., ... Coulam, R. (2012). Physician quality reporting and patient outcomes in Medicare: Final report. Report submitted pursuant to AHRQ Grant No. 1R01HS019964-01.
Duszak, R., & Saunders, W. (2010). Medicare's physician quality reporting initiative: Incentives, physician work, and perceived impact on patient care. Health Policy, 7, 419–424.
Federman, A. D., & Keyhani, S. (2011). Physicians' participation in the physicians' quality reporting initiative and their perceptions of its impact on quality of care. Health Policy, 10, 229–234.
Friedberg, M. W., Coltin, K. L., Safran, D. G., Dresser, M., Zaslavsky, A. M., & Schneider, E. C. (2009). Associations between structural capabilities of primary care practices and performance on selected quality measures. Annals of Internal Medicine, 151, 456–463.
Gans, D., Kralewski, J., Hammons, T., & Dowd, B. (2005). Medical groups' adoption of electronic records and health information systems. Health Affairs, 24, 1323–1333.
Halladay, J. R., Stearns, S. C., Wroth, T., Spragens, L., Hofstetter, S., Zimmerman, S., & Sloane, P. D. (2009). Cost to primary care practices of responding to payer requests for quality and performance data. Annals of Family Medicine, 7, 495–503.
Institute of Medicine. (2001). Crossing the quality chasm: A new health system for the 21st century. Washington, DC: The National Academies Press.
Kaissi, A., Kralewski, J. E., Curoe, A., Dowd, B., & Silversmith, J. (2004). How does the culture of medical group practices influence the types of programs used to assure quality of care? Health Care Management Review, 29, 129–138.
Kimberly, J. R., & Evanisko, M. J. (1981). Organizational innovation: The influence of individual, organizational, and contextual factors on hospital adoption of technological and administrative innovations. Academy of Management Journal, 24(4), 689–713.
Kocher, R., & Sahni, N. R. (2010). Physicians versus hospitals as leaders of accountable care organizations. New England Journal of Medicine, 363, 2579–2582.
Kralewski, J., Dowd, B., Knutson, D., Savage, M., & Tong, J. (2013). Medical group practice characteristics influencing inappropriate emergency department and avoidable hospital rates. Journal of Ambulatory Care Management, 36, 286–291.
Kralewski, J. E., Dowd, B. E., Gans, D., Malakar, L., Elison, B., & Cole-Adeniyi, T. (2007). Factors influencing physician use of clinical electronic information technologies after adoption by their medical group practices. Health Care Management Review, 33, 361–367.
Kralewski, J. E., Dowd, B., Heaton, A., & Kaissi, A. (2005). The influence of the structure and culture of medical group practices on prescription drug errors. Medical Care, 43, 817–825.
Kralewski, J., Pitt, L., & Shatin, D. (1985). Structural characteristics of medical group practices. Administrative Science Quarterly, 30, 34–45.
Leas, B. F., Goldfarb, N. J., Browne, R. C., Keroack, M., & Nash, D. B. (2009). Ambulatory quality improvement in academic medical centers: A changing landscape. American Journal of Medical Quality, 24(4), 287–294.
Pallarito, K. (2012). Audiology dismisses quality initiative as confusing, but penalties will compel compliance. The Hearing Journal, 65, 13–16.
Scholle, S. H., Roski, J., Dunn, D., Adams, J., Pillitterre, D., Paulson, L. G., & Kerr, E. (2009). Availability of data for measuring physician quality performance. American Journal of Managed Care, 15, 67–72.
Shortell, S. M., Schmitt, J. A., Wang, M. C., Li, R., Gillies, R. R., Casalino, L. P., ... Randall, T. G. (2005). An empirical assessment of high performing physician organizations: Results from a national study. Medical Care Research and Review, 62, 407–434.
Solberg, L. I., Asche, S. E., Shortell, S. M., Gillies, R. R., Taylor, N., Paulson, L. G., ... Young, M. R. (2009). Is integration in large medical groups associated with quality? The American Journal of Managed Care, 15, e34–e41.
Solberg, L. I., Crain, A. L., Tillema, J., Scholle, S. H., Fontaine, P., & Whitebird, R. (2013). Medical home transformation: A gradual process and a continuum of attainment. Annals of Family Medicine, 11(Suppl.), S108–S114.
Stearns, T. (1999). How physician/administrator teams work in small groups. Medical Group Management Journal, 46, 44–48, 50.
Stulberg, J. (2008). The physician quality reporting initiative, a gateway to pay for performance: What every health care professional should know. Quality Management in Health Care, 17, 2–8.
Van de Ven, A. H., Polley, D. E., Garud, R., & Venkataraman, S. (1999). The innovation journey. Oxford, England: Oxford University Press.
Welch, W. P., Cuellar, A., Bindman, A., & Stearns, S. (2013). Medicare Physician Group Practice Database. ASPE/DHHS presentation at the AcademyHealth Annual Research Meeting, Baltimore, MD, June 25, 2013.
