Practical Radiation Oncology (2011) 1, 33–37

www.practicalradonc.org

Original Report

The development of oncology treatment guidelines: an analysis of the National Guidelines Clearinghouse

Manisha Palta MD, W. Robert Lee MD ⁎

Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina

Received 15 July 2010; revised 8 September 2010; accepted 8 September 2010

Abstract

Purpose: In the last 2 decades, guidelines have been developed to improve the quality of patient care. A recent editorial on guideline development procedures suggested that the process has significant limitations that affect the scientific validity of guidelines.1 This prompted us to review oncology treatment guidelines to determine whether such limitations are widespread.

Methods and Materials: We performed a review of oncology treatment guidelines registered at the National Guidelines Clearinghouse (www.guideline.gov). Each guideline was independently reviewed by 2 authors and the following criteria were assessed: coordinating organization, guideline panel composition, reporting of conflicts of interest, peer review, dissent, expiration date, PubMed citation, and evidence-based scoring and grading of recommendations. Disagreements were resolved by consensus in subsequent discussions.

Results: Sixty-four guidelines were reviewed: 39 (61%) were developed by medical specialty societies and 25 (39%) by government agencies. Fifty (78%) guideline panels were multidisciplinary and 44 (69%) included individuals with epidemiologic and health services research expertise. Potential conflicts of interest were disclosed in 43 (67%) guidelines. Sixty (94%) guidelines underwent peer review, with external review in 31 (48%). Seventeen (27%) guidelines are indexed by PubMed. Fifty-one (80%) guidelines included evidence-based methodologies and 46 (72%) used evidence-based scoring of recommendations. Significant differences were observed according to coordinating organization (eg, disclosure of conflicts of interest in 46% of guidelines developed by medical specialty societies versus 100% of those authored by government agencies [P < .0001]).

Conclusions: The majority of oncology-related treatment guidelines registered at the National Guidelines Clearinghouse satisfy most of the criteria for sound guideline development. Significant differences in these criteria were observed according to the coordinating organization that developed the guideline.

© 2011 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.

Conflicts of interest: None.
⁎Corresponding author. Department of Radiation Oncology, Duke University Medical Center, Durham, NC 27710. E-mail address: [email protected] (W.R. Lee).

1879-8500/$ – see front matter © 2011 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved. doi:10.1016/j.prro.2010.09.003

Practical Radiation Oncology: January-March 2011

Introduction

During the past 2 decades, clinical practice guidelines have been developed as a method to reduce variation in medical practice, improve the quality of health care, and control costs. The number of guidelines has increased dramatically in an attempt to consolidate the extensive medical literature and to provide recommendations in areas where the evidence remains inconclusive. Although major medical organizations have proposed procedures to ensure the development of scientifically sound guidelines, a report published 10 years ago concluded that many guidelines failed to adhere to established methodological standards.2 Recent commentators have suggested that the guideline development process has significant limitations and should be reformed.1 An editorial by Sniderman and Furberg1 outlined several limitations of that process, including lack of diversity within guideline panels, the perception of unanimity of guidelines, lack of independent review, and failure to report conflicts of interest (COI). Although the authors principally referenced examples from the lipid literature, the purpose of our study was to determine systematically whether such limitations are pervasive in oncology treatment guidelines.

Materials and methods

Guidelines were identified by a computerized search of the National Guidelines Clearinghouse (NGC).3 The search was performed on February 16, 2009, using the following search terms: “cancer,” “oncology,” and “treatment.” Each guideline was independently evaluated by the 2 authors (MP, WRL), and a standardized data collection form was used to abstract information from each guideline. Factors evaluated included the coordinating organization, the composition of the guideline writing group, reporting of COI, expiration date of the guideline, internal and external peer review, unanimity of the guideline, PubMed citation, and the use of evidence-based scoring and grading of recommendations. Disagreements were resolved by consensus in subsequent discussions. Chi-square analysis was performed comparing the subset of guidelines authored by medical specialty societies and professional organizations to those authored by government agencies for the aforementioned assessment criteria.
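As an illustration of the analysis described above, the chi-square statistic for one 2 × 2 comparison can be computed directly. The sketch below is not the authors' code; it simply applies the Pearson chi-square test to the COI disclosure counts reported in Table 1 (18 of 39 specialty-society guidelines versus 25 of 25 government-agency guidelines).

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 table [[a, b], [c, d]], with its P value (1 degree of freedom)."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    chi2 = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        expected = row * col / n
        chi2 += (obs - expected) ** 2 / expected
    # For 1 df, the chi-square survival function is erfc(sqrt(x/2))
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# COI disclosure, from Table 1: specialty societies 18 yes / 21 no,
# government agencies 25 yes / 0 no
chi2, p = chi2_2x2(18, 21, 25, 0)
print(f"chi2 = {chi2:.1f}, P = {p:.1e}")  # P < .0001, consistent with Table 1
```

The statistic works out to roughly 20 on 1 degree of freedom, which is why the corresponding entry in Table 1 is reported simply as P < .0001.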

Results

The computerized search produced 99 guidelines. Many were outside the scope of our review and were removed from consideration (eg, guidelines on human immunodeficiency virus, acute liver failure, and urinary incontinence, among others). A total of 64 guidelines were reviewed; the criteria assessed are shown in Table 1. The kappa coefficient of agreement between the 2 reviewers was 0.84. Approximately two-thirds of the guidelines related to radiation oncology practices. Thirty-nine (61%) were developed by a medical specialty society or professional organization and 25 (39%) were developed by a government agency (local, state, or national).

Fifty (78%) guideline writing groups were multidisciplinary, and 44 (69%) included individuals with epidemiologic and health services research expertise. Potential COI was disclosed in 43 (67%) guidelines. An expiration date or update mechanism was present in 35 (55%) guidelines. Sixty (94%) guidelines reported some form of peer review, with external review in 31 (48%). Only 8 (13%) guidelines provided some measure of an alternative viewpoint along with the majority opinion. Seventeen (27%) guidelines are indexed by PubMed. Fifty-one (80%) guidelines included evidence-based methodologies and 46 (72%) used evidence-based scoring of recommendations.

There were significant differences for many of these criteria according to the coordinating organization (Table 1). Government agency–sponsored guidelines were more likely to have a multidisciplinary panel (P = .0139), including individuals with backgrounds in epidemiologic and health services research (P = .0005),

Table 1  Criteria assessed in oncology treatment guidelines

Criterion                All guidelines  Specialty society/       Government agency  χ² (P value)
                         (N = 64)        professional org (n = 39)  (n = 25)
Multidisciplinary panel  50 (78%)        26 (67%)                 24 (96%)           .0139
Outside expertise        44 (69%)        20 (51%)                 24 (96%)           .0005
COI disclosure           43 (67%)        18 (46%)                 25 (100%)          <.0001
Expiration date          35 (55%)        12 (31%)                 23 (92%)           <.0001
Peer review              60 (94%)        35 (90%)                 25 (100%)          .2608
External peer review     31 (48%)        7 (18%)                  24 (96%)           <.0001
Dissent                  8 (13%)         6 (15%)                  2 (8%)             .6283
PubMed citation          17 (27%)        17 (44%)                 0 (0%)             .0004
Scoring of evidence      51 (80%)        26 (67%)                 25 (100%)          .0036
Grading of evidence      46 (72%)        22 (56%)                 24 (96%)           .0016

COI, conflict of interest.


compared to guidelines developed by a medical specialty society or professional organization. Government agency–authored guidelines also more routinely disclosed COI (P < .0001), included an expiration date or Web site update (P < .0001), underwent external peer review (P < .0001), used evidence scoring (P = .0036), and graded the level of evidence (P = .0016). Guidelines created by a medical specialty society or professional organization were more likely to be cited in PubMed (P = .0004).
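The interobserver agreement statistic used in this study, the kappa coefficient, can be sketched in a few lines. The ratings below are hypothetical and serve only to illustrate the computation; they are not the study data, which yielded a kappa of 0.84.

```python
# Illustrative sketch of Cohen's kappa for 2 reviewers making yes/no
# judgments; the ratings below are hypothetical, not the study data.
def cohens_kappa(rater1, rater2):
    """kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e is the agreement expected by chance."""
    n = len(rater1)
    labels = set(rater1) | set(rater2)
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    p_e = sum((rater1.count(lab) / n) * (rater2.count(lab) / n)
              for lab in labels)
    return (p_o - p_e) / (1 - p_e)

r1 = ["yes", "yes", "no", "yes", "no", "yes", "no", "no", "yes", "yes"]
r2 = ["yes", "yes", "no", "yes", "yes", "yes", "no", "no", "yes", "no"]
print(round(cohens_kappa(r1, r2), 2))
```

Because kappa discounts chance agreement, values near 1 (such as the 0.84 reported here) indicate substantial reproducibility of the two reviewers' independent assessments.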

Discussion

This simple systematic review of oncology-related treatment guidelines registered at the NGC suggests that the guideline limitations presented by Sniderman and Furberg1 are not widespread in the field of oncology. Our review does demonstrate, however, that a substantial number of guidelines fall short. Furthermore, we observed important differences according to the coordinating organization. In the discussion that follows, we elaborate on each of the potential limitations described by Sniderman and Furberg.1

The requisite membership of guideline groups should be clearly defined and include experts in epidemiology, biostatistics, economics, and health policy. Our review determined that in nearly all cases the composition of the guideline writing group was explicitly stated, that the majority (78%) of guideline panels were multidisciplinary, and that more than half (69%) included epidemiologic and health services research expertise. Individuals with such backgrounds provide an important alternate perspective, and we encourage panels to include individuals with this expertise in the guideline development process.

Some authors have documented the influence of gifts on individual physicians.4,5 Pharmaceutical companies and manufacturers of medical devices have a vested interest in physician diagnostic and treatment decisions.6 A study evaluating the relationship between the pharmaceutical industry and guideline authors found that 87% of authors had some formal connection to pharmaceutical companies.7 In our review, potential COIs were disclosed by authors in only two-thirds of the guidelines. Even when a disclosure of COI is made, coordinating organizations can interpret differently what constitutes a COI; we suspect that most organizations consider a simple declaration of all financial relationships sufficient. We agree with the suggestion by Sniderman and Furberg1 that all relationships should be disclosed in detail (with amounts received) and that this information should be publicly available. Furthermore, we encourage organizations to require disclosure of any financial benefits that accrue after development of the guideline.


A slight majority of guidelines included an expiration date for recommendations. A study conducted to assess the current validity of clinical practice guidelines published by the United States Agency for Healthcare Research and Quality found that 50% of guidelines were outdated in approximately 6 years and concluded that guidelines should be reassessed for validity every 3 years.8 The NGC removes from its database guidelines that have not been developed, reviewed, or revised in the last 5 years. Many authorship groups that provided an expiration date performed an annual review or cited Web sites where updates to the guidelines were available.

Sniderman and Furberg1 also highlight the importance of external peer review and state that this step is frequently bypassed in the development of guidelines. External peer review is one avenue through which minority opinion can be incorporated into a final draft of guidelines. Although most guidelines underwent some form of peer review, the higher standard of external peer review was applied in only 48%. Some panels discussed guidelines in an open forum at a well-attended medical specialty meeting, whereas others posted drafts online for individuals to submit comments. All guidelines should include external peer review to validate the guidelines and to allow differences in opinion to be recognized.

Similar to external peer review, which allows for varying interpretations of scientific evidence, dissent or minority opinion can disabuse physicians of the notion that unanimity exists. Law is like medicine in that it has many areas of gray. Within the legal realm, when a majority opinion is issued, a dissenting minority opinion is often on record, which can illuminate core areas of disagreement and facilitate future discussion. The vast majority of the medical guidelines we reviewed did not provide a minority or alternative opinion, which gives the impression of consensus. As Sniderman and Furberg1 stated, “unanimity is obviously a tactic, not a necessary result.” Scientific knowledge is inherently incomplete and evolving, and guidelines frequently are motivated by a lack of clear evidence in the clinical realm; the fact that 87% of guidelines do not provide any measure of a dissenting point of view is therefore concerning. Some panels, such as the American College of Radiology, used an “appropriateness” scale from 1 to 9 through which individuals, based on the quality and quantity of evidence, could rate guidelines. Some researchers have observed that different organizations can establish separate guidelines that directly contradict one another.9,10 Guidelines are frequently viewed as the final arbiter of care by clinicians and payors alike; organizations that develop guidelines should use processes that make it clear when unanimous scientific consensus is absent.

Although the primary objective of this study was to examine the guideline development process


focusing on the limitations highlighted by Sniderman and Furberg,1 we did collect information on one methodological component: the use of evidence grading and scoring of recommendations. In a more comprehensive review of guidelines published a decade earlier, only 13% of guidelines provided grading of recommendations.2 We were encouraged that nearly three-quarters (72%) of the guidelines now meet this standard. For unclear reasons, some panels provided a rating scheme to score the strength of evidence yet failed to apply a defined grading to the proposed recommendations. Grading allows physicians to assess quickly the level of evidence behind specific recommendations. We propose that all guidelines provide grading and scoring of recommendations.

An exploratory analysis was conducted to examine differences in adherence to criteria between guidelines coordinated by medical specialty societies or professional organizations and those coordinated by government agencies. A higher percentage of government-sponsored guidelines routinely fulfilled the criteria we evaluated (Table 1). These differences may be related to the demands of the guideline coordinating organization: government agencies may be more stringent about COI disclosure and may create mechanisms for guideline renewal and updates. Government agencies are also more likely to be concerned with cost-effective care, a goal more likely to be met by including individuals from multiple medical specialties and individuals with epidemiologic backgrounds, and by implementing a process of external peer review. Although government agencies satisfied more criteria, significantly more guidelines created by professional organizations were cited in PubMed. Many medical specialty societies have a corresponding organization-sponsored journal, possibly accounting for the higher number of PubMed citations. Overall, government-sponsored guidelines more consistently adhered to the criteria we evaluated.

This study has several limitations. First, the computerized search was performed on the NGC rather than a more comprehensive database (eg, PubMed). This decision was motivated by the fact that peer review is not a requisite for registration of a guideline at the NGC, so the registered guidelines may represent the “worst case scenario” with respect to the limitations we were interested in evaluating. We assumed that the peer review process required for most entries in PubMed would address potential limitations of the guideline development process; however, this assumption may be incorrect. We also accepted the information provided by the NGC as valid (eg, if the NGC stated that COIs were disclosed to the coordinating organization, we accepted this information at face value). In addition, our study population of oncology treatment guidelines was


somewhat of a moving target: performing an identical search a few months after the initial accession yielded 88 guidelines instead of our initial 99, perhaps because outdated guidelines had been removed from the Web site. In addition, with one exception, we did not collect information on the methodological rigor of the guidelines, as this was not our primary purpose; nor did we, unlike others, assess the impact that guidelines have on clinical and economic outcomes.11 Finally, it is possible that there are biases in the method by which the information was retrieved, although this is unlikely because the level of agreement between the 2 independent reviewers was high.

Although our review revealed that a majority of oncology treatment guidelines met our predefined criteria, many fell short, and government agencies fulfilled our assessed criteria more consistently. Because many guidelines tout the sponsorship of major medical organizations, they garner the illusion of being the “epitome of evidence based medicine.”12 Clinicians should keep this in mind when using guidelines to make decisions regarding patient treatment and care.

Conclusions

With the expansive amount of medical literature available, many practitioners turn to guidelines for recommendations on the delivery of health care. Our review of oncology treatment guidelines reveals that although the majority of such guidelines satisfy our predefined criteria, a substantial number fall short. Governmental agencies adhere to these standards more consistently than medical specialty societies and professional organizations. Although guidelines are routinely used as tools to aid in diagnostic and treatment decisions regarding patient care, we, as clinicians, must demand a higher standard.

References

1. Sniderman A, Furberg C. Why guideline-making requires reform. JAMA. 2009;301:429-431.
2. Shaneyfelt T, Mayo-Smith M, Rothwangl J. Are guidelines following guidelines? The methodological quality of clinical practice guidelines in the peer-reviewed medical literature. JAMA. 1999;281:1900-1905.
3. National Guidelines Clearinghouse Web site. Available at: http://www.guideline.gov. Accessed February 16, 2009.
4. Dana J, Loewenstein G. A social science perspective on gifts to physicians from industry. JAMA. 2003;290:252-255.
5. Wazana A. Physicians and the pharmaceutical industry: is a gift ever just a gift? JAMA. 2000;283:373-380.
6. Rothman DJ, McDonald WJ, Berkowitz CD, et al. Professional medical associations and their relationships with industry. JAMA. 2009;301:1367-1372.
7. Shaneyfelt TM, Centor RM. Reassessment of clinical practice guidelines: go gently into that good night. JAMA. 2009;301:868-869.
8. Shekelle PG, Ortiz E, Rhodes S, et al. Validity of the Agency for Healthcare Research and Quality Clinical Practice Guidelines: how quickly do guidelines become outdated? JAMA. 2001;286:1461-1467.
9. Burgers JS, Bailey JV, Klazinga NS, et al; AGREE Collaboration. Inside guidelines: comparative analysis of recommendations and evidence in diabetes guidelines from 13 countries. Diabetes Care. 2002;25:1933-1939.
10. McAlister FA, van Diepen S, Padwal RS, et al. How evidence-based are the recommendations of evidence-based guidelines? PLoS Med. 2007;4:1325-1332.
11. Cook D, Giacomini M. The trials and tribulations of clinical practice guidelines. JAMA. 1999;281:1950-1951.
12. Tricoci P, Allen J, Kramer JM. Scientific evidence underlying the ACC/AHA Clinical Practice Guidelines. JAMA. 2009;301:831-841.
