Quality Assurance in a Nuclear Medicine Department

Lamk M. Lamki, MD; Thomas P. Haynie, MD; Donald A. Podoloff, MD; E. Edmund Kim, MD

This article reviews the general principles of quality assurance (QA) in an imaging department, with emphasis on nuclear medicine. The various steps taken during the development of the QA program reflect the response of the QA committee as it came to a better understanding of the components of QA. Accrediting and regulatory bodies have had important roles in providing guidance. Quality control of instrumentation and radiopharmaceuticals opened the gateway to monitoring in high-volume, high-risk areas; however, QA expanded this concept into better generic and clinical monitoring. Encouragement of the use of quality of care referral forms resulted in greater participation by all members of the department. Examples of physicians' QA activities include double reading of images and sending of code cards. Experience with other forms of physician QA activities is also included. The QA committee provides a forum for five steps of QA: identify problems, assess the causes, implement action to prevent them, monitor effects of the actions, and document these activities. These steps should lead to improvement in the standards of patient care.

Index terms: Radiology and radiologists, departmental management. Radionuclide imaging, quality assurance. State-of-art reviews.

Radiology 1990; 177:609-614

1 From the Department of Nuclear Medicine, Division of Diagnostic Imaging, University of Texas M.D. Anderson Cancer Center, Houston. Received May 1, 1990; revision requested June 18; revision received and accepted July 2. Address reprint requests to L.M.L., Division of Nuclear Medicine, University of Texas Medical School at Houston, Hermann Hospital, 6411 Fannin, Houston, TX 77030. © RSNA, 1990
The role of quality assurance (QA) in departments of nuclear medicine is currently gaining great importance. This is particularly evident from the recent activities of the Joint Commission on Accreditation of Healthcare Organizations (JCAHO), the Nuclear Regulatory Commission (NRC), and state and other regulatory agencies. The JCAHO has recommended general guidelines in the form of specific "standards" for QA activities in departments of nuclear medicine (1). These guidelines also provide for institutional QA committees, which in turn place certain requirements on the individual departments. The NRC is concerned about the incidence of radionuclide misadministration and therefore has proposed new rules, as amendments of 10 CFR Part 35, that would require medical use licensees to establish and implement a basic QA program; this was published in the Federal Register of January 1990 (2,3). The establishment of the new National Practitioner Data Bank is an example of the regulations of the state boards entering into the QA of medical practitioners.

Under the JCAHO guidelines, responsibility is assigned to the individual departments and divisions, under the guidance of their respective chairmen and chiefs, to define the scope of practice and set up acceptable QA activities in departments of nuclear medicine. This will improve the standard of patient care and also protect nuclear medicine from some of the less reasonable regulations created by authoritative bodies that are not directly involved in the day-to-day running of an imaging department.

QA refers to the steps designed to (a) identify problems or potential problems, (b) assess the causes of these problems, (c) implement action designed to eliminate or prevent them, (d) monitor the activities to ensure elimination, and (e) document such activities, leading to
the highest standard of patient care.

An organized QA program should include an appropriate quality control (QC) program for instrumentation, image quality, radiopharmacy, and techniques (4). These are the specific measures of the QC program designed to ensure implementation of that aspect of the QA program. Besides the QC program, a comprehensive QA program should also include a QA plan for the physicians and other staff in the department and a good system to monitor the department activities (5-7). The general guidelines for a QA program for nuclear medicine and diagnostic radiology departments are covered under the JCAHO standards QA.1-QA.4, NM.1-NM.4, and DR.1-DR.4 (1). The QC part of the QA program is now routine in many departments, but many departments still lag behind in physician QA activities. A comprehensive QA program should be tailored for a specific department under the general guidelines of the institution, with close cooperation of all the members of the department and clear communication with the institutional QA program.

As discussed, several regulatory bodies are the driving forces behind a QA program of any department, but perhaps the most obvious of these bodies is the JCAHO. While the JCAHO offers certain guiding principles or "standards," it does not set a specific outline for a QA program. It would much rather see the different departments set their own detailed programs and show clear evidence that they are following their own standards as outlined in their own specific QA programs.

Abbreviations: JCAHO = Joint Commission on Accreditation of Healthcare Organizations, NRC = Nuclear Regulatory Commission, QA = quality assurance, QC = quality control, QCRF = quality of care referral form, SPECT = single photon emission computed tomography.

FORMATION OF THE QA COMMITTEE

Since 1984 we have had a formal QA committee at the University of Texas M.D. Anderson Cancer Center, Department of Nuclear Medicine, and in this article we will share our experience in establishing such a committee and how it has evolved in the past 6 years. A QA committee for the department of nuclear medicine was formed in February 1984. It comprised the nuclear medicine physicians in the department, the radiopharmacist, the chief technologist (technical director), the nuclear physicist, the radiochemist, and the head nurse of the department. It was chaired by one of the nuclear medicine physicians (L.M.L.), who was not the chairman of the department. The goals of the committee were to set a QA program for the department and to provide a forum for open discussion of new ideas to improve the quality of care that we deliver to our patients.

THE EARLY QA PROGRAM

The emphasis in the early days of our QA program was on QC, specifically QC of the imaging instrumentation and the radiopharmacy, as well as
radiopharmaceuticals. The highlights of that QC program are outlined in the Table. Since 1984 it has been expanded to include a more comprehensive QC program for SPECT (8). In addition to (a) the QC program, we also had a QA program for (b) the nurses, (c) the technologists, (d) the radiopharmacists, and (e) the physicians. Each group had to report to the committee activities that, in their judgment, were outside the accepted standard of care or the routine of the department. This, unfortunately, resulted in the devotion of unequal time to the different members and sections of the department at the committee meetings. The program, however, did provide the means to do away with anxieties about reporting problems related to themselves, their section, or others, and the accompanying fears that a new QA program brings. The committee discontinued this aspect of the QA program after about 2 years and instead introduced the quality of care referral forms (QCRF). These forms (Fig 1) were to be used by any member of the department, committee and non-committee members alike, to report any incident that was not consistent with good conduct of patient care in the opinion of the reporting member of the department. The QCRF was processed internally and did not replace the Incident Report, which went to the institutional committee. This turned out to be a very successful program, and several forms were turned in at every weekly meeting of the committee by various members of the department, from the chairman of the department to the most junior technologist. All members were encouraged to use these forms to improve quality of care. In some cases, a QCRF was generated alone if an institutional Incident Report was not warranted.
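The QCRF workflow just described, where a form is filed by any staff member, discussed by the committee, and answered with a letter to the reporter, can be sketched as a simple record-processing loop. The names (`QCRF`, `process_at_meeting`) and fields below are hypothetical illustrations, not part of any system the department actually used.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Callable, List, Optional

@dataclass
class QCRF:
    """Quality of care referral form (hypothetical record layout)."""
    reporter: str                           # any member of the department
    incident: str                           # what was inconsistent with good care
    filed_on: date = field(default_factory=date.today)
    committee_action: Optional[str] = None  # filled in after discussion
    reply_sent: bool = False                # appreciation letter to the reporter

def process_at_meeting(forms: List[QCRF],
                       action_for: Callable[[QCRF], str]) -> List[QCRF]:
    """Discuss each submitted form, record the action taken, and note
    that a reply letter goes back to the person who reported it."""
    for form in forms:
        form.committee_action = action_for(form)
        form.reply_sent = True
    return forms

# Example: one form discussed at a committee meeting
forms = [QCRF("technologist", "patient not recalled within 2 hours")]
done = process_at_meeting(forms, lambda f: "recall policy drafted")
print(done[0].committee_action)  # -> recall policy drafted
```

The point of the sketch is the closed loop: every form gets a recorded action and a reply, which is what made the program participatory.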
QCRFs

Approximately 50% of the 1 1/2-hour QA committee meeting was devoted to the discussion of the QCRFs that were submitted. The committee met weekly in its infancy, and then this was reduced to biweekly. The action taken by the committee regarding the problem identified in the QCRF would then be transmitted back in the form of a letter to the person who reported the incident. This letter included a few words of appreciation from the committee chairman stressing the importance of the role that this person played as a "member of the team."

Several types of problems were reported on the QCRFs: (a) a patient was forgotten in the waiting room after injection of technetium-99m MDP for bone scanning and was not recalled for scanning within 2 hours; (b) a repeat bone scan study was performed within a few days although there were no obvious clinical indications; and (c) nuclear studies and radiographic studies were performed in the wrong sequence (ie, computed tomography [CT] with contrast material was performed before thyroid scanning). There were also some complaints about decisions taken by other staff members, including complaints about physicians. Discussion of such complaints was carefully conducted in a way that would not lead to hostility but would result in noticeable improvement in the quality of patient care.

Figure 1. Example of the QCRFs used to report actual or potential patient care problems or to recommend improvements.

There is a QA officer in the department who is a radiologic technologist. Typically a QCRF filled out by a member of the staff is forwarded to the QA officer for preliminary investigation. The QA officer investigates the problem and, when indicated, reports his or her findings on the problems addressed in the QCRF to the appropriate committee. Action is taken by the department chairman, either by the formulation of a new policy or by a recommendation for a change in conduct of care to involved members of the department. Several improvements in patient care have resulted from the discussions of these QCRFs, and we have found them to be most helpful. Some of the improvements in patient care that have been a direct result of the QCRF program are (a) more efficient and timely reporting of left ventricular gated blood pool function studies and (b) automation of the clinical radiopharmacy record keeping.

GENERIC MONITORS
A good QA program must include ongoing monitoring of potential problem areas. In fact, the JCAHO standard NM.4 requires monitoring and evaluation of the quality and appropriateness of diagnostic and/or therapeutic nuclear medicine services (1). Our committee has conducted prospective generic monitors as suggested by the JCAHO. Examples of monitors that were conducted by this committee included a clinical monitor on "the bone scan" and another one on the nuclear medicine requisition and reporting form. When a generic monitor is confined to a specific problem within one clinical department, it is referred to as a "clinical monitor." Monitors are usually carried out in high-risk, high-volume, or problem-prone areas as defined by statistics on incidents. There are four basic types of clinical monitors: (a) resource utilization, (b) performance of staff, (c) outcome of effort, and (d) risk/complication indicators. The JCAHO has a guide on how to set up a "monitoring and evaluation process for quality and appropriateness of care."

Typically, a clinical monitor is conducted for 2 weeks by using a form that has been tailor-made for the problem area and approved by members of the committee (an example is shown in Fig 2). A monitor form is attached to each requisition for a nuclear medicine test that comes through the department for those 2 weeks, and it is filled out by appropriate staff members as it passes through the different areas of the department.

Figure 2. One example of clinical monitors. This form monitored "the requisition" for a nuclear medicine examination. Note the role played by the different members of the department in this monitor.

We conducted a monitor for evaluation of the completeness of the requisition form (Fig 2): The clerk checks to determine whether the general information about the patient and the referring physician is provided. The other parts of the requisition are then filled out by the nurse and technologist who dispense the dose, inject it, and image the patient, and by the physician who reports the study. For example, the nuclear physician checks for completeness of the clinical information provided and the necessity for and the timing of the test. The results are analyzed and presented to the committee, which takes action in the appropriate areas when improvement is needed.
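The tallying step of such a 2-week requisition monitor is simple enough to sketch. The field names below are invented for illustration and are not taken from the actual monitor form.

```python
from collections import Counter

# One dict per monitor form attached to a requisition during the 2-week
# period; True means the item was judged complete by the responsible
# staff member (clerk, nurse/technologist, or physician).
forms = [
    {"patient_info": True,  "clinical_info": True,  "test_indicated": True},
    {"patient_info": True,  "clinical_info": False, "test_indicated": True},
    {"patient_info": False, "clinical_info": False, "test_indicated": True},
]

def completeness_rates(forms):
    """Percentage of requisitions judged complete for each checked item."""
    tally = Counter()
    for form in forms:
        for item, ok in form.items():
            tally[item] += ok  # bool counts as 0 or 1
    return {item: 100.0 * n / len(forms) for item, n in tally.items()}

for item, pct in completeness_rates(forms).items():
    print(f"{item}: {pct:.0f}%")
# patient_info: 67%, clinical_info: 33%, test_indicated: 100%
```

Rates like these, broken down by item and therefore by the staff member responsible for it, are what the committee would review when deciding where improvement is needed.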
PHYSICIANS' QA PROGRAM

Until recently, physicians were concerned mainly with monitoring the quality of performance of their staff and stressing QC, rather than evaluating their own performance in an organized QA program. However, this has now changed because the JCAHO is pressing physicians to conduct organized physician QA programs, to institute more peer review, and to monitor the appropriateness of care delivered. The recent emphasis of the JCAHO has been on physician performance and outcome rather than QC matters. Pressure is also coming from other regulatory agencies, as discussed.

At the University of Texas M.D. Anderson Cancer Center, Department of Nuclear Medicine, we have been conducting a formal QA program for the physicians since the birth of the QA committee in 1984. Several of the items to be discussed have evolved gradually after the launching of a very humble program.

With respect to formal physicians' QA activities, we started off with blind double readings of 5% of all the studies obtained in the department. The filing clerk selected every 20th nuclear study obtained in the department after the formal interpretation and reporting had been completed. This was given to a second physician, who interpreted it independently; this physician was blinded to the initial report but had a photocopy of the initial requisition from the clinician, thus giving the second reader the benefit of the same clinical history that the first reader had. Granted, this setup does not duplicate all the conditions of the original interpretation, but it can serve as a means to elicit discrepancies in image readings. One of the four physicians in the department was appointed to monitor and compare the two reports from the two readers; if there were any discrepancies, they were discussed in a separate physicians' QA subcommittee meeting until a consensus was reached. If the official report (first report) was judged to be in error, the attending clinicians were informed of the new report that the nuclear physicians as a group had agreed on at the conference. It was the understanding of the physicians' discussion group that if there was a major or recurring problem with interpretation of a particular image type, then 100% of these cases would be monitored until the problem was resolved. In practice, in very few instances did errors affect patient care. Most discrepancies were minor and reflected differences in threshold encountered in most multiple-interpretation studies.

We continued double reading of at least 5% of the cases for about 4 years, but recently we elected to try other QA programs for physicians. This was partly due to the relatively low levels of disagreement involved and partly in response to the controversy in the literature about the utility of double reading (9). In our experience, however, all four nuclear medicine physicians agree that it has been a very useful exercise to double read the cases, and we have learned a significant amount from each other during the discussions of our differences. Prerequisites of a beneficial double-reading program are that the physicians must be a small and harmonious group who get along well, have mutual respect for each other's opinions, and are convinced of the importance of a good QA program.

In the past year or so, we have adopted the system of sending "QA cards" (also referred to as "code cards") to each other instead of performing double reading. When a physician discovers that another physician who read the previous study had missed a lesion, the first physician sends a card to the latter during the dictation of the report, pointing out the error. The original physician then brings the case report for discussion at the following physicians' QA subcommittee meeting. The results of the discussion are summarized and reported to the main QA committee of the department. At this point, the department QA meetings were reduced to occur every other week rather than weekly.

The procedure of sending QA cards to each other as a replacement for the double-reading QA activity has been active for nearly 2 years now, and we find it equally useful, although it does of necessity involve only follow-up cases. Cards are also sent for errors other than those of omission or commission; for example, they are also used to disseminate follow-up
information on interesting cases that one physician may have obtained. We have found these cards to be a useful learning tool. For an effective QA program, we recommend use of either the card system or double reading for a department that has more than one physician. However, the card system is more demanding of the cooperation and goodwill between the clinicians than double reading is. Either program can significantly improve the quality of patient care. It can also point out individual weaknesses, which can be addressed by continued education or retraining.

Another QA program that we tried for a short period of time is the detailed clinical-pathologic-imaging correlation on a few select cases. While this may be a very good QA program for specific needs, we found it very taxing; therefore, we did not use it long enough to determine its advantages and disadvantages.
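The 5% blind double-reading scheme described above, where every 20th completed study goes to a second reader and discrepant report pairs go to the subcommittee, reduces to two small routines. The study and report values below are toy data; only the every-20th sampling interval comes from the text.

```python
def select_for_double_reading(studies, interval=20):
    """Every 20th completed study (about 5%) goes to a blinded second reader."""
    return studies[interval - 1::interval]

def discrepant_pairs(report_pairs, agree):
    """report_pairs holds (first_report, second_report) tuples; pairs on
    which the two readers disagree are referred to the physicians' QA
    subcommittee for discussion until a consensus is reached."""
    return [pair for pair in report_pairs if not agree(*pair)]

# Toy example: 100 completed studies, 2 double-read report pairs
studies = [f"study-{i}" for i in range(1, 101)]
sampled = select_for_double_reading(studies)
print(len(sampled))  # -> 5 (5 of 100 studies, ie, 5%)

pairs = [("normal", "normal"), ("normal", "lesion in left femur")]
print(discrepant_pairs(pairs, lambda a, b: a == b))
# -> [('normal', 'lesion in left femur')]
```

Systematic every-nth sampling is easy for a filing clerk to apply by hand, which is presumably why the department chose it over random sampling.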
BENEFITS FROM PHYSICIANS' QA PROGRAM

A good QA program not only satisfies requirements of the JCAHO and other regulatory bodies but can significantly improve physicians' skills (10) and channel their energies into more productive "readout" sessions. So far we are confident that we have gained significantly from our physicians' QA program. Among the procedures that have evolved from the regular discussions of our errors are the following:

1. We now regularly correlate nuclear medicine functional studies with results of anatomic imaging studies such as CT, ultrasound, and plain radiography. A nuclear medicine report is made with knowledge of the complete radiologic findings of the patient and a correlation with the other studies.

2. When previous similar studies (eg, bone scans) are available, they are routinely compared with the present study before the latter is reported.

3. All physicians involved carefully choose the wording of their reports during dictation, in order to make them succinct and more helpful to the clinicians. The report is organized into (a) a precise "conclusion," which is as committed as possible, followed by (b) concise "comments" on the nuclear medicine findings with relevant differential diagnoses. A poor report may turn up at the physicians' QA meeting for discussion specifically because of its vagueness or the impact it could have on patient care and the unnecessary other tests it may have generated.

4. Greater emphasis is placed on image quality and the team effort needed to produce quality images. The physician now routinely investigates poor images by initially checking with the technologist and then the radiopharmacist; if necessary, the imaging instrument will be checked. The physician also reviews the patient's medical records and/or interviews and examines the patient before the patient leaves the department, to ensure complete understanding of possible artifacts. The whole team is more acutely aware of quality imaging.

It is not clear whether this program is appropriate for a one-person nuclear medicine department, but certainly some of the programs may be used by a one-person department. Clinical-pathologic-imaging correlation conferences may also be appropriate for a solo-physician department. Evaluation of surgical records and follow-up notes becomes much more important in such a one-radiologist department. Correlation of nuclear medicine findings with those of other imaging modalities, as we have described, would also be of particular benefit to a solo-physician department.

PHYSICIANS' PRIVILEGES AND QA
Every physician who practices in a department of nuclear medicine is granted privileges for specific diagnostic or therapeutic procedures after submission of an appropriate application and credentials. A physician is granted privileges only if the chairman of the department is convinced that the physician has the appropriate training in the specific procedure. Generally, for routine procedures this is easy, as all physicians have complete nuclear medicine training and are board certified; however, the difficulty comes when a new test is introduced in the department. We are still experimenting with different criteria for granting privileges for new tests, from the interpretation of functional SPECT images obtained with new radiopharmaceuticals to the introduction of positron emission tomography in the department. Use of the criteria of the NRC for active participation in a new procedure can be appropriate, as can an in-house training approach. Ultimately, some type of "gold standard" to evaluate objectively the quality of interpretation may be developed after considerable experience has been obtained.

RECORDS OF QA ACTIVITIES
The QA committee must keep detailed minutes of the meetings and of all QA activities. Documentation is an important component of any QA program. We have adopted the "Problem," "Action Taken," and "Results" format for recording the minutes. Each problem presented to the committee is discussed and recorded in this format, with additional notes about "Follow-up" of the problem. These minutes are distributed among the members of the committee and are available to any other member of the department who wants to read them. The physicians' QA subcommittee keeps its own detailed minutes of the cases discussed, and only a summary is presented to the main departmental QA committee. The minutes from the departmental QA committee meeting contain summaries of all the reports about the physicians', nurses', technologists', and radiopharmacists' QA activities, as well as problems that were reported on the QCRFs. Every 3 months the minutes are reviewed to ensure that old items have been acted on, and a complete follow-up is available for all the QA problems from the different members of the committee and also from the QCRFs.

RELATIONSHIP WITH THE INSTITUTIONAL QA PROGRAM

The department of nuclear medicine, together with the department of diagnostic radiology, makes up the division of diagnostic imaging at this institution. The minutes of the QA committee are reviewed by the head of the diagnostic imaging division and are distributed to the institutional QA officer for discussion at the institutional QA committee meetings. All records are readily available for inspection by the JCAHO at any time. The QA officer of the division of diagnostic imaging is the QA officer for the departments of both nuclear medicine and diagnostic radiology. He attends all the QA meetings and compiles all the QA records for the division. He acts as our liaison with the institutional QA office. The department also sends a physician representative to the institutional QA committee. He represents us and contributes to the overall institutional QA program.
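The "Problem," "Action Taken," "Results," "Follow-up" minutes format described above lends itself to a simple record structure that makes the quarterly review of unresolved items mechanical. This sketch is hypothetical; the class and field names are not from any system the department described.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MinuteEntry:
    """One committee minutes entry in Problem/Action Taken/Results format."""
    problem: str
    action_taken: str = ""
    results: str = ""
    follow_up: List[str] = field(default_factory=list)

    def needs_action(self) -> bool:
        """The quarterly review checks that old items have been acted on."""
        return not self.action_taken

entries = [
    MinuteEntry("delayed bone scan recall", action_taken="new recall policy"),
    MinuteEntry("incomplete requisitions"),
]
# Quarterly review: which old items still need action?
print([e.problem for e in entries if e.needs_action()])
# -> ['incomplete requisitions']
```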
The departmental QA committee serves as a direct line of communication between the department and the institutional QA committee. It also ensures that all policy and procedure manuals are updated regularly, as required by the QA committee. Updated protocols are available at all times to any member of the department (specifically, protocols for clinical studies performed in the department). Procedure manuals for the technologists and a separate procedure manual for the radiopharmacists are updated approximately every 2 years unless a specific need arises. "Standing orders" of the department, which go to the outpatient clinics and inpatient wards, are also regularly reviewed by the QA committee and updated as needed.

Summary of the QC Program

Imaging instrumentation:
- Field uniformity, sensitivity, image processor consistency, cathode ray tube gain: daily
- Linearity and resolution, uniformity, magnification, motion, energy resolution: weekly
- Collimator integrity, counter calibration, scanning resolution: monthly

Radiopharmacy:
- Receiving log, radiochemical purity, dispensing record, dose assays: daily
- Sterility test, radionuclide purity, wipe testing: weekly
- Pyrogenicity test, sulfur colloid size, albumin macroaggregates: random
- Update of procedure manual: annual

Nonimaging instruments:
- Accuracy and consistency of dose calibrators (check by NBS standards), self tests of CRC-30 and Accucal-2022, test buttons: daily
- Geiger-Muller survey meters (Ludlum), well-counter, Bactec performance and calibration: weekly
- Linearity of dose calibrators: quarterly
- Accuracy of dose calibrators: annual

Note: QC for single photon emission computed tomography (SPECT) is not included.

PRACTICALITY OF THE QA PROGRAM

We find that the program described herein is very satisfying. It is a simple but comprehensive program, and it is not a static one: it is still evolving as we learn more about the impact that the different QA programs may have on the quality of patient care. The cooperation and dedication of all members of the department are essential. The broad outline of this program may be implemented by any other institution. However, our experience indicates that maximum benefit is obtained from QA programs that are periodically modified or replaced by new programs. Certain generic monitors also lose their utility as the imaging protocols are modified and computers are introduced into patient scheduling. Whatever QA program a department implements, it must satisfy the five steps incorporated in the definition of QA given in the Introduction. Also, as the nuclear medicine QA program matured, it provided examples to other departments: the department of diagnostic radiology has adopted some of the QA activities we first set up in our department.

THE MATURITY OF THE QA PROGRAM

Three basic components have evolved in the QA program of the department as we converted our simple and early efforts into a mature and comprehensive QA program: (a) QC of instrumentation and radiopharmaceuticals, (b) physician QA by double reading and code cards, and (c) QA related to patient care problems as identified in the QCRFs and from the conduct of generic and clinical monitors. The departmental QA committee is responsible for all three aspects; to cope, it must either delegate some of the responsibilities or institute subcommittees, as we first did for the physicians' QA program. As the QA program evolved in our department, the QC part became more routine and less demanding of the committee's time, because more could be delegated to the appropriate staff; however, the physicians' QA program demanded more time and energy of the physicians. The third aspect of the QA program, patient care problems arising from the QCRF reporting and from the conduct of generic monitors, has not become less important but has found a niche in the overall program, in that a routine has been established to deal with these problems in a more organized fashion. As a result of the maturity of these three aspects of the QA program, it was decided to dissolve the departmental QA committee; instead, we decided to institute three new committees, one for each of the three aspects. The new committees are as follows:

1. The first committee is the Nuclear Medicine Physicians' Conduct of Care Committee, which is basically a physicians' QA committee. It has evolved from a subcommittee level to a full committee, but its membership is still limited to the physicians working in the nuclear medicine department, and its activities have already been discussed.

2. The second committee is the Instrumentation QC Committee, which belongs to the division of diagnostic imaging. This committee is concerned with QC matters that are both routine and problematic for nuclear medicine as well as for diagnostic radiology. The divisional QC program is now computerized, the departments of the division of diagnostic imaging share a computerized data base, and the activities are discussed at the biweekly meetings. The membership is made up of the chief technologists, the maintenance engineers and service representatives, the charge technologists in each section, the technical directors of the departments of diagnostic radiology and nuclear medicine, the operations manager, and the QA officer of the division, as well as the deputy chairmen of diagnostic radiology and nuclear medicine and their physician representatives.

3. The third committee is the Patient Care QA Committee, which is also a committee of the whole division of diagnostic imaging and covers patient care problems from both the nuclear medicine and the diagnostic radiology departments. The members of the patient care committee are similar to those of the QC committee but, instead of having service engineering representatives, it includes the charge nurses and the supervisors of the file room and transcription. Their contributions supplement those from other members, such as technologists and administrators, to provide an effective QA program for improving patient care. This committee discusses QCRF
problems and conducts periodic monitors, as well as discusses and deals with other problems related to patient traffic, scheduling, and preparations. Also, the tracking of patient radiation dose may become more important as therapeutic uses of radiopharmaceuticals increase. Problems discussed include appropriate scheduling of patients who are to undergo bone scanning and CT on the same day. Typically, a member of the committee will be appointed to go to the appropriate outpatient clinic or inpatient ward to point out to the staff there the appropriate way to prepare and schedule patients.
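The sequencing problems this committee handles, such as CT with contrast material performed before thyroid scanning, or bone scanning and CT booked on the same day, amount to simple ordering rules that could be checked at scheduling time. The rule set and function below are purely illustrative, not a system the department described.

```python
# Same-day exam orderings known to cause problems; a pair (earlier, later)
# means scheduling them in this order is wrong (eg, contrast material from
# CT interferes with a subsequent thyroid scan). A real department would
# maintain its own rule list.
BAD_SEQUENCES = {("CT with contrast", "thyroid scan")}

def sequence_problems(day_schedule):
    """Return wrongly ordered exam pairs in one patient's day schedule."""
    problems = []
    for i, earlier in enumerate(day_schedule):
        for later in day_schedule[i + 1:]:
            if (earlier, later) in BAD_SEQUENCES:
                problems.append((earlier, later))
    return problems

print(sequence_problems(["CT with contrast", "thyroid scan"]))
# -> [('CT with contrast', 'thyroid scan')]
print(sequence_problems(["thyroid scan", "CT with contrast"]))
# -> []
```

A check like this flags the conflict when the appointment is booked, before a committee member has to visit the referring clinic to correct it.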
CONCLUSIONS

A good QA program is essential for delivery of appropriate care to our patients. QC is important (11) but forms only a small part of a comprehensive QA program. Physician QA activities form a major part of the QA program, and this must be supplemented by adequate monitoring and evaluation of the quality and appropriateness of diagnostic and therapeutic nuclear medicine procedures. In particular, the need to monitor study requests for appropriateness has been stressed in a recent JCAHO inspection. Documentation of all QA activities is an important component of any QA program designed to achieve the highest standard of patient care.

References
1. Joint Commission on Accreditation of Healthcare Organizations. Accreditation manual for hospitals. Chicago: Joint Commission on Accreditation of Healthcare Organizations, 1990; 119-123.
2. Nuclear Regulatory Commission. 10 CFR Part 35 proposed rules. Federal Register. January 16, 1990; 55:1439-1449.
3. Marcus CS. NRC workshop participants discussed the part 35 petition and the QA rule. J Nucl Med 1989; 30:1584-1585.
4. Stanley CL, Stanley SJ. Nuclear medicine quality assurance program. Marion, Ill: Educational Research, 1989.
5. Williams ED, Harding LK, McKillop JH. Checklists for quality assurance and audit in nuclear medicine. Nucl Med Commun 1989; 10:595-599.
6. Williams ED, Harding LK, McKillop JH. A postal survey of quality assurance in nuclear medicine imaging in the UK during 1988. J Nucl Med 1989; 10:181-189.
7. Coakley AJ. Quality assurance in nuclear medicine (editorial). Nucl Med Commun 1989; 10:139-140.
8. Croft BY. Routine quality control. In: Croft BY, ed. Single-photon emission computed tomography. Chicago: Year Book Medical, 1986; 228-234.
9. O'Leary MR, Smith MS, O'Leary DS, Olmsted WW, Curtis DJ, Groleau G, Mabey B. Application of clinical indicators in the emergency department. JAMA 1989; 262:3444-3447.
10. Friedman BI. Quality assurance and nuclear medicine: the challenge of change. J Nucl Med 1986; 27:1366-1372.
11. Souchkevitch GN, Asikainen M, Bauml A, et al. The World Health Organization and International Atomic Energy Agency second interlaboratory comparison study in 16 countries on quality performance of nuclear medicine imaging devices. Eur J Nucl Med 1988; 13:495-501.