Administration in Mental Health Vol. 5, No. 1 Fall/Winter 1977

COMMUNITY MENTAL HEALTH ACCREDITATION: A Pilot Study Richard E. Dorgan, Ronald J. Gerhard, and Elizabeth D. Kennard

ABSTRACT: The Balanced Service System is the conceptual framework for the newly initiated community mental health accreditation program sponsored by the Joint Commission on Accreditation of Hospitals (JCAH). The program design and performance of CMH systems are reviewed and judged according to a series of evaluation criteria that prescribe the desired operating state for each functional area in the center.

In February of this year the Community Mental Health Accreditation Program of the Joint Commission on Accreditation of Hospitals (JCAH) began its pilot study of community mental health service programs. Underlying the chosen evaluation approach is a conceptual framework whose purpose is to enhance the objectivity of the accreditation decision. In addition, specific procedures for the collection and analysis of presurvey and on-site survey data have been prescribed to assure the objectivity of the accreditation process.

A review of the literature reveals that there is no agreed-upon definition of quality for mental health care. Frequently this is because authors hold different assumptions and values and often define quality from the perspective of their own professions. Usually quality is defined in terms of those who are doing the review, such as members of a Professional Standards Review Organization (PSRO) team; of a profession, such as peer review by physicians; of the purpose of the review, such as utilization review; or of the time of the review, such as concurrent and retrospective audits. Furthermore, most of the literature deals only with the delivery of services, usually from the exclusive perspective of providers. However, the persons doing the review, the mechanism used to evaluate an event, and the time of the review do not define quality.

Richard E. Dorgan is a human service consultant to several health, education, and welfare programs and numerous organizations including the Joint Commission on Accreditation of Hospitals. He is a codesigner of the Balanced Service System. Ronald J. Gerhard is the Director, Office of Planning and Budget, Georgia Department of Human Resources, Atlanta, Georgia. He is a codesigner of the Balanced Service System. Elizabeth D. Kennard, Ph.D., has extensive experience and interest in the use of computers in health care applications. She was involved in the original computerized simulation model of the Balanced Service System and is presently an independent consultant. The work of this project was supported in part by NIMH Contracts ADM-42-74-22(OP) and 278-76-0024(OP). The authors would like to express their appreciation for the contributions made by Barbara W. Smith, Jane L. Gaines, and Cassandra Lindsey.

Sorting these dimensions into their proper perspective makes it possible to define quality in a manner that can apply to all the functions performed by a system. These functions have many characteristics, some of which are valued, and it is these valued characteristics that define quality. Therefore, assessing the quality of any function is a matter of determining the presence or absence of the valued characteristic. Once the presence of the valued characteristic has been detected, it can then be measured to determine the extent of its presence. Because some characteristics are valued, the critical issue to resolve is whose values are going to be used to represent quality. Since the system has three constituent groups--consumers, providers, and social institutions--there are three perspectives to consider, each making a unique and essential contribution. Therefore, the three constituent groups need to be integrated into an overall framework for determining quality.

DEVELOPING EVALUATIVE PRINCIPLES

In the Balanced Service System, integrating the perspectives of these three constituent groups has been accomplished by first identifying nine criteria, or general characteristics, that may be applied to any human service system:

Design Criteria
1. Identification: the ability to identify program recipients on the basis of specialized needs.
2. Intention: the ability to design programs that meet a variety of needs.
3. Demarcation: the ability to create boundaries that establish program relationships.
4. Orientation: the ability to determine the time aspects of programs.


Performance Criteria
1. Acceptability: the ability to respond adequately to the goals and values of constituents.
2. Accessibility: the ability to establish productive relationships between constituents.
3. Synergy: the ability of system elements to work together cooperatively.

Review Criteria
1. Effectiveness: the ability to achieve stated objectives.
2. Efficiency: the ability to minimize expenditures while achieving stated objectives.

These nine system criteria are then defined from the perspectives of the three constituent groups, yielding the following 19 operating values that serve as the cornerstone of the Balanced Service System. (The design criteria, in becoming design operating values, remain the same, since the needs of all three constituent groups are met by the same design.) While constituents have common interests in the system, these values, as indicated by the parentheses, reflect those central concerns specific to each constituent group.

Design Operating Values
1. Identification (all constituents): the ability to identify program recipients on the basis of specialized needs.
2. Intention (all constituents): the ability to design programs that meet a variety of needs.
3. Demarcation (all constituents): the ability to create boundaries that establish program relationships.
4. Orientation (all constituents): the ability to determine the temporal aspects of programs.

Performance Operating Values
1. Sensitivity (acceptability/consumers): the ability to operate a system that has the support of the communities it serves and responds adequately to its users.
2. Permeability (accessibility/consumers): the ability to operate a system that has no inappropriate barriers to entrance and allows use at the time and place required.
3. Continuity (synergy/consumers): the ability to maintain a common relationship with an individual throughout an uninterrupted sequence of services.
4. Flexibility (acceptability/providers): the ability to adapt system boundaries and to respond to change.
5. Availability (accessibility/providers): the presence of a sufficient volume of activity to meet stated objectives.
6. Coordination (synergy/providers): the ability to perform a common action in a systematic manner.
7. Accountability (acceptability/social institutions): the ability to report and justify proposed and ongoing activities.
8. Comprehensiveness (accessibility/social institutions): the degree to which stated goals are congruent with the mission of the system.
9. Integration (synergy/social institutions): the ability to merge all necessary elements into a unified service system.

Review Operating Values
1. Objective--effectiveness (consumers): the ability to achieve stated objectives.
2. Function--effectiveness (providers): the ability to perform stated functions.
3. Resources--effectiveness (social institutions): the ability to utilize stated resources.
4. Objective--efficiency (consumers): the ability to achieve objectives with the minimal resources necessary.
5. Function--efficiency (providers): the ability to achieve objectives with the minimal effort necessary.
6. Resources--efficiency (social institutions): the ability to perform functions with the minimal resources necessary.
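The derivation of the framework's counts can be checked arithmetically. The sketch below is illustrative only; the list contents merely restate the criteria, constituent groups, and functional areas named in the text:

```python
# Derivation of the Balanced Service System counts (illustrative sketch).
design_criteria = ["Identification", "Intention", "Demarcation", "Orientation"]
performance_criteria = ["Acceptability", "Accessibility", "Synergy"]
review_criteria = ["Effectiveness", "Efficiency"]
constituents = ["consumers", "providers", "social institutions"]
functional_areas = ["service", "administration", "citizen participation",
                    "research and evaluation", "staff development"]

# Design criteria carry over unchanged (one shared design perspective);
# performance and review criteria are defined once per constituent group.
operating_values = (len(design_criteria)
                    + len(performance_criteria) * len(constituents)
                    + len(review_criteria) * len(constituents))
print(operating_values)          # -> 19

# Each operating value is then defined for each functional area.
evaluative_principles = operating_values * len(functional_areas)
print(evaluative_principles)     # -> 95
```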

These 19 operating values represent an integrated set of values because they are based on the perspectives of the three constituent groups. Since the functions of the system are clustered into five groups called functional areas--service, administration, citizen participation, research and evaluation, and staff development--each of these 19 operating values is then defined in terms of each of the functional areas. This generates 95 evaluative principles. The development of the system's operating values and evaluative principles is depicted in two stages in Figure 1. Table 1 lists the 19 operating values and their respective evaluative principles by functional area. Operating values such as "availability" and "accountability" generate principles across each functional area. This makes it possible to respond

FIGURE 1. Stage 1: the four design criteria (viewed from a single design perspective) yield the four design values; the three performance criteria, defined from the three constituent perspectives, yield the nine performance values; and the two review criteria, defined from the three constituent perspectives, yield the six review values--together, the 19 operating values. Stage 2: the 19 operating values, crossed with the five functional areas, yield the 95 evaluative principles.

to issues and questions, for example, about accountability, since there is a principle dealing with this concept in each functional area.

The principles generated are part of an evaluative format that also includes subprinciples, indicators, sources, and standards. Principles and subprinciples together are fundamental concepts and are comparable to the "standards" published in previous JCAH accreditation manuals. Whether explicit or implicit, every idea contained in a principle generally requires the development of a related subprinciple. Similarly, for every concept contained in a subprinciple there must be a related indicator, and for every idea specified in an indicator there must be an associated standard. Thus, if a principle has five ideas, there should be five subprinciples, and so on. For each subprinciple there are indicators, which are units of performance used to measure compliance with a principle. The source indicates where the information required by each indicator is found. Each indicator has one or more standards indicating the required level of performance. Thus, the indicators are the characteristics of quality and the standards are the required levels of quality. This evaluative structure results in greater specificity at each level, with the principle being the most general and the standard the most specific. For these reasons a principle and its related subprinciples, indicators, and standards are presented together--they are interdependent. This format highlights why the measurement activity of the evaluation function begins with the specification of values--the essence of evaluation. In effect, evaluation by definition requires the specification and the application of values in a measurable format.

TABLE 1. The 19 operating values and their corresponding evaluative principles in each of the five functional areas (Service, Administration, Citizen Participation, Research and Evaluation, and Staff Development).

The principles are formatted around three stages: design, performance, and review. Essentially, the design stage is concerned with planning, the performance stage focuses on the implementation of plans, and the review stage deals with effectiveness and efficiency in terms of the stated objectives of the functions and the resources used. (The review stage materials have not been included in the initial JCAH publication, although it is possible they will be included in future revisions.) Figure 2 explains the numbering system used to make the foregoing distinctions. The 65 design and performance principles are expanded into subprinciples, indicators, and standards in the Quality Assurances section of the Principles for Accreditation of Community Mental Health Service Programs, as illustrated in Figure 3, taken from pages 93 and 94 of the performance stage of the Service functional area.

FIGURE 2. The numbering system. In a designation such as 123.1.1.1, the first digit identifies the functional area, the second the stage, and the third the principle; the three decimal places identify, in turn, the subprinciple, the indicator, and the standard.
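A designation such as 127.2.1.1 can be decomposed mechanically under this scheme. The sketch below is illustrative; the function and field names are mine, not JCAH's:

```python
def parse_designation(code):
    """Split a JCAH-style designation such as '127.2.1.1' into its parts.

    The three digits before the first period identify the functional area,
    stage, and principle; the decimal places identify the subprinciple,
    indicator, and standard (each optional).
    """
    parts = code.split(".")
    head = parts[0]
    fields = {
        "functional_area": int(head[0]),
        "stage": int(head[1]),
        "principle": int(head[2]),
    }
    for name, value in zip(["subprinciple", "indicator", "standard"], parts[1:]):
        fields[name] = int(value)
    return fields

# 127.2.1.1: functional area 1 (Service), stage 2 (performance),
# principle 7, subprinciple 2, indicator 1, standard 1.
print(parse_designation("127.2.1.1"))
```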

127         Principle (Explicative) (Principle of Quality): Adequate service records shall be maintained and consumers shall be kept informed as to their status and progress.

127.2       Subprinciple (Subprinciple of Quality): To facilitate service continuity, service records shall be complete.

127.2.1     Indicator (Characteristic of Quality): % of referred/transferred records containing reasons for referral/transfer
            Source (Location of Documentation): Service records
127.2.1.1   Standard (Required Level of Quality): 100%

127.2.2     Indicator: % of referred/transferred records containing history of past problems
            Source: Service records
127.2.2.1   Standard: 100%

127.2.3     Indicator: % of referred/transferred records containing past service plans and activities
            Source: Service records
127.2.3.1   Standard: 100%

127.2.4     Indicator: % of records containing an identification form that includes name and case number, address, phone number, sex, age, date of birth, marital status, next-of-kin, school and/or employment status, third party coverage, entry point, date of initial contact and/or admission to service, legal status, and disability
            Source: Service records
127.2.4.1   Standard: 100%

127.2.5     Indicator: % of records containing assessment at each decision point
            Source: Service records
127.2.5.1   Standard: 100%

127.2.6     Indicator: % of records containing specified service plans
            Source: Service records
127.2.6.1   Standard: 100%

127.2.7     Indicator: % of records containing service activity notes
            Source: Service records
127.2.7.1   Standard: 100%

127.2.8     Indicator: % of records containing periodic progress notes
            Source: Service records and utilization record
127.2.8.1   Standard: 100% of records of consumers in all crisis stabilization residential services with progress notes recorded daily
127.2.8.2   Standard: 100% of records of consumers in crisis support and crisis intervention services with progress notes recorded at each contact
127.2.8.3   Standard: 100% of records of consumers in remotivational and sustaining care services with progress notes recorded at least monthly
127.2.8.4   Standard: 100% of records of consumers in growth and sustenance services in the supportive and natural environments with progress notes recorded at least at the midpoint of the average length of stay of the program

FIGURE 3
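The principle/subprinciple/indicator/standard format illustrated in Figure 3 lends itself to a simple data representation. The sketch below is illustrative only: it encodes three of subprinciple 127.2's indicators with their 100-percent standards and scores a tiny hypothetical sample of service records (the record contents are invented):

```python
# Three indicators under subprinciple 127.2 ("service records shall be
# complete"), each paired with its standard (required level of quality).
INDICATORS = {
    "127.2.1": ("reasons for referral/transfer", 1.00),
    "127.2.2": ("history of past problems", 1.00),
    "127.2.3": ("past service plans and activities", 1.00),
}

def compliance(records, item):
    """Fraction of sampled records that contain the named item."""
    return sum(1 for record in records if item in record) / len(records)

# A hypothetical two-record sample of transferred cases from a site survey.
sample = [
    {"reasons for referral/transfer", "history of past problems"},
    {"reasons for referral/transfer"},
]

for code, (item, standard) in INDICATORS.items():
    achieved = compliance(sample, item)
    verdict = "meets" if achieved >= standard else "below"
    print(f"{code}: {achieved:.0%} ({verdict} the {standard:.0%} standard)")
```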


PROCEDURES

To evaluate community mental health service programs consistent with the conceptual model and the above evaluative format, both presurvey and site survey questionnaires are being developed. The purpose of the presurvey questionnaire is to obtain specific face sheet information about the organization, such as name, address, name of executive director, urban-rural setting, region of the country, freestanding or affiliate status, age and disability groups served, size of case load, and functions and activities conducted. The second purpose of the presurvey is to collect baseline data for the site survey, such as hours of service, average lengths of stay, utilization rates, provider/consumer ratios, consumer/family, staff/provider, and agency responses, and the denominator figures for all of the indicators and standards that are expressed as percentages or actual numbers. The denominator figures determine the sizes of the samples to be reviewed during the site survey and provide the base data for inferring, from the findings of the site survey, the proportion of the population that is in conformance with the standards. When completed, the presurvey questionnaire will consist of four documents: baseline data, a consumer/family survey (to be written in English and Spanish), a provider/staff survey, and an agency survey. The entire presurvey questionnaire is to be completed by the organization prior to the actual site survey and returned to JCAH for analysis. It is anticipated that this presurvey information will be used to assess eligibility and to determine the size of samples to be reviewed during the site survey. The sample size figures will be printed next to each item on the site survey documents to indicate the number to be reviewed by the surveyor. This presurvey information will also enable the production of a custom-made computerized survey package made up of only those items applicable to each applicant organization.
The site survey questionnaire consists of 40 source documents, all containing the indicators of the noted source in the Quality Assurances section of the Principles manual. The purpose of the site survey is to verify the data collected in the presurvey and to collect observational data. On those items where a sample of data is required by design, such as in the service records, the indicator score will be the ratio of the number surveyed and found to be in compliance to the total number surveyed for the specific indicator. For example, if the presurvey indicates that there were 100 cases transferred during the year preceding the survey, a 10-percent sample size would require 10 cases to be reviewed for any indicator dealing with transferred cases (127.2.1-127.2.3). If, during a review of a specific indicator, a surveyor found 5 of the 10 cases in conformance, the organization would receive a 50-percent achievement ratio for the item. Thus, if the community mental health committee of the

CRITERIA / FINDINGS

127.2.1.1  Number of referred/transferred consumers whose records contain reasons for referral/transfer.

127.2.2.1  Number of referred/transferred consumers whose records contain history of past problems.

127.2.3.1  Number of referred/transferred consumers whose records contain past service plans and activities.

127.2.5.1  Number of records containing assessment at each decision point.
           Note: Decision points include admissions, transfers, crises, mid-point of average length of stay of service, stays beyond one and one-half times the average length of stay, every six months, and discharges. Examine activity and progress notes to ascertain if planning/linking conferences occurred at required decision points.

127.2.7.1  Number of records containing service activity notes.
           Note: An activity note records the events and transactions that occur between consumer(s) and provider(s). This is in contrast to a progress note, which contains an assessment of the consumer's current role performance and health status.

127.2.8.1  Number of consumers in all crisis stabilization residential services (Crisis Care, Temporary Residence, Temporary Sponsorship) with progress notes recorded daily.
           Note: A progress note contains an assessment of the current role performance or outcome of service and health status of the individual. In addition, it contains the consumer's response to the service or progress statements.

FIGURE 4

Joint Commission establishes the compliance level at 50 percent or below, this achievement level would meet the level required for a favorable accreditation decision on this item (indicator). Figure 4, taken from pages 16-17 of the service records source document of the site survey questionnaire, presents the structural relationship between the indicators and the survey instrument. Note that these examples correspond with principle 127, presented earlier as an illustration of the evaluative format (Figure 3).
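The worked example in this passage (100 transfers in the preceding year, a 10-percent sample, 5 of 10 sampled cases in conformance) reduces to two small calculations. The function names below are mine, not JCAH's:

```python
def sample_size(denominator, fraction=0.10):
    """Number of cases to review on site, from the presurvey denominator."""
    return round(denominator * fraction)

def achievement_ratio(conforming, surveyed):
    """Indicator score: cases found in compliance over total cases surveyed."""
    return conforming / surveyed

cases = sample_size(100)            # 10 cases to review (127.2.1-127.2.3)
ratio = achievement_ratio(5, cases)
print(cases, ratio)                 # -> 10 0.5, a 50-percent achievement ratio
```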

PROCEDURES CONSTRUCTION

In constructing the site survey procedures, many changes have been made. First, several sources identified in the Quality Assurances section have been collapsed into more generic sources to reduce the number of source documents. Second, many of the indicators that were measured in one source were dropped from other sources to avoid unnecessary duplication. It became evident that repeating indicators in different sources for purposes of verifying internal consistency was impractical in terms of the additional time and costs involved in completing an actual survey with this type of design. Third, several alterations are currently being made as a result of field testing the instrument:

1. In many instances where an indicator has more than one dimension to be measured, the structure of the questionnaire is being changed to permit the scoring of each aspect. For example, indicator 211.0.1 states "Number of administrative programs whose long range and annual needs are specified." The related structure only allows for the recording of programs that specify both long- and short-range needs; no credit can be given if the administrative programs specify either, but not both. Consequently, the structure for recording the responses is being changed to allow each dimension to be recorded separately.

2. In several cases where a score is either Yes or No and several, but not all, aspects of the organization meet the requirement, a No answer must be scored. For example, indicator 121.34 states "Protective environment units have public telephones." If 1 out of 10 protective units is without a phone, a "No" must be recorded. Therefore, in order to give partial credit, the scoring is being changed to a "0 1 2 3 4" format. In those cases where a score is less than 4--in this example, the one unit without a phone--the surveyor is compelled to explain the exception in the space provided.
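The "0 1 2 3 4" partial-credit format in item 2 could be implemented several ways; the article does not specify how the fraction of conforming units maps onto the five-point scale. The sketch below assumes floor proportionality, so that full credit requires complete conformance, which is consistent with the surveyor explaining any exception whenever the score is below 4:

```python
def partial_credit(meeting, total):
    """Map units meeting a requirement onto the 0-4 scale.

    Assumption: the article does not define the mapping; floor
    proportionality is used here, so a score of 4 requires that
    every unit meet the requirement.
    """
    return (4 * meeting) // total

# Indicator 121.34: 9 of 10 protective environment units have public
# telephones. Yes/No scoring would record "No"; partial credit gives 3,
# and the surveyor explains the one exception.
print(partial_credit(9, 10))    # -> 3
print(partial_credit(10, 10))   # -> 4
```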

It is anticipated that additional field testing will gather further information that will be used to refine the procedures. Many of the new changes will ultimately require revisions in the Quality Assurances section of the Principles manual. However, these changes, in general, will not alter the standards, only the way of scoring the desired characteristic of quality and the number of source documents to be employed in the site survey.


STATISTICAL ANALYSIS

After all of the organizations participating in the pilot study have been surveyed, the data will be statistically analyzed to ascertain the current levels achieved. These data will also be stratified by other variables, such as size, location, functions, and years in operation, to determine whether these factors influence achievement levels. In addition, an item analysis will be done to determine whether there are one or more indicators that are good predictors of overall achievement. This information will enable decisions to be made concerning which indicators may be pooled, expanded, or deleted for future streamlining of the surveys. Once the levels of achievement have been ascertained through the pilot study, the CMH Committee will then determine the required levels of compliance. It is anticipated that an accreditation decision will be made on the basis of (1) the overall scores by principle and (2) the number of items in compliance.
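An accreditation decision based on (1) overall scores by principle and (2) the number of items in compliance could be aggregated roughly as follows. The indicator scores and the 50-percent level below are hypothetical, chosen only to illustrate the two summary statistics:

```python
# Hypothetical indicator achievement ratios, grouped by principle.
results = {
    "127": {"127.2.1": 1.0, "127.2.2": 0.5, "127.2.3": 0.8},
    "211": {"211.0.1": 0.9},
}
COMPLIANCE_LEVEL = 0.5  # required level, to be set by the CMH Committee

# (1) Overall score per principle: mean of its indicator ratios.
principle_scores = {p: sum(ind.values()) / len(ind)
                    for p, ind in results.items()}

# (2) Number of items (indicators) at or above the required level.
items_in_compliance = sum(score >= COMPLIANCE_LEVEL
                          for ind in results.values()
                          for score in ind.values())

print(principle_scores)
print(items_in_compliance)   # -> 4
```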

CONCLUDING STATEMENT

Thus accreditation emerges as a match between values and what the Joint Commission defines as quality. To the extent that there is a substantial match between a service program's operations and the desired characteristics and levels of quality--assured through on-site evaluation--a favorable accreditation decision is likely to be rendered.
