DOI: 10.1093/intqhc/11.2.119 · Source: PubMed


International Journal for Quality in Health Care 1999; Volume 11, Number 2: pp. 119–130

A measuring instrument for evaluation of quality systems
CORDULA WAGNER, DINNY H. DE BAKKER AND PETER P. GROENEWEGEN
NIVEL, Netherlands Institute of Primary Health Care, Utrecht, The Netherlands

Abstract
Objective. To develop an instrument for provider organizations, consumers, purchasers, and policy makers to measure and
compare the development of quality systems in provider organizations.
Design. Cross-sectional study of provider organizations using a structured questionnaire to survey managers.
Setting. The Netherlands.
Study participants. Provider organizations of six health care fields: primary health care, care for the disabled, mental health
care, care for the elderly, hospital care and welfare care.
Main measures. Existence of quality assurance and quality improvement activities.
Results. The study presents a survey instrument for assessing the quality assurance and improvement activities of health
care provider organizations and the developmental stage of quality systems. The survey instrument distinguishes five focal
areas for quality improvement activities and four developmental stages. The study also reports data on the reliability and
validity of the survey instrument.
Conclusion. The instrument is reliable, easy to administer, and useful across health care fields as well as different kinds of
organizations. Developing quality systems provide a common language across all parts of the health care sector. By assigning
the activities to focal areas and developmental stages the instrument gives insight into the implementation of quality systems
in health care. Comparable information on quality assurance activities increases the accountability of providers. Because of its efficient (not time-consuming) approach, the instrument complements existing accreditation reviews.
Keywords: implementation, instrument, provider organizations, quality systems

Address correspondence to Cordula Wagner, Netherlands Institute of Primary Health Care, Utrecht. P.O. Box 1568, 3500 BN Utrecht, The Netherlands. Tel: +31 30 2729700. Fax: +31 30 2729729. E-mail: c.wagner@nivel.nl

© 1999 International Society for Quality in Health Care and Oxford University Press

Since the time that the question changed from whether quality can be measured to how best to measure quality, interest has been focused upon the selection of measurement sets which reliably and credibly inform about quality of health care service [1]. The complexities of health care demand a balance between structure, process and outcome measures in quality monitoring. Quality systems that influence the structure and processes in provider organizations are one approach used to avoid poor quality. Advocates of quality systems suggest that they have significant potential to enable provider organizations to improve quality without increasing costs.

In this study we define a quality system as the organizational structure, procedures, processes and activities that are mutually dependent and directed at the improvement of health care services. By measuring the developmental stage of a quality system, purchasers, consumers and regulators can more easily compare provider organizations. On the other hand, provider organizations can also compare themselves with other organizations and can show patients and purchasers what improvements have been made in the service delivery process.

Although in the USA, states have an array of regulations designed to strengthen the position of patients, questions arise as to whether the states or the Federal Government have the resources to monitor and properly enforce the regulations. In a study done by the USA General Accounting Office on state oversight, the conclusion was that states needed to institute a set of safeguards to protect consumers, including better quality assurance mechanisms [2]. Therefore, an efficient and routine examination of the organizations' arrangements to control and assure the quality of care is required. Different organizational audit frameworks exist that highlight areas of an organization that experts believe to be essential to the organization's ability to provide consistently

good quality of care [3–6]. Examples are the European ISO 9000 standards, the Malcolm Baldrige USA National Quality Award, the UK Kings Fund Accreditation, the European Quality Award (EFQM) and the Dutch Quality Award. Moreover, organization theory describes developmental stages that organizations follow during the implementation of innovations. The four stages most frequently distinguished are: (i) orientation and awareness that change is necessary; (ii) planning and preparation for change; (iii) implementation of projects; and (iv) organization-wide implementation and establishment of the innovation [7–10].

The rationale for developing an instrument to measure focal areas and the developmental stage of quality systems was provided by the need to obtain information on how provider organizations assure the quality of care, and how many have actually developed a quality system. Until now, quality systems have been evaluated by voluntary accreditation processes; in general such information is not available for research. These evaluations are very time-consuming and for that reason not suitable for gathering comparable data from many provider organizations.

In the literature only a few studies have been found that assess, on a wider scale, the development of quality management against a set of criteria [11–15]. All of these studies have taken place in a hospital setting, with different questionnaires. The literature as yet describes no sustained approach for assessing the developmental stages of quality systems in health care across sectors of care. The purpose of this study is to assess the internal consistency, reliability and construct validity of a survey instrument measuring the developmental stage of quality systems in provider organizations. The instrument presented in this article has been used to measure the developmental stage of quality systems across health care sectors in a nationwide inquiry. Premises in the inquiry were the management perspective and a total quality management approach.

Methods

Study area

Data used in the analyses were survey data collected in a large nationwide study within different health care sectors and health care-related social service sectors in The Netherlands. Almost all provider organizations are registered as members of one national umbrella organization; all members of this organization were included in the study and received a postal questionnaire. Only for the organizations for the elderly did we take a random sample (10% of the homes for the elderly and 50% of the nursing homes). A total of 1594 provider organizations were approached: 315 organizations of primary health care, 372 organizations for disabled people, 248 mental health care organizations, 316 organizations for the elderly, 143 hospitals and 200 organizations of health care-related social services. The questionnaire was sent to the management of the organization; the professionals were not involved in the study. Therefore, the data show the perspective of the management. The instrument was part of a larger survey.

Survey instrument

The questionnaire (Appendix) was developed by the researchers in co-operation with experts on quality improvement from different health care fields, and was partly derived from the Dutch Quality Award, which is a translation of the European Quality Award [10] (Table 1). The Dutch Quality Award distinguishes five organizational focal areas, the 'enablers', and five developmental stages leading to total quality management. The enablers are the focal areas leadership, policy and strategy, people management, resources and processes. For the survey instrument we operationalized the focal areas policy and strategy, people management and processes. The focal area leadership was only operationalized in relation to people management. Leadership aspects concerning the attitude of management with regard to quality improvement were not operationalized, because of the risk of socially desirable answers. From the focal area resources we developed only questions about information policy. In addition, we asked about health care-specific activities such as patient participation.

The questionnaire used a closed, Likert-type format with three or four ordinally scaled options per question and some nominally scaled questions. In the questionnaire the management was asked about concrete activities such as the development of quality documents and the use of standards.

Analysis

The data were analysed in several steps. Firstly, the validity of the instrument was tested in a separate study [16]. Secondly, the number of variables was reduced by exploratory factor analysis and Simultaneous Component Analysis (SCA), a multi-group confirmatory factor analysis that summarizes the variables for the different health care sectors (subsamples). Thirdly, the reliability of the different scales, subscales and subgroups was assessed by calculating Cronbach's α. Then, we determined the developmental stages by dividing the activities, in co-operation with experts on quality improvement, into the four stages distinguished earlier: orientation and awareness, preparation, experimentation and integration into normal business operations (establishment). An organization has reached a developmental stage if it has developed at least one of the quality improvement activities for that stage and most of the activities of the earlier stages. Finally, we calculated the percentage of organizations that developed in accordance with the defined developmental stages.

Results

Response

Of the 1594 organizations, 1182 (74%) submitted data and completed the questionnaire. An overview of the response is given in Table 2. The response percentage differs across sectors, from 55% of the homes for the elderly to 91% of the organizations for sheltered living.
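The factor-extraction step described under Analysis retains only components whose eigenvalues exceed 1.00. A minimal NumPy sketch of that rule, using synthetic item data rather than the survey's (the function name and data are illustrative assumptions):

```python
import numpy as np

def n_components_kaiser(items: np.ndarray) -> int:
    """Count components with eigenvalue > 1.00 (the rule used in the
    exploratory extraction) from the item correlation matrix of a
    respondents-by-items score table."""
    corr = np.corrcoef(items, rowvar=False)  # item-by-item correlations
    eigenvalues = np.linalg.eigvalsh(corr)   # symmetric matrix -> real eigenvalues
    return int(np.sum(eigenvalues > 1.00))

# Illustrative data: 200 respondents answering 6 Likert-type items,
# constructed so that items 0-2 and items 3-5 form two correlated clusters.
rng = np.random.default_rng(0)
base = rng.normal(size=(200, 2))
items = np.hstack([
    base[:, [0]] + 0.5 * rng.normal(size=(200, 3)),
    base[:, [1]] + 0.5 * rng.normal(size=(200, 3)),
])
print(n_components_kaiser(items))  # two clusters -> 2 retained components
```

The same count would then set the number of factors for the constrained re-extraction described in the Focal areas section.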


Table 1 Differences and similarities of focal areas in quality audit frameworks

USA National Quality Award¹      European Quality Award¹/          UK Kings Fund Accreditation¹
                                 Dutch Quality Award²
.............................................................................................................................................................................
                                 Enablers                          Areas assessed
Leadership                       Leadership                        Management and support services
Information and analysis         Policy and strategy               Professional management
Strategic quality planning       People management                 Departmental management
Human resource utilization       Resources                         Each area assessed for:
Quality assurance of products    Processes                         1. Philosophy and objectives
  and services                                                     2. Management and staffing
                                                                   3. Staff development and education
                                 Results                           4. Policies and procedures
Quality results                  People satisfaction               5. Facilities and equipment
Customer satisfaction            Customer satisfaction             6. Evaluation and quality assurance
                                 Impact on society
                                 Business results

¹ Based on: Øvretveit J. A comparison of approaches to health service quality in the UK, USA & Sweden and of the use of organizational audit frameworks. Eur J Publ Health 1994; 4: 46–54.
² Unlike the European Quality Award, the Dutch Quality Award distinguishes five developmental stages.

Table 2 Overview of participating health care sectors and organizations

Sector                                Organizations                                   n     Response (%)
.............................................................................................................................................................................
Primary health care                   Integrated health centres                       115   76
                                      Home care organizations                         140   81
                                      Public health care organizations                 60   75
Care for the disabled                 Day care for the mentally handicapped           135   75
                                      Day care for the physically handicapped         109   89
                                      Institutions for the disabled                   128   68
Mental health care                    Mental health care organizations                 98   73
                                      Organizations for sheltered living               45   91
                                      Ambulatory mental health care organizations      57   84
                                      Drugs-rehabilitation centres                     48   62
Care for the elderly                  Nursing homes                                   159   75
                                      Homes for the elderly                           157   55
Hospital care                         Hospitals                                       143   76
Health care-related social services   Organizations for ambulatory social care        159   67
                                      Social–pedagogical services                      41   90
Total                                                                                1594   74

On the basis of data obtained from 106 non-respondents in three health care sectors (organizations for the elderly, for the disabled and for the mentally ill), we compared respondents with non-respondents. These sectors were selected because of their lower response. Those who participated in the study were more likely to have a quality co-ordinator (40% versus 2%) and to have formulated a quality policy more often (21% versus 16%) than those who refused. The results indicate that the non-respondents might have developed fewer quality initiatives than the respondents. Because the response in most of the health care sectors was ≥75%, we expected no influence on the assessment of the reliability and validity in this study.

Validity study

In the separate study [16] the interpretation of the questions about activities for process improvement by the respondents was compared with the interpretation of an independent researcher. It appeared that the interpretation of the questions


was sometimes different between respondent and researcher. The items peer review, individual care plan, complaint registration and client/family council were interpreted quite well; between 64% and 79% of the activities were judged equally. For the items infection and incident committees, job assessment interviews, satisfaction survey among patients and need survey, interpretation of the activities among patients and referrers was equal in 50% of the cases. Of the other 50%, half of the respondents over-reported and half under-reported their activities. There was less agreement in interpretation between respondents and researcher for the items internal audit, visitation, management information system and satisfaction survey among referrers and employees. For these last items there was more under-reporting than over-reporting; organizations with more than 100 employees tended to over-report and smaller organizations tended to under-report. Overall, no upgrading tendency could be discerned in the interviews.

Focal areas

Construct validity was assessed using exploratory factor analysis and multi-group confirmatory factor analysis. First, we did not constrain the extraction to a particular number of factors; all factors with an eigenvalue greater than 1.00 were extracted, and seven meaningful factors emerged. The data were re-factored, constraining the extraction to seven factors; this time, five factors emerged with an eigenvalue greater than 1.00. Based on these factors, the factor structure for the total group of provider organizations and for six different health care sectors (primary health care, care for the disabled, mental health care, care for the elderly, hospital care and welfare care) was estimated by SCA. The five factors were confirmed, explaining 35.65% of the total variance when assessing all provider organizations as one population, and 35.79% on average when assessing the six different health care sectors simultaneously. It appeared that the total variance of the 'forced' SCA solution was only 1% less than the variance accounted for by the 'unforced' solution. The differences between the variances accounted for by SCA and by the separate Principal Component Analyses per health care sector were rather small: 2–3%.

The factors were: (i) the nine quality documents indicating the dimension 'quality assurance documents'; (ii) the six items measuring the involvement of patients in quality improvement activities; (iii) the eight items measuring 'process control based on standards'; (iv) 11 items, selected out of the 20 questions about activities on selection, education and professional involvement, indicating one dimension named 'human resources management'; and (v) 14 items measuring quality improvement activities by managers, professionals and patients that could be characterized by using the do-check-plan-act cycle. We have named this fifth factor 'process improvement by quality improvement (QI) procedures'. These factors correspond to the focal areas of the existing Quality Awards (Table 1). The factor loadings of the overall SCA analysis and the variance explained by each factor are shown in Table 3.

Once the components that corresponded to the intended focal areas were defined, the item loadings were checked for incorrect or suspect items per health care sector. For all sectors, the component structure fitted the intended structure well. The SCA analysis yielded no incorrect or suspect items for home health care, care for the institutionalized disabled and mental health care. Only for hospitals and for nursing homes did one item tend to be suspect, although no 'cut-off point' is available.

The following sections describe the focal areas in more detail.

Quality assurance documents

SCA analysis revealed one factor of nine items reflecting the quality documents that an organization had developed. The amount of explicit attention to quality management was expressed, for example, by the development of a mission statement, quality profiles, product descriptions, quality action plans and an annual quality report. Most of the provider organizations have developed a mission statement or a quality policy; fewer have a quality action plan for each department; and fewer again have an annual quality report.

Involvement of patients

The items designed to measure involvement of patients reflect one underlying latent variable. The variable, which is specific to health care services, indicates that patients participate in quality assurance and improvement activities. The factor distinguishes organizations that ask patients to contribute to developing criteria, standards and quality projects from organizations that view quality improvement as an organizational concern. In the literature there are distinct differences between patient involvement and patient collaboration, which form the precursors to patient participation, which in turn is the precursor to patient partnership. Patient participation and patient partnership are regarded as an ideal, a goal towards which all practitioners should be working [17].

Process control based on standards and protocols used by professionals

The eight items measuring process control formed a single scale. The underlying latent variable indicates that organizations pay special attention to the development of standards and protocols. After describing the health care delivery process, organizations want to minimize the variation in their services. Standards or protocols describe the ideal sequence of the health care process; the organization can then compare what was done with what should have been done. In recent years especially, the medical profession has developed standards and protocols. In organizations where different health care professionals are involved in the process, there has been a tendency to develop standards and protocols together. Examples of this are standards for specific treatments or for separate groups of patients. Furthermore, there are standards describing the whole process of the patient from the moment he/she arrives in the organization to discharge.

Human resources management (HRM)

In general, people management is not new for provider


Table 3 Factor loadings and explained variance: results of SCA analysis of 48 items of quality assurance and improvement
of all respondents (1) and for illustration the results of five sectors: home care (2), mental health care (3), care for the
institutionalized disabled (4), nursing home care (5) and hospital care (6)

Factor loadings
..................................................................................................
Items 1 2 3 4 5 6
.............................................................................................................................................................................................................................
Factor 1: Quality assurance documents
1. Quality action plan for whole organization 0.75 0.68 0.78 0.74 0.73 0.72
2. Quality policy document 0.71 0.64 0.70 0.58 0.68 0.66
3. Quality action plan for some departments 0.66 0.52 0.68 0.60 0.73 0.65
4. Quality profiles 0.64 0.60 0.59 0.66 0.66 0.55
5. Annual quality report 0.59 0.57 0.55 0.61 0.67 0.54
6. Quality action plan for every department 0.58 0.56 0.40 0.59 0.55 0.58
7. Quality handbook 0.50 0.62 0.61 0.57 0.54 0.33
8. Product descriptions 0.46 0.36 0.45 0.41 0.49 0.45
9. Mission statement 0.42 0.34 0.36 0.42 0.32 0.40

Factor 2: Involvement of patients


1. Involvement in developing quality criteria 0.80 0.70 0.84 0.79 0.72 0.64
2. Involvement in quality improvement projects 0.79 0.73 0.87 0.73 0.66 0.72
3. Involvement in quality committees 0.77 0.74 0.76 0.72 0.70 0.63
4. Involvement in evaluating quality improvement goals 0.75 0.71 0.72 0.71 0.72 0.62
5. Involvement in developing standards 0.74 0.66 0.71 0.68 0.66 0.59
6. Involvement in meetings talking about results of satisfaction 0.73 0.66 0.73 0.77 0.59 0.61
surveys, complaints registration

Factor 3: Process control based on standards


1. Standards for specific treatments/interventions 0.68 0.70 0.47 0.69 0.69 0.54
2. Standards for utilization of medical equipment 0.63 0.61 0.57 0.60 0.62 0.59
3. Standards for patient education 0.62 0.43 0.61 0.57 0.65 0.55
4. Standards for co-operation with other organizations 0.61 0.70 0.58 0.61 0.51 0.54
5. Standards for restricted medical actions 0.59 0.65 0.55 0.62 0.54 0.61
6. Standards for specific target groups 0.57 0.64 0.60 0.58 0.48 0.56
7. Standards for critical moments in service provision 0.55 0.44 0.56 0.56 0.49 0.51
8. Standards for patient routing from intake to discharge 0.55 0.61 0.55 0.53 0.58 0.51

Factor 4: Human resources management


1. Training/education of management 0.62 0.60 0.62 0.63 0.67 0.65
2. Training/education of professionals 0.60 0.59 0.59 0.71 0.60 0.59
3. Management checks whether professionals stick to 0.58 0.59 0.59 0.55 0.59 0.55
commitments
4. Continuous education takes place based on priorities in 0.57 0.50 0.52 0.61 0.47 0.64
quality policy
5. Professionals are allowed to participate in QA-activities 0.56 0.56 0.52 0.59 0.58 0.57
within regular working hours
6. Professionals are stimulated to develop themselves in their 0.53 0.47 0.41 0.49 0.51 0.59
profession
7. Management indicates what is expected from professionals 0.53 0.40 0.70 0.52 0.51 0.48
with respect to quality assurance
8. Training new professionals in quality improvement methods 0.52 0.55 0.48 0.54 0.51 0.61
9. Systematic feedback to professionals about achieved results 0.52 0.48 0.44 0.41 0.50 0.46
10. Monitoring department action plans 0.51 0.44 0.53 0.51 0.56 0.33
11. Selection of new personnel with positive attitude to quality 0.46 0.51 0.57 0.41 0.26¹ 0.33
assurance
Continued


Table 3 Continued

Factor loadings
..................................................................................................
Items 1 2 3 4 5 6
.............................................................................................................................................................................................................................
Factor 5: Process improvement based on QI-procedures
1. Satisfaction survey among patients 0.57 0.52 0.31 0.37 0.54 0.73
2. Utilization of individual care plans 0.56 0.66 0.49 0.58 0.61 0.45
3. Satisfaction survey among employees 0.56 0.52 0.65 0.57 0.48 0.63
4. Internal audit 0.56 0.45 0.43 0.36 0.43 0.66
5. Complaint registration 0.56 0.57 0.38 0.57 0.53 0.66
6. Need survey among referrers or others 0.55 0.39 0.50 0.43 0.56 0.74
7. Job assessment interviews 0.54 0.52 0.60 0.67 0.46 0.47
8. Need survey among users 0.53 0.50 0.41 0.56 0.53 0.53
9. Management information system 0.51 0.48 0.61 0.52 0.58 0.32
10. Satisfaction survey among referrers 0.51 0.47 0.58 0.47 0.49 0.70
11. Peer review multi-disciplinary 0.48 0.40 0.53 0.39 0.36 0.50
12. Peer review mono-disciplinary 0.47 0.49 0.34 0.34 0.51 0.39
13. Committees e.g. incident, infection or drugs committees 0.47 0.50 0.40 0.54 0.50 0.57
14. Visitation 0.37 0.32 0.25 0.30 0.24 0.27¹

Explained variance 35.65 34.86 36.59 36.96 34.93 35.11


¹ The item-loading is higher or equal for another focal area.
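The footnote's criterion, under which an item is flagged when its loading on another focal area is higher than or equal to its loading on the area it was assigned to, can be sketched as follows (the loadings shown are a made-up fragment, not the full Table 3):

```python
# Flag items whose loading on some other focal area is higher than or
# equal to the loading on their assigned focal area.
def suspect_items(loadings: dict[str, dict[str, float]],
                  assigned: dict[str, str]) -> list[str]:
    flagged = []
    for item, per_area in loadings.items():
        own = per_area[assigned[item]]
        if any(v >= own for area, v in per_area.items()
               if area != assigned[item]):
            flagged.append(item)
    return flagged

# Hypothetical two-item, two-area fragment for illustration only:
loadings = {
    "visitation":     {"QI-procedures": 0.27, "HRM": 0.31},
    "internal audit": {"QI-procedures": 0.56, "HRM": 0.20},
}
assigned = {"visitation": "QI-procedures", "internal audit": "QI-procedures"}
print(suspect_items(loadings, assigned))  # -> ['visitation']
```

Run per health care sector on the sector's own loading matrix, this reproduces the kind of check described in the text, although, as the authors note, no formal cut-off point is available.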

organizations. Only the explicit link between quality management and people management is a new phenomenon. The items measured how provider organizations paid attention to the involvement of their professionals in quality assurance and improvement.

Process improvement by QI-procedures

The 14 items designed to measure process improvement reflect a single latent variable. The variable indicates that organizations have developed different quality improvement activities in a systematic way for professionals, managers and patients. Examples are peer review, committees, management information systems, client councils, and need and satisfaction surveys.

Reliability

Table 4 shows internal consistency reliability estimates (coefficient Cronbach's α) for each of the five focal areas. All focal areas achieved reliability (i.e. Cronbach's α ≥0.75) above the standard of 0.70 recommended by Nunnally [18]. Further assessment of index reliability was conducted within health care sector and organization-size subgroups. Several differences emerged, although each of the subgroups achieved acceptable internal consistency.

The dimensions 'involvement of patients' and 'process control based on standards' were less reliable among hospitals. On the other hand, the dimension 'process control based on standards' was more reliable among organizations for care for the elderly. Finally, there were no large differences within health care sectors or organization sizes for the dimensions 'QA-documents', 'human resources management' and 'QI-procedures'.

Developmental stages

Within each focal area we have divided all the items into the four developmental stages: orientation and awareness, preparation, experimentation and integration into normal business operations. The division is shown in Figure 1.

In stage zero, which is called orientation and awareness, there are no systematic activities for quality assurance and improvement of health care processes. Some disciplines monitor their own quality through peer review and the use of standards for specific treatments. The management has started describing the mission, vision and products of the institution. In this stage, the professionals are mainly responsible for quality assurance. In the preparation stage, organizations create the conditions necessary for systematic quality assurance and improvement activities; examples are education on quality management methods for management and professionals, the development of a quality policy, and standards emphasizing health care processes. In the third stage, provider organizations develop different kinds of quality improvement projects and experiments. The purpose is to cross the boundaries of separate disciplines using quality cycles. Finally, organizations reach the stage of integration and establishment. Quality improvement is no longer an experimental activity, but is integrated into normal business operations. The results of quality improvement activities in one focal area will be used for changes in other focal areas, and so it is necessary that organizations develop activities in more than one focal


Table 4 Internal consistency coefficients of the five focal areas of quality systems in health care sectors and organization
size

Cronbach's α
........................................................................................................................................
Group                            n      QA documents   Patient involvement   Standards   HRM    QI procedures
.............................................................................................................................................................................................................................
All respondents 1182 0.76 0.85 0.74 0.76 0.79

Health care sectors


Primary health care 247 0.76 0.77 0.72 0.74 0.72
Care for the disabled 286 0.77 0.87 0.72 0.75 0.80
Mental health care 191 0.78 0.87 0.73 0.77 0.72
Care for the elderly 206 0.82 0.79 0.80 0.77 0.81
Hospital care 109 0.69 0.69 0.66 0.77 0.82
Welfare care 143 0.73 0.85 0.72 0.79 0.79

Organization size¹
Fewer than 50 employees 383 0.78 0.87 0.72 0.76 0.80
From 51 to 300 employees 432 0.78 0.84 0.73 0.77 0.72
More than 300 employees 290 0.72 0.86 0.73 0.77 0.77
¹ Organization size not known for 77 organizations (6%).
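Cronbach's α, as reported in Table 4, can be computed directly from a respondents-by-items score matrix. A minimal NumPy sketch with made-up scores (the data here are illustrative, not from the survey):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of each respondent's total
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Illustrative: 5 respondents scoring 3 ordinally scaled items.
scores = np.array([
    [1, 1, 2],
    [2, 2, 2],
    [3, 3, 3],
    [3, 4, 4],
    [4, 4, 4],
])
print(round(cronbach_alpha(scores), 2))  # -> 0.97
```

Values above the 0.70 standard cited from Nunnally would, on this reading, indicate acceptable internal consistency for a focal-area scale.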

Focal areas: QA-documents; Patient involvement; Process control based on standards; Human resources management; QI-procedures

Stage 0 – Orientation:
  QA-documents: mission; product description
  Patient involvement: patient is not involved
  Process control: standards for specific treatments
  HRM: encouraging professional development
  QI-procedures: using care plans; peer review

Stage 1 – Preparation stage:
  QA-documents: quality policy; institutional quality plan; quality profiles
  Patient involvement: discussion of results; discussion of the targets achieved
  Process control: standards for patient education, specific target groups, unforeseen activities and medical aids
  HRM: training staff; training professionals; participation during working hours; management indicates activities
  QI-procedures: complaints registration; committees; job assessment interviews

Stage 2 – Implementation stage:
  QA-documents: quality plan for some departments; quality plan for all departments
  Patient involvement: sometimes involvement in committees, QI-projects and the development of criteria/protocols
  Process control: standards for critical moments and for co-operation with other organisations
  HRM: management tests; management monitors; specific criteria for selection of new staff
  QI-procedures: satisfaction research; needs analysis

Stage 3 – Establishment:
  QA-documents: annual quality report; quality manual
  Patient involvement: systematic involvement in committees, QI-projects and the development of criteria/protocols
  Process control: standards for patient routing from intake to discharge
  HRM: systematic feedback; priorities relating to quality policy; training new staff
  QI-procedures: management information system; internal audit; visitation

Figure 1 Indicators for the achievement of developmental stages for quality systems in health care, by focal area.
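The indicator grid of Figure 1 lends itself to a small data structure. The sketch below encodes one focal area (QA-documents, with abbreviated item names) and applies the rule from the Analysis section: a stage is reached when at least one activity of that stage and most activities of the earlier stages are present. Reading "most" as "more than half", and checking earlier stages one by one, are interpretations made here for illustration:

```python
# Stage indicators for one focal area, abbreviated from Figure 1.
QA_DOCUMENT_STAGES = [
    ["mission statement", "product description"],                  # stage 0
    ["quality policy", "institutional quality plan", "profiles"],  # stage 1
    ["plan for some departments", "plan for all departments"],     # stage 2
    ["annual quality report", "quality manual"],                   # stage 3
]

def stage_reached(activities: set[str], stages: list[list[str]]) -> int:
    """Highest stage with at least one of its own indicators present and,
    for every earlier stage, most (> half) of that stage's indicators
    present. Returns -1 when no stage is reached."""
    reached = -1
    for s, indicators in enumerate(stages):
        has_own = any(a in activities for a in indicators)
        earlier_ok = all(
            sum(a in activities for a in stages[e]) * 2 > len(stages[e])
            for e in range(s)
        )
        if has_own and earlier_ok:
            reached = s
    return reached

org = {"mission statement", "product description", "quality policy",
       "institutional quality plan", "plan for some departments"}
print(stage_reached(org, QA_DOCUMENT_STAGES))  # -> 2
```

The same function would be applied per focal area, with the grids for patient involvement, process control, HRM and QI-procedures encoded analogously.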


Table 5 Percentage of organizations per developmental stage that have developed at least one of the activities of the earlier stages

                                 QA-documents  Patient involvement  Standards  HRM  QI-procedures
Organizations stage 1
  Satisfied to stage 0 (%)            100              84              79       90       97

Organizations stage 2
  Satisfied to stage 0 (%)            100              69              83       84       98
  Satisfied to stage 1 (%)             89              89              83       84       99

Organizations stage 3
  Satisfied to stage 0 (%)            100              34              80       84       99
  Satisfied to stage 1 (%)             93              80              88       95       99
  Satisfied to stage 2 (%)             65              78              79       84       80
  Per cent of organizations
  satisfying all earlier stages (%)    88              71              72       73       91
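The criterion behind Table 5 — an organization is assigned the highest stage for which it reports at least one activity of that stage and of every earlier stage — can be sketched as follows. This is a hypothetical illustration, not the authors' actual scoring code; the activity names are paraphrased from Figure 1 for a single focal area (QI-procedures).

```python
# Sketch of the stage-assignment rule (assumed implementation, single
# focal area). Activity sets per stage, paraphrased from Figure 1.
STAGE_ACTIVITIES = {
    0: {"care plans", "peer review"},
    1: {"complaints registration", "committees", "job assessment interviews"},
    2: {"satisfaction research", "needs analysis"},
    3: {"management information system", "internal audit", "visitation"},
}

def developmental_stage(reported):
    """Return the highest stage s such that the organization reports at
    least one activity of stage s and of every earlier stage.
    Returns -1 if not even a stage-0 activity is reported."""
    stage = -1
    for s in sorted(STAGE_ACTIVITIES):
        if reported & STAGE_ACTIVITIES[s]:
            stage = s
        else:
            break  # a gap at an earlier stage blocks all higher stages
    return stage

# An organization reporting peer review and satisfaction research, but no
# stage-1 activity, only reaches stage 0 despite the stage-2 activity.
print(developmental_stage({"peer review", "satisfaction research"}))      # 0
print(developmental_stage({"care plans", "committees", "needs analysis"}))  # 2
```

Organizations that skip a stage (the "developed otherwise" cases discussed in the text) show up here as a gap that caps the assigned stage, which is exactly what the "satisfied to stage" percentages in Table 5 probe.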

area simultaneously. We have analysed the correlation coefficients of the different focal areas. Between all focal areas we found weak, but significant, positive correlations. Theoretically, an organization has reached a particular developmental stage if it has developed at least one quality improvement activity of that stage and the quality improvement activities of the earlier stages. We assessed how many organizations had followed the stages in the postulated order (Table 5): most of the organizations in stage three have actually developed the activities of the two earlier stages; more than 80% of the organizations in stage two have developed the activities of stage one. Overall, linear development was found more often in the focal areas process improvement by QI-procedures and QA-documents than in the other focal areas.

Discussion

This research attempted to assess the reliability and validity of an instrument to measure the developmental stage of quality systems across health care sectors and health care-related social services. Much attention has been paid to the validity of the instrument. The questions were formulated in co-operation with experts on quality improvement and representatives from different health care organizations (content validity). Furthermore, we analysed subsamples simultaneously by multi-group confirmatory factor analyses (SCA). The analysis showed that the empirical data confirm, for the different health care sectors, the focal areas we found in an overall factor analysis, which means that the same focal areas can be distinguished across health care sectors. Questions about the criterion validity were addressed in the separate validation study, which shows that there has been some over- as well as under-reporting. In future research the opinions of professionals should be taken into account, and additional methods for data gathering should be used in order to have some independent validation and proof of the functioning of quality systems. Until now, no independent public assessments on a broader scale have been carried out in The Netherlands. In contrast, in the USA it is possible to correlate the data gathered by the instrument described here with data from the reviews of the Joint Commission on Accreditation of Health Care Organizations. In the UK it was possible to correlate the data with data from organizational audits performed by the King's Fund.

The results show that the measured quality improvement activities of provider organizations can be divided into five focal areas: QA-documents, involvement of patients, process control based on standards and protocols, human resources management, and process control by QI-procedures. Our findings of the focal areas confirm, in part, the areas of an organization differentiated in the literature that experts believe to be essential for delivering care of consistently high quality. The empirical data suggest one new area for provider organizations: the area of patient involvement and participation. These findings are in agreement with ideas about the different position of patients/consumers in health care and industry and the growing attention to enforcing the rights of consumers in health care.

The development of a quality system is complex and takes many years. In conformance with innovation theory, many provider organizations choose a step-by-step strategy. The results of our research confirm this approach; four developmental stages could be distinguished: orientation (stage 0), preparation (stage 1), implementation (stage 2) and establishment (stage 3). The results suggest few differences in reliability across sectors and organization size. The expected linear development through the four stages was followed by most of the provider organizations. The number of organizations that have developed otherwise differs across the focal areas.


We may conclude that the survey instrument can be used for assessing, at a global level, the extent to which organizations work on quality assurance and improvement. By assigning the activities to focal areas and developmental stages the instrument gives an overview of the various elements of a quality system. This study has shown that the instrument links up with existing international quality awards and that it can be used across health care sectors as well as for different kinds of organizations, e.g. large university hospitals, relatively small health care centres, homes for the elderly where the living environment is emphasized, and organizations for public health care where the patient has only brief contact with the organization. Thus we assume that the instrument can be used in other countries as well. Developing quality systems provides a common language across all parts of the health care sector.

The instrument is applicable to different groups. For provider organizations in the USA and in Europe the incentives for paying more attention to quality result from competition for patients, increasing demand, and the need to contain and reduce costs to win contracts from purchasers. Provider organizations can report information about the development of their quality systems in their annual quality report and so increase the transparency and accountability of the organization for health care purchasers and for inspection by patients. Second, by using the instrument state regulators can more easily gather comparable data to evaluate the development of quality systems in provider organizations. Finally, the approach of the instrument is efficient (not time consuming) and therefore useful for monitoring purposes. To improve the validity, the data gathered by the instrument can periodically be compared with data from accreditation reviews or organizational audits. The instrument complements rather than substitutes for existing accreditation and audit methods.

Acknowledgement

The research on which this article is based was supported by a grant from the Ministry of Health, Welfare and Culture. We thank D. Doeglas for his assistance in the SCA analysis.

References

1. Boyce N. Using outcome data to measure quality in health care. Int J Qual Health Care 1996; 8: 101–104.
2. Silberman P. Ensuring quality and access in managed care: how well are we doing? Qual Manage Health Care 1997; 5: 44–54.
3. EFQM. The European Model for Self-Appraisal. Brussels: EFQM, 1992.
4. Øvretveit J. A comparison of approaches to health service quality in the UK, USA & Sweden and of the use of organizational audit frameworks. Eur J Publ Health 1994; 4: 46–54.
5. Hertz HS, Reimann CW, Bostwick MC. The Malcolm Baldrige National Quality Award concept: could it help stimulate or accelerate health care quality improvement? Qual Manage Health Care 1994; 2: 63–72.
6. The Netherlands Quality Institute (INK). Handbook Positioning & Improvement Non-profit Organizations (in Dutch). 's-Hertogenbosch: Instituut Nederlandse Kwaliteit, 1996.
7. Mann FC, Neff FW. Managing Major Change in Organizations. Ann Arbor, MI: Foundation for Research on Human Behaviour, 1961.
8. Crosby PB. Quality is Free: the Art of Making Quality Certain. New York: McGraw-Hill, 1979.
9. Hage J, Aiken M. Social Change in Complex Organizations. Alphen aan den Rijn: Samsom, 1980.
10. Hardjono TW, Hes FW. The Dutch Quality Award (in Dutch). Deventer: Kluwer, 1993.
11. Shortell SM, O'Brien JL, Carman JM et al. Assessing the impact of continuous quality improvement/total quality management: concept versus implementation. Health Serv Res 1995; 30: 377–401.
12. Lammers JC, Cretin S, Gilman S, Calingo E. Total quality management in hospitals: the contributions of commitment, quality councils, teams, budgets, and training to perceived improvement at Veterans Health Administration hospitals. Med Care 1996; 34: 463–478.
13. Jennings K, Westfall F. A survey-based benchmarking approach for health care using the Baldrige quality criteria. J Qual Improve 1994; 20: 500–509.
14. Hammershøy E, Mainz J, Ulrichsen H. Quality of care activities in Danish hospitals. Qual Manage Health Care 1994; 3: 63–69.
15. Graz B, Vader JP, Burnand B, Paccaud F. Quality assurance in Swiss university hospitals: a survey among clinical department heads. Int J Qual Health Care 1996; 8: 271–277.
16. Miltenburg I. Research on the Reliability and Validity of Questions about Quality Assurance and Quality Improvement (in Dutch). Utrecht: Utrecht University, 1995.
17. Cahill J. Patient participation: a concept analysis. J Adv Nurs 1996; 24: 561–571.
18. Nunnally JC. Psychometric Theory. New York: McGraw-Hill, 1978.

Accepted for publication 28 October 1998


Appendix
Quality policy
1. Does your organization have one or more of the mentioned documents?

Documents (answer for each: No / In development / Yes)

● Written mission statement: the vision and priorities of the organization
● Product descriptions: detailed description of the care for different patient populations
● Quality profiles: concrete descriptions of quality characteristics and quality standards of health care delivery
● Quality policy document: a description of the aims of quality assurance, the desired level of care delivery and the ways of the organization for achieving these goals
● Quality action plan for the whole organization: written document with measures for implementation and planning of action to realize quality goals
● Quality action plan for some departments
● Quality action plan for every department
● Annual quality report: a report on all activities that were performed to assure the quality of care and the results of the activities
● Quality handbook: a description of all procedures that the organization uses for quality assurance and the persons that are responsible for compliance with the procedures

Explanation:
In development = one or more persons in the organization are working on the development of the document.

Conditions and human resources management


2. Does your organization have/make special provisions for the implementation of activities of quality assurance/
improvement? (more than one answer is allowed)
☐ No special provisions
☐ Training/education of staff/management
☐ Training/education of professionals
☐ Professionals are allowed to participate in QA-activities within regular working hours
☐ Appoint a quality coordinator
☐ Set up a steering committee
☐ Set up quality working groups
☐ Budget for quality management
☐ Support by consultants


3. Is there a relationship between human resources management and the quality policy in your organization? (more than
one answer is allowed)
☐ Does not apply
☐ Selection of new personnel with positive attitude to quality assurance
☐ Training new professionals in quality improvement methods
☐ Continuous education takes place based on priorities in quality policy
☐ Professionals are encouraged to develop themselves in their profession
☐ Participation in quality improvement projects is required

4. How does the management stimulate the involvement of professionals in quality assurance/improvement? (more than
one answer is allowed)
☐ Does not apply
☐ Stimulation is not necessary, professionals pay enough attention to quality assurance/improvement
☐ The management indicates what is expected from professionals with respect to quality assurance
☐ Management checks whether professionals stick to commitments
☐ Systematic feedback to professionals about results achieved
☐ Management gives incentives
☐ Monitoring department action plans
☐ Sanctions, namely............................................................................................................................................................................

Standards
5. What kind of standards do professionals use in your organization? (more than one answer is allowed)
☐ Standards for specific treatments/interventions
☐ Standards for patient education
☐ Standards for restricted medical actions
☐ Standards for utilization of medical equipment
☐ Standards for critical moments in service provision
☐ Standards for specific target groups and diagnoses
☐ Standards for patient routing from intake to discharge
☐ Standards for co-operation with other organizations

Patient involvement
6. In what way are patients (or patient organizations) involved in quality assurance or improvement activities in your
organization?
Activities (answer for each: No/does not apply / Depends on the subject / Always)

● Developing quality criteria
● Developing protocols/standards
● Meetings talking about results of satisfaction surveys, complaints
● Quality committees
● Quality improvement projects
● Evaluating quality improvement goals


Quality assurance and improvement activities


7. Does your organization apply the following activities on a regular, systematic basis? (e.g. Deming cycle: plan, do, check,
act)
Activities (answer for each: No* / Yes* / Cyclic* / Systematic*)

● Peer review monodisciplinary
● Peer review multidisciplinary
● Utilization of individual care plans
● Committees, e.g. incident, infection or drugs committees
● Job assessment interviews
● Internal audit
● Visitation/accreditation
● Management information system
● Satisfaction surveys among patients
● Satisfaction survey among referrers
● Satisfaction survey among employees
● Need survey among patients
● Need survey among referrers or other stakeholders
● Complaint registration
● Patient council
● Other activities, namely:

*Explanation:
No = no/does not apply;
Yes = the activity is applied, but not on a regular basis;
Cyclic = the activity is applied based on a quality improvement cycle;
Systematic = the activity is applied based on a quality improvement cycle and the activity is integrated into normal business routines.
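The four answer categories above form an ordinal scale, from absence of an activity to its systematic, cycle-based application. A minimal sketch of how such responses might be coded for analysis (a hypothetical coding, not the authors' actual scoring scheme; the scores and activity names are illustrative assumptions):

```python
# Hypothetical ordinal coding of the Q7 answer categories: "cyclic" and
# "systematic" application count for more than mere presence ("yes").
Q7_SCORES = {"no": 0, "yes": 1, "cyclic": 2, "systematic": 3}

def q7_total(answers):
    """Sum the ordinal codes over all surveyed activities; higher totals
    indicate more activities embedded in a quality improvement cycle."""
    return sum(Q7_SCORES[a] for a in answers.values())

example = {
    "internal audit": "cyclic",
    "complaint registration": "systematic",
    "patient council": "no",
}
print(q7_total(example))  # 5
```

A coding like this would let an analyst distinguish an organization that merely performs audits from one that has embedded them in a plan-do-check-act cycle, which is the distinction the questionnaire is designed to capture.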
