

Section 8

WARWICK MANUFACTURING GROUP
Quality Management

The Economics of Quality

© Warwick Manufacturing Group
School of Engineering
University of Warwick, Coventry, CV4 7AL, UK
Phone +44 (0)24 7652 4240 • Fax +44 (0)24 7652 4307
e-mail paul.roberts@warwick.ac.uk
Table of Contents

Introduction

The uses of quality costs

Quality costs ate my profits!

Definition of quality cost categories

Setting up a quality costing system

Determining POCs and PONCs

The Taguchi loss function


Section 8

The Economics of Quality

Introduction

Traditionally, it has been common for the measures of quality within industry to take the form of indices such as percentage scrap, rework hours, number of warranty claims, etc. These measures have not proved to be an effective form of report to management. It is difficult to get worked up about fractions of percentage points! The tendency to roll figures together has also been instrumental in muddying the water. If we create an agglomeration of percentage measures of scrap, rework and warranty claims, who is to say whether a figure of 0.65 is good or bad? At what point do we get excited? At what point do we get mad?

The result of this confusion, caused by the use of somewhat arbitrary indices for the measurement of quality, has been the low priority given to quality issues compared with issues for which hard cash figures have been available. It is, after all, not unreasonable for management to concentrate on areas where they can identify the financial consequences of their actions.

The challenge, then, was to represent quality issues in a way that is more meaningful for management action (that is, in terms of money!). Businesses are all about money, and this reporting format ensures that quality is on the same footing as other major issues. Armand V. Feigenbaum took up this challenge in his book “Total Quality Control”, published in 1951. The American Society for Quality Control (ASQC) followed with a publication called “Quality Costs - What and How”, and latterly the BSI have published BS 6143, “Guide to the Determination & Use of Quality Related Costs”. These all give advice, although it is sometimes conflicting, on setting up and running quality costing systems.

The uses of quality costs

Firstly, quality costs can be used to attract management’s attention and to demonstrate the need for Quality improvement. It is important to note at this stage that any reduction in Quality Costs will be added straight onto the bottom line of the organisation, since they all represent money spent doing things wrong or doing the wrong thing.

Secondly, it can be used to pinpoint areas where action needs to be taken, prioritise
the areas for action and give an indication of the resources that should be
committed to improvement in that area.

Thirdly, it can be used to measure improvements in given areas and as an ongoing measure of quality performance.

Quality costs ate my profits!

Recent surveys have indicated that average Quality Costs in manufacturing range from 20 to 40% of the turnover of the company. This very quickly runs into a lot of money, and remember: it comes straight off the bottom line. These figures may seem high at first glance; if you have trouble accepting them, try a small exercise of noting how much of your working day is spent on activities which are only required because someone (either yourself or someone else) did something wrong further up the line. This is often a very salutary experience. The costs in service industries have been shown to be around 40 to 50% of the cost of operations.

A survey by UMIST in the late 1980s showed that fewer than 35% of companies were aware of their Quality Costs, and fewer still actually employed a Quality Costing system as prescribed in the literature. A few companies had no identification even of basic costs such as scrap and rework. It is unlikely that the situation has improved significantly in the intervening years.

Definition of quality cost categories

The most common categories used (and those defined by the authorities cited earlier) are as follows:

Prevention: “The costs of any action taken to investigate, prevent or reduce defects and failures.” (BS 6143)

Appraisal: The costs of inspecting, testing or otherwise checking to detect defects or failures if they are present.

Failure: The costs incurred when something has not been done “Right First Time”. These costs may be split into internal and external.

Prevention costs will include things like training, quality planning and administration, and any costs incurred in running quality circles and supplier development activities. Appraisal costs will involve inspectors’ time; the purchase, running and calibration costs of inspection equipment; report approval, etc. Failure costs will include scrap, rework, design changes to solve problems created by the primary design (all internal costs) and warranty (external).
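The categorisation described above is, in essence, a classify-and-sum exercise. The short Python sketch below shows one minimal way it might be collated; the activity names and cost figures are purely illustrative assumptions, not data from the text.

# Minimal sketch of a prevention/appraisal/failure breakdown (BS 6143 style).
# Activity names and cost figures are illustrative only.
from collections import defaultdict

activities = [
    ("operator training",       "prevention", 12_000),
    ("quality planning",        "prevention",  8_000),
    ("incoming inspection",     "appraisal",  20_000),
    ("gauge calibration",       "appraisal",   5_000),
    ("scrap",                   "failure",    45_000),
    ("rework",                  "failure",    30_000),
    ("warranty claims",         "failure",    25_000),
]

totals = defaultdict(float)
for name, category, cost in activities:
    totals[category] += cost

grand_total = sum(totals.values())
for category in ("prevention", "appraisal", "failure"):
    share = 100 * totals[category] / grand_total
    print(f"{category:<10} £{totals[category]:>9,.0f}  ({share:4.1f}% of total quality cost)")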


These categories appear simple enough at first glance. However, there are, in fact, large grey areas, and much debate is to be had about activities such as first-off inspection (prevention or appraisal?), dealer PDI checks (failure or appraisal?), etc. Whilst there may or may not be a ‘correct’ answer, it is very important that these issues are addressed at an early stage and a consistent approach adopted.

Quality Costs within an organisation will tend to be broken down as in the figure
shown below. The exact percentages will vary from company to company but the
general form will remain the same. Similarly, the general point that a correctly
targeted increase in prevention costs will produce a more than compensating
reduction in failure and appraisal costs is true although the ratio will vary from case
to case.

[Figure: two stacked cost profiles, the “Original Quality Cost Profile” and the profile “After increase in Prevention Costs”, each made up of Failure Cost, Appraisal Cost and Prevention Cost; the second shows an overall reduction, with both appraisal and failure reduced.]

Figure 1. Typical quality cost profile.

The illustration above also exposes a second weakness of the traditional Quality Cost system (the first being the grey areas of categorisation discussed above): the difficulty of determining the effect of an increase in prevention spending on the other categories.
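If that effect can be estimated, the arithmetic of the trade-off itself is straightforward. The sketch below is purely illustrative; the before-and-after figures are invented to mirror the shape of Figure 1 and do not come from any survey.

# Illustrative arithmetic for the prevention trade-off shown in Figure 1.
# All figures (£k per year) are invented for illustration only.
before = {"prevention": 50,  "appraisal": 250, "failure": 700}
after  = {"prevention": 120, "appraisal": 180, "failure": 400}

extra_prevention = after["prevention"] - before["prevention"]
saved_elsewhere = (before["appraisal"] + before["failure"]) - (after["appraisal"] + after["failure"])

print(f"Extra prevention spend:           £{extra_prevention}k")   # £70k
print(f"Reduction in appraisal + failure: £{saved_elsewhere}k")    # £370k
print(f"Net reduction in quality cost:    £{sum(before.values()) - sum(after.values())}k")  # £300k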

Phil Crosby believed that the splitting of quality costs into three categories was too
complex and unnecessary. He proposed two simple categories based upon the idea
that the definition of Quality was “conformance to requirements”, where the
requirements were set by the customer (internal or external, as appropriate). He argued that, if you were not undertaking normal business activities, you were taking
action either to ensure that something was right first time or to deal with the fact
that it had not been done right first time. The former he called “The Price of
Conformance” (POC) and the latter, which includes inspection since this is done
when we are not sure we have done it right first time, “Price of Non-
Conformance” (PONC).

In this simplified scenario the categorisation process is easier to achieve and the
“bad costs” (PONCs) can be easily separated from the “good costs” (POCs) and
the interaction more clearly seen.

The choice of system is entirely up to the individual organisation but, again, consistency is the watchword: don’t mix the two systems. The principal advantage of the Crosby approach is its simplicity, whilst the BS 6143 method gives a greater definition to the cost breakdown.

Setting up a quality costing system

The first point to be made about establishing a Quality Costing system is that the Finance department should produce it. Naturally, the Quality department should give guidance on the nature and construction of the system in the initial stages, but the running of the system and the reports issued should clearly belong to the Finance department. The reasoning behind this is two-fold: firstly, they have all the cost centre data and secondly, they are generally seen as impartial in most organisations.

We should note at this point that Quality Costing documentation does not form
part of the normal accounting structure. It can be seen as an alternative view of the
disposal of finance and fulfils the role of a management-reporting document to
target waste in the business; a function not adequately covered by conventional
accounting structures.

The level of detail to which Quality Costs are taken is very much a matter for the
individual organisation. It is frequently argued that the Quality Cost report should
only take the data that currently exists and collate it into a different structure rather
than establish an entirely new system. This may be possible in a system that has a
high level of detail in its present form but may be somewhat problematical if there
is insufficient depth. A Quality Cost system requires a high level of definition so that costs can be attacked appropriately; too little detail may obscure the actions required for maximum benefit. To be weighed against this is the additional effort
and complexity engendered by operating a system of Quality Costing that is all-
embracing. The principal rule to apply is to keep the system as simple as possible
whilst fully satisfying the organisation’s requirements.

The Quality Cost system may be organised company-wide or on a decentralised, departmental basis. As a tool for monitoring performance it is very useful to each departmental manager, and the decentralisation may well be justified from this point of view. However, one important fact to remember is that Quality Costs may show up in departments other than the one where the problem exists (and thus the one capable of resolving it). This can lead to departments taking the “blame-thrower” approach if a negative management culture, as discussed in earlier sections, is present. Much care may, therefore, be necessary to allocate costs to their source rather than to where they become visible. The correct environment for using Quality Costs is one where the emphasis is on continuous improvement.

A pilot study of Quality Costs in a part of the organisation may be a useful starting
point before going on to develop a company-wide system incorporating any
lessons learned in the process.

Very careful examination of what is to count as a Quality Cost and what is not
should be undertaken at the planning and implementation phases. Do we include
overheads for example? The conventional answer is ‘no’ since most of the costs
we are dealing with are themselves overheads. Do we include the costs of lost
customers? Again, the conventional answer is ‘no’ since this would largely be
guesswork. Do we include the redundant material caused by a design change?
Here, the answer is usually ‘yes’ since it was the original error that caused redundant
material to be made. Consistency is the watchword; apply the same logic to each
situation as it arises.

The last paragraph illustrates the problem with Quality Costing: it is somewhat subjective and does not fully include all potential consequential costs (the sort of costs, such as lost customers, which Deming describes as “unknown and unknowable”). It is, however, better than the ignorance in which many organisations currently live and, if you understand its shortfalls, it can still provide a very powerful tool for driving and monitoring continuous improvement.

Determining POCs and PONCs


Firstly, extensive training must be done down to at least foreman level. These are
the people who will be collecting the data and who must, therefore, be able to
distinguish between the categories. The following principles must be applied:

Review all activities in the prescribed area and determine into which of the three
categories below they fall:

• They are done because something has not been done right the first time (PONC).

• They are done to ensure that subsequent activities are done right the first time (POC).

• They are part of the way in which we would normally run the business if everything was done right the first time.

It is important not to forget the third category, as the temptation is otherwise to categorise everything as either a POC or a PONC. In my experience, a major consulting firm made exactly this mistake and ended up presenting a report which indicated that no one in a several-thousand-strong organisation was actually adding value! This was clearly a nonsensical conclusion, and it damaged the initiative to evaluate Quality Costs almost irreparably.

The first two categories should be measured in some sensible way (hours, heads, etc.) in addition to material losses. This information should be passed to the accounts department for financial evaluation and then collated as the Total Cost of Quality.
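As a minimal sketch of this collation step under the Crosby scheme, the Python fragment below classifies a handful of activities and prices them up. The activity names, hours and the hourly rate are assumptions made for illustration; a real system would take the hours from the data collected at foreman level and the rates from the accounts department.

# Minimal sketch of collating POC/PONC data into a Total Cost of Quality.
# Activities, hours and the hourly rate are illustrative assumptions.
HOURLY_RATE = 35.0  # assumed fully loaded labour rate, £/hour

activities = [
    # (description,                   category, hours per month)
    ("reworking rejected assemblies", "PONC",    120),
    ("re-inspection after rework",    "PONC",     40),
    ("supplier development visits",   "POC",      30),
    ("first-off inspection",          "POC",      25),
    ("normal assembly work",          "normal", 1500),  # right first time; not a quality cost
]

poc  = sum(hours for _, category, hours in activities if category == "POC")  * HOURLY_RATE
ponc = sum(hours for _, category, hours in activities if category == "PONC") * HOURLY_RATE

print(f"Price of Conformance (POC):      £{poc:,.0f} per month")
print(f"Price of Non-Conformance (PONC): £{ponc:,.0f} per month")
print(f"Total Cost of Quality:           £{poc + ponc:,.0f} per month")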

The subsequent breakdown of the figures should be calculated to assist the process of improvement in the organisation. This may vary from organisation to organisation but will generally be devolved to departmental level, with the focus on the major PONCs.

A WORD OF WARNING

You may find that, after several months of intensive improvement effort, you reassess your Quality Costs and find to your chagrin that, contrary to the expected reduction, they are constant or have actually increased. This is not a sign of the abject failure of the system, nor of the ineptitude of those operating it; it is merely an indication that you have got better at identifying Quality Costs, so those that slipped through the net the first time have been captured this time. Ensure that you are comparing apples with apples and, if possible, calculate any newly discovered costs back into the original data. This effect may also be due to the feed-through time for increased spending on POCs to reduce PONCs, which will exhibit itself as a change in the relative proportions of these costs.

If neither of the above reasons accounts for the increase you will need to revisit the
system design to see where the process is failing. The most likely place is the
translation of the data into improvement actions.


The Taguchi loss function

Genichi Taguchi is a Japanese engineer who has become one of the most recent additions to the Quality “Hall of Fame”. He has attained pre-eminence due to two significant contributions to the field of Quality: the first is the so-called ‘loss function’, which will be discussed below, and the second is off-line QC, involving a novel approach to the design of experiments, which is not dealt with in this module.

The loss function as defined by Taguchi is basically a challenge to the traditional ideas on what constitutes acceptable quality for manufactured products. The figure below shows the contrast between Taguchi’s loss function and the traditional tolerancing approach to product quality.
[Figure: two panels plotting Cost against the Measured Characteristic. “Traditional Tolerancing” shows a step function: zero cost (“GOOD”) between the lower and upper tolerance limits and full cost (“BAD”) outside them. “Taguchi Loss Function” shows a smooth curve rising with distance from the target.]

Figure 2. Loss function vs tolerancing.

Traditional thinking is that any product that falls inside the tolerance limits is
“good”. The unspoken assumption here is that they are equally good and that no
cost is incurred. Following the logic through we can see that any product falling
outside the limits is bad and a cost equivalent to the full cost of producing that
product is incurred (often referred to as the scrap cost). In this simple scenario we have assumed that reworking the product is either not possible or is uneconomic.
Again, the hidden implication is that all products outside the limits are equally bad.

The usual derivation of tolerances further throws this attitude into doubt. They are
usually based upon what was done last time or the draughtsman’s ‘best guess’.
There also exists an element of barter in the generally adversarial relationship
between design and production with designers wanting to tie production down to
extremely tight tolerances and manufacturing wanting to be able to drive a bus
through them. Seen in this light tolerances can be viewed as, at best, somewhat
arbitrary.

Taguchi states that to regard the transition from good to bad as a step change is not
logical. He contends that, provided the nominal has been specified correctly, any
deviation from this target value will have a detrimental effect on the performance
of the product and will therefore cause an overall “loss to society”. This concept is
probably one of the more esoteric of Taguchi’s ideas. A good example may be to
consider the thickness of a polythene sheet used by farmers to protect crops; if the
sheet thickness is low (but within tolerance) it may tear more easily and allow the
weather to damage the crops. The costs generated by this failure will be outside the
company but very real. Firstly, farmers will incur additional replacement costs;
secondly, the reduced crop yield will increase the price in the marketplace, a loss
borne by all society. Since the loss to society is usually greater than the company’s gain from selling a poorer product, the Japanese have coined the phrase “worse than a thief”: when a thief steals money only his victim is worse off, and society as a whole is no poorer, whereas here society as a whole loses more than the company gains.

In many cases it is easier to think of the “loss to society” in terms of a long-term loss to the company. The reduced performance of the product caused by non-optimal parts will cause relative dissatisfaction in customers who will, given sufficient stimulus, take their trade elsewhere. The further we deviate from optimum performance, the quicker their defection will be.
[Figure: colour density distributions for Sony, Japan and Sony, USA plotted over grade bands C, B, A, A, B, C about the target. Japan: normal distribution, 0.3% defective. USA: uniform distribution, 0% defective.]

Figure 3. Sony TV production.


The above diagram is an illustration of the loss function as a long-term loss to the company, and appeared in the Japanese newspaper The Asahi in 1979. The article discussed the preference of American consumers for television sets built by Sony in Japan over those built at an identical plant in the USA. The key performance characteristic is colour density. The ‘A’ band represents excellent colour density, the ‘B’ band good colour density and the ‘C’ band acceptable colour density. Outside the limits of the ‘C’ bands colour density is deemed unacceptable and the TV is considered a reject.

Clearly from the figure, Sony Japan was producing defects whilst the American plant was not. However, the key fact is that the chance of getting an ‘A’ or ‘B’ grade TV from the Japanese plant was much greater than from the American one, where the odds of getting any particular grade were roughly equal. The “in tolerance is OK” attitude was costing the American plant a lot of sales. The Taguchi belief that variation from the nominal is expensive seems much closer to the truth in this case.
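The figure’s point can be roughly quantified. The sketch below assumes that the A, B and C bands split the tolerance into equal thirds, that the Japanese plant’s output is normally distributed with the tolerance at about plus or minus three standard deviations (consistent with the quoted 0.3% defective), and that the American plant’s output is uniform across the tolerance; these assumptions are mine, made only to illustrate the contrast.

# Rough quantification of Figure 3 under the stated assumptions.
from math import erf, sqrt

def phi(z):
    # Standard normal cumulative distribution function.
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def both_tails(lo, hi):
    # P(lo <= |z| <= hi) for a standard normal.
    return 2.0 * (phi(hi) - phi(lo))

# Grade bands in units of sigma (tolerance = 3 sigma, so each third is 1 sigma wide).
bands = {"A": (0, 1), "B": (1, 2), "C": (2, 3)}

for grade, (lo, hi) in bands.items():
    p_japan = both_tails(lo, hi)   # normal distribution
    p_usa = (hi - lo) / 3.0        # uniform across the tolerance
    print(f"Grade {grade}: Japan {p_japan:5.1%}   USA {p_usa:5.1%}")

print(f"Defective: Japan {1 - both_tails(0, 3):.2%}   USA 0.00%")

On these assumptions, roughly two-thirds of Japanese sets fall in the ‘A’ band against one-third of American sets, which is consistent with the consumer preference the article reported.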

Taguchi states that the loss function takes the quadratic form shown above for all “nominal the best” type characteristics, and the appropriate half of that shape for “bigger the better” and “smaller the better” features. Whether this is in fact strictly the case is debatable. However, the principle that deviation from the target is expensive regardless of tolerances, and that the rate of deterioration increases with distance from the target, is sensible.
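For reference, the forms usually quoted in the Taguchi literature are given below, with y the measured characteristic, m the target and k a cost constant; note that the “bigger the better” case is conventionally written as a reciprocal rather than literally as half of the parabola.

L(y) = k(y - m)^2     (nominal the best)
L(y) = k y^2          (smaller the better, target zero)
L(y) = k / y^2        (bigger the better)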

The principal implications of the loss function for the way we do business are that it gives financial credibility to the argument for continuous improvement of processes beyond the current specification limits, and that it firmly points to the reduction of variation in our processes as the way to increased profitability and success. In addition, if we are able to quantify the constant in the loss function, we can see in financial terms the benefits of reducing the variation in a process and set these against the cost of doing so. This gives a very good basis for evaluating improvement projects, either in absolute terms or against other projects competing for the same funding.

It is, however, difficult to define the loss function accurately for any given process. The general form is shown below, but the constants are difficult to determine and are frequently underestimated due to the unpredictability of consequential losses.

L(y) = k(y - m)^2

Where:

L(y) is the financial loss (in £) when the quality characteristic is y.

m is the target value of y.

k is a constant for the given characteristic. It depends upon the importance of y.
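A short worked illustration follows, with purely hypothetical numbers for the target, the loss constant and the process parameters. It uses the standard result that, for this loss function, the average loss per unit over a production run with mean mu and standard deviation sigma is k(sigma^2 + (mu - m)^2), which is what allows variation reduction to be expressed in money and set against the cost of achieving it.

# Worked illustration of the Taguchi loss function with hypothetical numbers:
# target m = 10.0 mm and loss constant k = 4.0 (£ per mm^2 of deviation).
def loss(y, k=4.0, m=10.0):
    # Loss (£) for a single unit with measured value y.
    return k * (y - m) ** 2

def expected_loss(mu, sigma, k=4.0, m=10.0):
    # Average loss (£) per unit for a process with mean mu and std dev sigma:
    # E[k*(y - m)^2] = k*(sigma^2 + (mu - m)^2).
    return k * (sigma ** 2 + (mu - m) ** 2)

# Two 'in tolerance' items are not equally good:
print(loss(10.0))   # on target  -> £0.00
print(loss(10.3))   # off target -> £0.36

# A process before and after a project that centres it and halves its spread:
before = expected_loss(mu=10.1, sigma=0.20)   # £0.20 per unit
after  = expected_loss(mu=10.0, sigma=0.10)   # £0.04 per unit
print(f"Saving per unit: £{before - after:.2f}")  # £0.16, to be set against the project cost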
