
SPC Made Easier, Simpler, More Statistically Powerful

Keki R. Bhote
In the search for world class quality, there must be synergy between management, customers, suppliers, and employees. Organization, systems, and training of the right kind are essential catalysts. A blueprint of improvement in these general areas is beyond the scope of this article. Instead, it concentrates on easy, simple, cost-effective, and statistically powerful tools that can improve product quality, not by an inconsequential 10 percent, 50 percent, or even 100 percent, but by factors of 10:1 and 100:1 in a short period of time. If quality is the centerpiece of a company's resurgence, these statistical tools are the pièce de résistance within quality.
The Traditional, Ineffective Quality
Tools
It is common knowledge that Japan drew even with the U.S. in quality progress by the mid-1960s and has gone on to command an ever-widening lead. What is less well-known are the quality tools used by the two countries. The U.S. persevered in the traditional tools of inspection and sorting, detection and correction, fire-fighting, exhortation, and the crutch of sampling plans, at least until 1980. These tools consume costs but add little value.
Conventional SPC Tools - Too
Little, Too Late
The Japanese, on the other hand, abandoned these ineffective tools and adopted statistical process control (SPC) in the 1950s, under the coaching of Dr. W. Edwards Deming. They rode the crest of the SPC wave until the mid-1970s, when they realized that SPC in production was too little and too late. Paradoxically, the U.S. rediscovered SPC at the same time the Japanese were discarding it. (SPC was widely used in the U.S. during World War II, but then the baby was thrown out with the bathwater in a revolt against the mysteries of the world of statistics.)
"If Japan Can, Why Can't We"
What re-ushered the SPC age
in America was the airing of the
NBC White Paper-"If Japan Can,
Why Can't We." It gave the American public at-large its first glimpse of
the reasons behind Japan's success: Quality, in general, and SPC
in particular. Deming was rescued
from the U.S. quality wilderness and
elevated to a prophet within his own
country. Major companies scurried
to jump on the SPC bandwagon.
Unfortunately, SPC has become synonymous with control charts. Control charts are complex, costly, and almost useless in their ability to solve quality problems. The most charitable rationale for their existence is as a maintenance tool, to be used only after a process is brought under control with more powerful diagnostic techniques.
Problem Solving - U.S. Style
Along with control charts, the U.S. began to use other less-than-effective problem-solving tools. Brainstorming, quality circles, and Kepner-Tregoe methods have become fashionable. They are good for employee morale, and they are good for small-increment problem solving. However, their hunches often lead to blind alleys, consuming needless cost and time. Frequency distributions and process capability studies indicate the existence of variation but, like control charts, provide few clues as to the origin of the problem. Engineering judgment and varying one cause at a time can sometimes help, but the longevity of chronic problems bears witness to their futility. Lastly, there is sometimes a child-like faith that computers can solve quality problems, but used with ineffective methods they only process meaningless data faster!
SPC Tools for the Japanese Line Workers
The Japanese, as indicated earlier, have abandoned SPC tools for more powerful techniques, with two exceptions. First, they still trot out control charts as show-and-tell for visiting American firemen, so that their advanced methods would not receive much scrutiny! Secondly, they have trained their entire direct labor work force in elementary SPC tools, so that they can tackle small-step quality problems through quality circles, Kaizen (improvement) groups, and employee suggestions.¹
These elementary tools consist of:
P.D.C.A. (the Plan-Do-Check-Action circle), allegedly taught by Deming, but recently claimed as a Japanese innovation
Data collection and analysis

Target
Design of Experiments

Technique
Classical: Fractional factorials, EVOP, etc.
Taguchi: Orthogonal arrays
Shainin: Multi-vari charts, variables search, full factorials

Effectiveness
Classical: Moderate (20 percent to 200 percent improvement); retrogression possible
Taguchi: Low to moderate (20 percent to 100 percent improvement); retrogression likely
Shainin: Extremely powerful (100 percent to 500 percent improvement); no retrogression

Cost
Classical: High; many experiments
Taguchi: High; many experiments
Shainin: Low; few experiments

Complexity
Classical: Moderate; full ANOVA required
Taguchi: High; inner and outer array multiplication, S/N, ANOVA
Shainin: Low; experiments can be understood by line operators

Statistical validity
Classical: Low; higher order interaction effects confounded with main effects and, to a lesser extent, even second order interaction effects confounded
Taguchi: Poor; even second order interaction effects confounded with main effects; no randomization
Shainin: High; every variable tested with all levels of every other variable; excellent separation and quantification of main and interaction effects

Applicability
Classical: Requires hardware; main use in production
Taguchi: Primary use as a substitute for Monte Carlo analysis; S/N concept good
Shainin: Requires hardware; can be used as early as the prototype and engineering run stage

Ease of implementation
Classical: Moderate; engineering and statistical knowledge required
Taguchi: Difficult; engineers not likely to use the technique
Shainin: Easy; even line workers can conduct experiments

Fig. 1. Three approaches to the design of experiments are compared.
Check-sheets, histograms, and frequency distributions to measure variation
Pareto charts to separate the vital few causes of defects from the trivial many
Cause-and-effect diagrams to list all possible causes of a problem and group them by families
CEDAC (Cause-and-Effect Diagrams with the Addition of Cards), a refinement of cause-and-effect diagrams to keep them current and more meaningful to the worker
Control charts.

Fall 1987


These and allied techniques are often grouped together and called the "seven tools of QC" that every Japanese worker learns and uses. The aggregate benefit is very powerful, but unfortunately it is these elementary SPC tools that U.S. professionals are embracing in their unequal drive to catch up with Japan. Meanwhile, Japanese professionals, in engineering, manufacturing, and quality, have graduated to more sophisticated and powerful tools.
Japan's Secret Weapon
The central thrust and secret weapon of Japanese quality is design of experiments (DOE). The objective is to discover key variables in product and process design, drastically reduce the variations they cause, and open up the tolerances on the lesser variables to reduce costs. These experiments are conducted before the start of production. Hundreds of Japanese companies conduct thousands of these designed experiments each year to make designs "more robust." The discipline is a variant of the classical design of experiments fathered by Sir Ronald Fisher of Britain 70 years ago. The principal Japanese exponent is Dr. Genichi Taguchi, who adapted the classical methods into a system called the "Orthogonal Array."
In the U.S., by contrast, DOE
was practically unheard of, except
for a small band of academics and
missionary statisticians. A belated
movement, started in the early
1980s, imported Taguchi's methods
wholesale.
Classical vs. Taguchi vs. Shainin DOE
Fig. 1 lists three approaches to the design of experiments: the classical, the Taguchi, and those taught by Dorian Shainin, less famous than Deming or Juran, but a giant of an authority and a consummate problem solver. The classical tools start with fractional factorials and end with evolutionary optimization (EVOP). Taguchi's methods use orthogonal arrays (inner and outer), analysis of variance, and signal-to-noise ratios. The Shainin experiments start with multi-vari charts, followed by variables search and/or full factorials, and end with B vs. C validation. Fig. 1 compares each of these DOE approaches in terms of effectiveness, cost, complexity, statistical validity, applicability, and ease of implementation. While all three approaches are far superior to conventional SPC techniques, the Shainin tools run circles around the other two in almost every characteristic. They can help America leapfrog the Japanese in their own game of concentration on design quality over production quality. The tragedy is that most U.S. companies are unaware of their existence, much less users of them!
Variation Is Evil
In Just-In-Time (JIT) practices, inventory is evil. In quality, variation is evil. There are two reasons. First, getting a product within broad specification limits does not result in 100 percent customer satisfaction. Customers want uniformity, consistency. Any departure from a desired design center or target value results in customer dissatisfaction. The cost of such dissatisfaction goes up exponentially as a product parameter moves away from design center to one end or the other of the specification limit. Secondly, and more importantly, the only way to reduce and eventually eliminate all scrap, rework, analyzing, etc., is to get a product parameter at, or very close to, its design center. This is the only way to get to zero defects and 100 percent yields. Once this is done, production becomes a "breeze" without the necessity of inspections that add no value!
Measures of Variation: Cp and Cpk
Before variation can be reduced, it must be measured. Two yardsticks, Cp and Cpk, have become standard terminology in recent years. Process capability, Cp, is defined as specification width divided by process width. It is a measure of spread. Fig. 2 depicts six frequency distributions comparing the specification width (40 - 20 = 20) to the process width. Process A in Fig. 2 has a Cp of 0.8. This is a process out of control, with reject tails at both ends. This used to be the norm for U.S. processes before the 1980 SPC age. Process B, with a Cp of 1.0, is barely in control. Any slight change will cause rejects. At least 50 percent of U.S. processes have not advanced beyond a Cp of 1.0. Process C has a Cp of 1.33 and has a margin of safety between the tighter process limits and the specification limits. The Japanese used this as a standard in the early 1980s. Process D, with a Cp of 1.66, is better, while the more progressive companies in the U.S. have established Process E, with a Cp of 2.0 (that is, the process width is only half the specification width), as a near-term goal. Process F, with a Cp of 8.0, is not only much better, it is attainable. In fact, there is no limit to a higher and higher Cp, even 10, 15, or 20, as long as no recurring costs are added to the product or process and only the cost of design of experiments, an investment rather than a cost, is incurred.
A better measure of variability than Cp is Cpk. It reduces Cp values if the process is not centered. The penalty is assessed by a K (for correction) factor. The formulas are:

Cp = specification width (S) / process width (P)

K = |design center (D) - process average (x̄)| / (S/2)

Cpk = (1 - K) x Cp

(K is taken as an absolute value, without regard to its sign. The design center, D, need not be at the mid-point of the specification width.)
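As a worked illustration, the formulas above can be put into a few lines of Python. The numbers are the two non-centered processes of Fig. 3 (specification 10 to 20, design center 15); the process widths of 4 and 6 are back-figured from the stated Cp values, so treat this as a sketch rather than the article's own calculation:

```python
def cp(spec_width, process_width):
    # Cp = specification width (S) / process width (P): a measure of spread only
    return spec_width / process_width

def cpk(spec_lo, spec_hi, design_center, process_avg, process_width):
    # K penalizes a non-centered process; Cpk = (1 - K) * Cp
    s = spec_hi - spec_lo
    k = abs(design_center - process_avg) / (s / 2)
    return (1 - k) * cp(s, process_width)

# Process A of Fig. 3: narrow spread (width 4, so Cp = 2.5) but average 12
print(round(cpk(10, 20, 15, 12, 4), 2))   # 1.0

# Process B of Fig. 3: wider spread (width 6, so Cp = 1.67) but average 14
print(round(cpk(10, 20, 15, 14, 6), 2))   # 1.33
```

Note how the wider but better-centered Process B earns the higher Cpk — exactly the point the comparison makes.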
When the process average, x̄, and the design center, D, coincide, K is reduced to zero, so that Cp and Cpk are equal. Fig. 3 shows two examples of non-centered processes. Process A has a narrow spread, with a respectable Cp of 2.5. But because it is skewed toward one end of the specification limits, its Cpk is reduced to 1.0. By contrast, Process B has a wider spread and a poorer Cp of 1.67. Yet it is closer to the design center and, therefore, has a higher Cpk of 1.33. The objective, again, is to aim for high Cpk's of 5, 10, 15, and 20 without adding recurring costs.
Measuring Variation - The Easy Way
Traditional process capability studies require 50 units to be measured on the process, from which the average, x̄, and the standard deviation, s, are calculated. The process limits are then conventionally defined as ±3s. (Sometimes, where production is limited or measurements are difficult, 30 units can suffice as a statistical minimum.) From these measurements, Cp and Cpk can be calculated.
However, there is a statistically sound short-cut to process capability studies. The rule is to measure five units in a row taken from the process. If all five units fall within the middle half of the specification limits, it can be proved, using the binomial theorem, that a Cpk of 2.00 and over exists. If even one of the five units falls outside the middle half of the specification limits, the process has excess variation, which must be reduced using engineering judgment or, preferably, the design of experiments.
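The five-unit rule can be illustrated (not proved) with a quick Monte Carlo sketch in Python. The spec limits of 20 and 40 echo Fig. 2, and both processes are assumed normal and centered; the pass rates are illustrative only:

```python
import random

def five_in_middle_half(mean, sigma, lo=20.0, hi=40.0, n=5):
    # Rule: all five consecutive units must fall inside the middle half
    # of the specification width (here, 25 to 35)
    quarter = (hi - lo) / 4
    return all(lo + quarter <= random.gauss(mean, sigma) <= hi - quarter
               for _ in range(n))

random.seed(1)
trials = 20_000
# Marginal process: Cp = Cpk = 1.0 (six-sigma spread equals the spec width)
marginal = sum(five_in_middle_half(30, 20 / 6) for _ in range(trials)) / trials
# Capable process: Cp = Cpk = 2.0 (spread is half the spec width)
capable = sum(five_in_middle_half(30, 20 / 12) for _ in range(trials)) / trials
print(marginal, capable)  # the capable process passes far more often
```

Roughly half the five-unit samples from the marginal process stray outside the middle half, while the Cpk = 2.0 process almost always stays inside; the rule thus screens out processes with excess variation.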

Control Charts - Putting the Cart Before the Horse
The systematic reduction of variance should begin at the prototype, or engineering run, stage of product
Fig. 2. Six frequency distributions comparing the specification width (40 - 20 = 20) to the process width. Process widths of 25, 20, 15, 12, 10, and 2.5 give Cp values of 0.8, 1.0, 1.33, 1.66, 2.00, and 8.00, respectively.

and process design. This is classical prevention of defects over correction later on at much greater expense. Nevertheless, since over 95 percent of important U.S. product and process parameters have Cp's well


below 2.0, the techniques of variation reduction apply equally well to current production.
The knee-jerk U.S. reaction to variation reduction and problem solving is to launch a program of control charts. At best, control charts will indicate that variation exists. But they cannot accurately point to the cause of variation. Control chart trend analyses provide clues comparatively as faint as the light from stars beyond the Milky Way!
Variation Reduction - A Proven Roadmap
Fig. 4 is a time-tested roadmap to variation reduction. It is largely an accumulation of the DOE tools taught by Dorian Shainin. His philosophy: "Talk to the parts, they are smarter than the engineers!" What he means is that parts contain much information on the causes of variation that can be unlocked with appropriate statistically designed experiments. The analogy of a detective story is appropriate in this diagnostic journey. Clues can be gathered, each progressively more positive, until the culprit cause or variable, the Red X in the Shainin lexicon, is captured, reduced, and controlled. The second most important cause is called the Pink X, the third the Pale Pink X.

Fig. 3. Examples of non-centered processes. Process A (specification 10 to 20, design center 15): despite a narrow distribution and a Cp of 2.5, its average of 12 is far from the design center, so its Cpk is only 1.0. Process B: a wider distribution (Cp = 1.67) but an average of 14, closer to the design center, gives an acceptable Cpk of 1.33.

Fig. 4. DOE tools taught by Dorian Shainin are reflected in this roadmap. Prioritization of customer requirements (multi-attribute competitive analysis, market research, direct customer inputs) feeds three paths: Path 1, Monte Carlo simulation, if mathematical relationships/formulas between input variables are known; Path 2, paired comparisons and components search at the prototype stage, if only two to six units are available; Path 3, multi-vari charts at the engineering run, pilot run, or full production stage, with 30 or more units available. All paths lead through variables search and full factorials to B vs. C validation, Positrol, process certification, and control charts/pre-control.

Specifications - Arbitrary, Overbearing, Wrong
The diagnostic journey begins with a challenge to engineering specifications that non-engineers assume are God-given. There are several reasons for poor specifications:
Customers are not consulted - the difference between marketing and selling
The importance of a product feature to a customer versus the cost of that feature is not value researched
The engineer's ego to create a "state of the art" design
The engineer's reliance on previous designs, boilerplate requirements, or suppliers' published specifications
Reliability is seldom a specification
In translating product specifications into component specifications, a reliance on formulas or guesses, rather than design of experiments.
It is this author's experience that a good percentage of both quality problems and design cycle time can be reduced if quality techniques are used to determine specifications. Some of these include: Value Research, Quality Function Deployment (a Japanese tool to translate the "voice of the customer" into hard specifications), Multi-Attribute Analysis and, of course, DOE tools.
Computer Techniques
Path 1 in Fig. 4 uses Monte Carlo simulation and other computer methods to determine the effect of independent variables upon a given output. However, this is only useful if the formula governing the respective variables is known and can be programmed. More often than not, these mathematical relationships are not known, especially when second order and higher order relationships between the independent variables
can cause totally unexpected results. This is also one of the main statistical weaknesses of the Taguchi methods, since the orthogonal array uses fractional factorials, which confound main and interaction effects.

Fig. 5. Cyclical variations in a multi-vari chart (rotor shaft diameter, left and right sides, maximum and minimum, plotted from 8 am to 12 noon).
Multi-Vari Charts - A Powerful First Clue to the Red X
There are two prerequisites to a multi-vari chart, the starting point of the Shainin tools: (1) a Pareto chart, to assure that the most frequent defect, with the highest pay-off, is being investigated, and (2) a cause-and-effect diagram to identify all the suspect causes of the problem.
However, a cause-and-effect diagram will only indicate the numerous variables or causes, without any clue of where to start or what interactions may exist between those variables. It may also miss the boat altogether. The real cause may not even be on the list of causes drawn on the diagram.

The purpose of a multi-vari chart is to reduce the very large number of variables (up to 100) to a few prime suspects that can then be investigated using variables search, if five or more variables remain, or full factorials, if four or fewer remain. A multi-vari chart is a stratified experiment, where the objective is to determine whether the major variation pattern is positional, cyclical, or temporal. If the greatest variation is temporal, there is less need to look at the other two types of variation. Examples in each pattern of variation are:
Positional
Variation within a single unit (such as porosity in a metal casting) or across a single unit with many parts (such as a P.C. board with many components)
Variations by location in a batch loading process (such as cavity-to-cavity variations in a mold press)
Machine-to-machine, operator-to-operator, or plant-to-plant variation.
Cyclical
Variation among consecutive units drawn from a process
Variation among groups of units
Batch-to-batch variations
Lot-to-lot variations.
Temporal
Hour-to-hour, shift-to-shift, day-to-day, week-to-week, or month-to-month variations.
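The stratification into families can be sketched in Python. The measurements below are invented for illustration (they mimic the rotor shaft pattern described next, where temporal variation dominates); the data layout and variable names are assumptions, not the article's:

```python
# Hypothetical diameters: data[hour] -> list of units, each unit = [left, right]
data = {
    8:  [[0.2502, 0.2496], [0.2501, 0.2497]],
    9:  [[0.2498, 0.2493], [0.2499, 0.2492]],
    10: [[0.2492, 0.2486], [0.2491, 0.2487]],
}

def spread(values):
    return max(values) - min(values)

# Positional: the largest spread within any single unit
within_unit = max(spread(unit) for units in data.values() for unit in units)

# Cyclical: the largest spread among unit averages inside one time period
unit_means = {h: [sum(u) / len(u) for u in units] for h, units in data.items()}
unit_to_unit = max(spread(means) for means in unit_means.values())

# Temporal: the spread among the hourly averages
hour_means = [sum(m) / len(m) for m in unit_means.values()]
time_to_time = spread(hour_means)

print(within_unit, unit_to_unit, time_to_time)
```

In this made-up data, as in the case study that follows, the temporal family dominates, which is where a diagnostician would look first.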
Multi-Vari Case Study: Rotor Shaft
A manufacturer, producing cylindrical rotor shafts with a diameter requirement of 0.250 in. ± 0.001 in., was experiencing excessive scrap. A process capability study indicated a spread of 0.0025 in. against the requirement of 0.002 in., that is, a Cpk of 0.8. The foreman was ready to junk the old turret lathe and buy a new one, for $70,000, that could hold a tolerance of 0.0008 in., that is, a Cpk of 1.25. The plant manager directed that a multi-vari study be conducted before the purchase of the new lathe, even though the payback period for the new lathe would be only nine months.
Fig. 5 shows the result of a multi-vari chart. The positional (within each shaft) variations describe taper changes, from the left side of the shaft to the right, and out-of-round conditions (maximum diameter and minimum diameter) on each side of the shaft. The cyclical variations, from one shaft to the next, are shown by the thin connecting lines between each shaft. The temporal variations, from one time period to the next, are also shown. Analysis of each variation indicated:

Time-to-time variation: 60 percent of the allowed tolerance
Unit-to-unit variation: 5 percent of the allowed tolerance
Within-unit variation:
(a) taper: 15 percent of the allowed tolerance
(b) out-of-round: 30 percent of the allowed tolerance
Total variation: 110 percent of the allowed tolerance.
One of the rules of a multi-vari chart is to continue the plot until at least 80 percent of the out-of-control variation is captured. In this case, by 12 noon, 110 percent of the allowed variation and 88 percent of the actual variation had been accounted for.
In analyzing these three stratifications of variation, the time-to-time variation is the largest, with the greatest change occurring between 10 am and 11 am. This provided a strong possible clue: a coffee break, with a cooling down of the lathe. The next reading at 11 am almost repeated the 8 am reading at the start of the day. This led to temperature as a potential Red X. The foreman discovered, to his embarrassment, that the amount of coolant in the lathe tank was low. When coolant was added to the prescribed level, the time-to-time variation was reduced to an inconsequential figure.
The unit-to-unit variation, at 5 percent of the total allowed, was not worth investigating. However, the within-unit positional variation showed a total variation of 45 percent (15 percent + 30 percent) of the allowed tolerance. The variations in taper indicated a non-random pattern, with the left side always higher than the right side. This led to the conclusion that the cutting tool was not parallel to the axis of the rotor shaft. A slight adjustment in the setting reduced taper to almost zero.
Finally, the out-of-round condition was traced to a worn bearing guiding the chuck axis. New bearings were installed for a total cost of $200, including labor.
In summary, the total variation was reduced from 0.0025 in. down to 0.0004 in., for a Cpk of 0.002/0.0004 = 5.0! The benefits: zero scrap and a cost avoidance, in retaining the old machine, of almost $70,000.
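The case study's arithmetic can be checked in a few lines of Python. The component values in inches are back-figured from the percentages quoted above (0.002 in. total tolerance), so this is a restatement of the article's numbers, not new data:

```python
# Allowed tolerance on the shaft diameter: +/-0.001 in. -> 0.002 in. total
tolerance = 0.002

# Variation components from the multi-vari study, in inches
# (back-figured from the quoted percentages of the allowed tolerance)
components = {
    "time-to-time (coolant level)": 0.0012,   # 60 percent
    "unit-to-unit":                 0.0001,   # 5 percent
    "within-unit taper":            0.0003,   # 15 percent
    "within-unit out-of-round":     0.0006,   # 30 percent
}

for name, v in components.items():
    print(f"{name}: {v / tolerance:.0%} of tolerance")

# After the fixes, total variation fell from 0.0025 in. to 0.0004 in.
print("Cpk =", round(tolerance / 0.0004, 2))  # 5.0
```

The components sum to 110 percent of the tolerance, matching the multi-vari rule of plotting until at least 80 percent of the out-of-control variation is captured.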
The multi-vari chart is an example of the power of the Shainin tools. They are simple (no complex mathematical or statistical formulas). They are quick (in this example, a snapshot of the process in just four hours). They are cost effective (in this case, a cost of $200 for a savings of $70,000, along with zero scrap). Finally, they are powerful in getting at the root cause of the problem (here, the coolant, bearings, and cutting-tool setting are the Red X, Pink X, and Pale Pink X, respectively).
Other DOE Tools
The multi-vari chart reduces a large number (even up to 100) of suspect variables to a narrow family of variables, 20 or fewer. Other DOE tools such as Components Search and Paired Comparisons can also be used to reduce hundreds of suspect causes down to a few. Variables Search is the Rolls-Royce of DOE techniques that can slug it out with the Taguchi methodology and win every time! It is used when there are five or more suspect variables, following a multi-vari "homing in" investigation. The full factorial is used when there are four or fewer suspect variables left to investigate. Both tools will determine the contribution of each factor, or cause, to the total variation, as well as clearly separate the main effects from the interaction effects, a key requirement that fractional factorials or Taguchi's orthogonal arrays, with their "saturated designs," cannot achieve.
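A tiny Python sketch shows what "separating main effects from interaction effects" means in the smallest full factorial, a 2x2. The response values are hypothetical, purely to illustrate the contrast arithmetic:

```python
# Hypothetical 2x2 full factorial: factors A and B at low (-1) and high (+1)
# response[(a, b)] = output observed at that combination of settings
response = {(-1, -1): 10, (+1, -1): 14, (-1, +1): 11, (+1, +1): 25}

def effect(sign):
    # Contrast: responses weighted by +/-1, divided by the two runs per level
    return sum(sign(a, b) * y for (a, b), y in response.items()) / 2

main_a = effect(lambda a, b: a)           # average change from raising A
main_b = effect(lambda a, b: b)           # average change from raising B
interaction = effect(lambda a, b: a * b)  # extra effect of A that depends on B

print(main_a, main_b, interaction)  # 9.0 6.0 5.0
```

Because every combination of levels is run, the A x B contrast column (+1, -1, -1, +1) is distinct from the A and B columns, so the interaction is estimated cleanly; in a saturated fractional design some of those columns coincide, which is the confounding the article criticizes.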
SPC Tools: The Tail that's been Wagging the DOE Dog
It is only when the DOE diagnostic tools have pinpointed the major variables and drastically reduced their variability, through redesign, manufacturing process control, or supplier process control, that SPC can begin its mission: to assure that variation, once reduced through DOE, is maintained at that reduced level.
Two Pre-SPC Tools: Positrol and Process Certification
Even within this limited scope of SPC, control charts, or the preferred technique of pre-control, are not the starting point. The process producing the product must be directly controlled. One of the weaknesses of American industry is that we attempt to control a process by examining the product it produces. That is too late. Key process parameters must first be separated from less important process parameters using the same DOE tools. Next,
these key process parameters must be monitored through a discipline called Positrol. It determines, for each key process parameter, who, where, how, and when it is measured. A Positrol log is then kept of such measurements, which can be readily monitored for conformance by an auditing activity.

The Simple Mechanics of Pre-Control
1. Draw two pre-control (P-C) lines in the middle half of the specification width. The area between the P-C lines is the green zone; the areas between the P-C lines and the specification limits are the yellow zones; the areas beyond the specification limits are the red zones.
2. To determine process capability, five units in a row must fall within the P-C lines (green zone). If not, use the diagnostic tools to reduce variation.
3. In production, periodically sample two units consecutively:
Two units in the green zone: continue
One unit in green and one in yellow: continue
Two units in yellow: stop
One unit in red: stop
4. Frequency of sampling: divide the time interval between two stoppages by six.

Fig. 6. Application of pre-control with four rules.
Another important pre-SPC tool is Process Certification. Its foundation is Murphy's Law: "If anything can go wrong, it will!" At any work station, a number of peripheral issues can make or break quality, besides the major areas of design, process, and materials. These include: operator goals, instruction, training; environmental factors such as temperature, humidity, dust, gas control, water cleanliness, electrostatic discharge; equipment and gage calibration; work layout, etc. These issues have to be listed at each work station, generally by the process engineer. An interdisciplinary team then examines each item on the checklist to assure that the necessary disciplines are in place to prevent even the random occurrence of poor quality. At that point, the work station is process-certified. To guard against retrogression, the work stations should be periodically re-certified.

The Tyrannical Use of Control Charts
It has now been shown that control charts are the last step in variation reduction and control, not the first. But even here, control charts can be substituted with an easier, simpler, and more cost effective technique called pre-control. When developed by Dr. Walter Shewhart 60 years ago, control charts were useful. So was the Model T. But both have outlived their glamour and value today. Unfortunately, a number of OEM customers increasingly demand the use of control charts as a passport to their business. They force control charts down the throats of unknowing and unwilling suppliers, and they bludgeon into submission those knowledgeable suppliers who dare to point out that the control chart emperor wears no clothes.

Pre-Control - The Elegance of Simplicity
Since control charts are widely known, they will not be explained in this article. On the other hand, pre-control is only now coming into the consciousness of quality practitioners and needs an explanation of at least its mechanics. Fig. 6 shows the application of pre-control with four simple rules to: (1) establish pre-control lines; (2) determine process capability at the start of production; (3) monitor production on an on-going basis; and (4) determine the frequency of sampling.
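The four rules reduce to a decision function small enough to show in full. This Python sketch uses arbitrary specification limits of 1.0 and 2.0 for illustration; the zone names follow Fig. 6:

```python
def zone(x, lsl=1.0, usl=2.0):
    # Pre-control lines sit at the quarter points of the spec width,
    # so the green zone is the middle half of the specification (rule 1)
    quarter = (usl - lsl) / 4
    if lsl + quarter <= x <= usl - quarter:
        return "green"
    if lsl <= x <= usl:
        return "yellow"
    return "red"

def decide(unit1, unit2, lsl=1.0, usl=2.0):
    # Rule 3: sample two consecutive units and act on their zones
    zones = {zone(unit1, lsl, usl), zone(unit2, lsl, usl)}
    if "red" in zones or zones == {"yellow"}:
        return "stop"
    return "continue"

print(decide(1.4, 1.6))  # two greens -> continue
print(decide(1.1, 1.5))  # one yellow, one green -> continue
print(decide(1.1, 1.9))  # two yellows -> stop
print(decide(2.3, 1.5))  # one red -> stop
```

Rule 4 (sample at one-sixth of the interval between two stoppages) governs when `decide` is called, not what it returns.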
The simplicity of pre-control is now obvious. Its mechanics can be taught to managers and blue-collar workers alike in five minutes. Even the least sophisticated line or machine operator can use it and, more important, make quick adjustments to assure defect-free production on thousands upon thousands of units. While there is no need for charting in pre-control, plots of the two-unit samples versus time can be maintained as a record. From such plots, frequency distributions or Cp's can be derived easily. Pre-control can also be used for tightened specifications (relative to broader customer specifications); for one-sided specifications; or for attributes (by weighing

The Advantages of Pre-Control Over Control Charts

1. Simplicity
Control charts: Complex; control limits must be calculated
Pre-control: Simple; the pre-control lines are the middle half of the specification width

2. Use by operators
Control charts: Difficult; charting mandatory, interpretation unclear
Pre-control: Easy; green and yellow zones, a practical approach for all workers

3. Mathematical complexity
Control charts: Involved; x̄, R, control limits, and process limits must be calculated
Pre-control: Elementary; must only know how to divide by four

4. Small production runs
Control charts: Useless for production runs below 500 units; sampling of 80-150 units before even trial limits can be established
Pre-control: Can be used for production runs above 20 units; pre-control lines pre-determined by specs (which can be narrowed)

5. Re-calibration of control limits
Control charts: Frequent; no such thing in industry as a constant cause system
Pre-control: None needed, unless spec "goal posts" are moved inward

6. Machine adjustments
Control charts: Time consuming; any adjustment requires another trial run of 80-150 units
Pre-control: Instant; based on two units

7. Frequency of sampling
Control charts: Vague, arbitrary
Pre-control: Simple rule: six samplings between two stoppages/adjustments

8. Discriminating power
Control charts: Weak; the α risk of rejection by the chart, when there are no rejects, is high; the β risk of acceptance by the chart (in control), when there are rejects, is high; little relationship to specs
Pre-control: Excellent; the α risk of rejection by pre-control is low, less than 2 percent under worst conditions, 0 with a Cpk of 1.66; the β risk is less than 1.36 percent under worst conditions, 0 percent with a Cpk of 1.66

9. Attribute charts
Control charts: p and c charts do not distinguish between defect mode types or importance
Pre-control: Attribute charts can be converted to pre-control charts by weighting defect modes on an arbitrary rating scale

10. Economy
Control charts: Expensive; calculations, paperwork, larger samples, more frequent sampling, long trial runs
Pre-control: Inexpensive; calculations simple, minimal paperwork, small samples, infrequent sampling if quality is good, process capability determined by just five units

Fig. 7. Control charts and pre-control compared.

attribute defects in terms of importance) and by converting them to a variables scale, say from 1 to 10.
Fig. 7 compares the many weaknesses of control charts and the strengths of pre-control.

The March to World Class Competitiveness
In conclusion, a company that seeks world class status must place quality at center-stage as a superordinate, sacred value. Within quality, the most important task is to reduce variation drastically. This is best done with the design of experiments. SPC can then be used for "maintenance" purposes, with pre-control a far more effective tool than control charts. The prescription for restoring American industry to world class is straightforward. Without it, we cannot succeed. With it, we will not fail!
¹The number of suggestions turned in by Japanese workers is legendary. Whereas the average number of suggestions per employee per year in the U.S. is 0.1, the figure for Japan is 10. More important, over 80 percent of these suggestions are approved by Japanese management. The quality circles experiment with their own ideas, try pilot runs, and submit the suggestions to management when they are sure of success. Management approval then becomes almost automatic.

²For those interested in the theory of pre-control, there is a considerable body of literature available in A.S.Q.C. annual conference transactions and in trade journals. It can be shown that the α (alpha), or producer's, risk cannot exceed 2 percent and the β (beta), or consumer's, risk generally will not exceed 1.36 percent. However, if the process width is only half of the specification width and the process is centered, a Cpk of 2.0, the α and β risks are reduced to zero! Even with a Cpk of 1.33, these risks are less than 10 parts per million.

Author:
Keki R. Bhote is senior corporate consultant, quality and productivity improvement, Motorola, Inc., Schaumburg, IL. He is the author of Supply Management, a management briefing published by the American Management Association Membership Publication Division, 1987.
