
Statistical Quality Control and Problem Solving

(Case Study: Arwa Mineral Water Co.)

Chapter 3
Contents
3.1 Introduction
3.2 Statistical techniques in QC
3.3 Steps of the quality control process
3.4 Recycling (QC in production level)
3.5 Practical limitations
3.6 Numerical analysis
3.7 Normal Process Capability (Sixpack)
3.8 Binomial Capability Analysis
3.9 Between/Within Capability Analysis
3.10 Problem Solving Techniques
3.1 Introduction
Quality control is an integral part of mineral water production, and an efficient quality system is most critical in mass production. In this study, the quality control system implemented in Arwa Mineral Water Co. is discussed. The processes in the Arwa plant are discussed with respect to the quality control procedures, in parallel with the quality inspection plan. The standard tests involved and the issues pertaining to the productivity and quality of mineral water are reviewed.

3.2 Statistical techniques in QC

Statistical process control (SPC) is one method of statistical quality control, and its main content is control charts (for variables and for attributes). Fig. 3.1 shows some of the statistical tools used in quality control. In this report we use several of them to check the quality of Arwa products.

[Figure: classification of statistical quality control tools into tools for analysis (Pareto diagram, histograms, scatter diagrams, statistical control charts), tools for organization and checking (check sheet, flow/process diagram), and tools for determining and analyzing problems and solutions (fishbone diagram).]

Fig.3.1 QC Tools

3.3 Steps of the quality control process


The quality control process starts from the design of the product and passes through the production procedures to the customer. These levels are listed below:
1) Estimate the quality control characteristics (design level).
At this level we determine the types of characteristics and the most important of them; this can be done with statistical methods such as the Pareto diagram or the flow diagram.
2) Measure the characteristics of the product (production level).
This is done by taking samples from the production line and applying the required measurements.
3) Evaluate and analyze the variation between the resulting values and decide on the solution (evaluation and solution level).
This is done with quality control charts, to estimate the causes.
The production level is discussed in detail in the next sections.
3.4 Recycling (QC in production level)

This is the most important level of the QC process in mineral water production, because the water is the heart of the product. We therefore concentrate on the QC of the water and on the important measured characteristics of the product. First we explain the stages of production, starting from the elicitation of water from the well up to stockpiling of the product.

As shown in Fig. 3.2, the water passes through 5 stages: source and test, filtering, water bottling, labeling, and storage. In the first stage the engineer tests the source of the water, using tools and rules to check the health of the water against standard specifications. The second stage is filtering. All of these stages are completed under QC conditions.

[Fig.3.2 Product cycle: source and test, filtering, water bottling, labeling, storage.]

Arwa Mineral Water Co. contains 3 units making the same product; the reason is to increase the production output. A unit can be summarized as shown in Fig. 3.3 below.

Fig.3.3 Unit in Arwa Mineral Water Co.

3.5 Practical limitations


Mineral water industries have many practical limitations. Drinking water must contain specific amounts of mineral salts. These amounts are given by global specifications and measurements; the specification is listed in Table 3.1. The industry cannot change the required amounts of these salts, but some of them can be added or reduced. For example, we reduce the iron in the water to increase the shelf life of the water. We select the most important specifications using statistical quality control methods such as the Pareto diagram.

Table 3.1

3.6 Numerical analysis


First we measure the quality of the water by measuring the amount of each salt in the water and comparing it with the specification. The most important specifications of mineral water are: pH, Ca, Mg, K, Fe, Na, F, Cl, NO3, CO3.
1) The first characteristic we study is pH. pH is a very important property. The data for this property are summarized in Table 3.2 as shown below:
Interpret the results
The Xbar and R charts indicate that the process is stable, with no points beyond the control limits. The last 5 subgroups plot indicates that the data are not randomly and not symmetrically distributed around the process mean. The normal probability plot indicates that the data are not normally distributed. Therefore, the assumptions for normal capability analysis are not satisfied and the capability of the process cannot be analyzed.

Table 3.2
The normal probability plot and control charts for this property were produced with the Minitab program, as shown in Fig. 3.4 and Fig. 3.5 respectively.
Fig.3.4 Normal probability plot Fig.3.5 Xbar-R Chart
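
Minitab draws these charts directly, but the calculations behind them are simple enough to sketch. Below is a minimal Python example, using made-up pH subgroups in place of the Table 3.2 data; the constants A2, D3, and D4 are the standard SPC control-chart factors for subgroups of size 4.

    import numpy as np
    from scipy import stats

    # Hypothetical pH readings: 5 subgroups of size 4 (the real values are in Table 3.2).
    subgroups = np.array([
        [7.1, 7.3, 7.2, 7.0],
        [7.2, 7.4, 7.1, 7.3],
        [7.0, 7.2, 7.3, 7.1],
        [7.3, 7.1, 7.2, 7.4],
        [7.2, 7.0, 7.1, 7.2],
    ])
    xbar = subgroups.mean(axis=1)                        # subgroup means
    r = subgroups.max(axis=1) - subgroups.min(axis=1)    # subgroup ranges

    # Control-chart factors for subgroup size n = 4 (from standard SPC tables).
    A2, D3, D4 = 0.729, 0.0, 2.282
    xbar_bar, r_bar = xbar.mean(), r.mean()
    print("Xbar chart LCL/CL/UCL:", xbar_bar - A2 * r_bar, xbar_bar, xbar_bar + A2 * r_bar)
    print("R chart    LCL/CL/UCL:", D3 * r_bar, r_bar, D4 * r_bar)

    # Numerical counterpart of the normal probability plot: Anderson-Darling test.
    print("Anderson-Darling statistic:", stats.anderson(subgroups.ravel(), dist='norm').statistic)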

2) The second characteristic we study is Ca. The data for this property are summarized in Table 3.3 as shown below.
Interpret the results
The Xbar and R charts indicate that the process is not stable, with one point beyond the control limits. The last 5 subgroups plot indicates that the data are not randomly and not symmetrically distributed around the process mean. The normal probability plot indicates that the data are normally distributed. Therefore, the assumptions for normal capability analysis are satisfied and the capability of the process can be analyzed.

Table 3.3
The normal probability plot and control charts for this property were produced with the Minitab program, as shown in Fig. 3.6 and Fig. 3.7 respectively.

Fig.3.6 Normal probability plot


Fig.3.7 Xbar-R Chart
3) The third characteristic we study is TDS. The data for this property are summarized in Table 3.4 as shown below.
Interpret the results
The Xbar and R charts indicate that the process is stable, with no points beyond the control limits. The last 5 subgroups plot indicates that the data are not randomly and not symmetrically distributed around the process mean. The normal probability plot indicates that the data are not normally distributed. Therefore, the assumptions for normal capability analysis are not satisfied and the capability of the process cannot be analyzed.

Table 3.4

The normal probability plot and control charts for this property were produced with the Minitab program, as shown in Fig. 3.8 and Fig. 3.9 respectively.

Fig.3.8 Normal probability plot Fig.3.9 Xbar-R Chart


3.7 Normal Process Capability (Sixpack)
Use Normal Capability Sixpack to assess the assumptions for normal capability analysis and to evaluate only the
major indices of process capability. Using this analysis, you can do the following:
1. Determine whether the process is stable and in control
2. Determine whether the data follow a normal distribution
3. Estimate overall capability (Pp, Ppk) and potential capability (Cp, Cpk)
To perform the analysis, you must specify a lower or upper specification limit (or both) to define your process
requirements. The analysis evaluates the spread of the process data in relation to the specification limits. When a
process is capable, the process spread is smaller than the specification spread. The analysis can also indicate
whether your process is centered and on target. In addition, the analysis estimates the proportion of product that
does not meet specifications.
For example, a quality analyst wants to evaluate the capability of a bolt manufacturing process. To satisfy customer
requirements, the thread length of the bolts should be within 0.5 mm of the target of 20 mm. The analyst uses
normal capability sixpack to quickly assess main capability indices and evaluate the assumptions for normal
capability analysis.
Where to find this analysis in Minitab
To perform normal capability sixpack, choose Stat > Quality Tools > Capability Sixpack > Normal.
When to use an alternate analysis
1. If you want to evaluate a more complete set of capability measures, in addition to the major indices, use
Normal Capability Analysis.
2. If you want to evaluate the assumptions for using nonnormal capability analysis, use Nonnormal Capability
Sixpack.
3. If your process produces large variation between subgroups, such as a batch process, use Between/Within
Capability Sixpack to assess the assumptions for a between/within capability analysis.

Data considerations for Normal Capability Sixpack


To ensure that your results are valid, consider the following guidelines when you collect data, perform the analysis,
and interpret your results. The data should be continuous. Continuous data are measurements that may potentially
take on any numeric value within a range of values along a continuous scale, including fractional or decimal
values. Common examples include measurements such as length, weight, and temperature.
If you have attribute data, such as counts of defectives or defects, use Binomial Capability Analysis or Poisson
Capability Analysis.
Collect enough data to obtain reliable estimates of process capability
Try to collect at least 100 total data points (subgroup size*number of subgroups), such as 25 subgroups of size 4, or
35 subgroups of size 3. If you do not collect a sufficient amount of data over a long enough period of time, the data
may not accurately represent different sources of process variation and the estimates may not indicate the true
capability of your process.
Data should be collected in rational subgroups, if possible. A rational subgroup is a small sample of similar items that are produced over a short period of time and that are representative of the process. Observations for each subgroup should be collected under the same inputs and conditions, such as personnel, environment, or equipment.
If you do not collect rational subgroups, the variation in the subgroups may reflect special causes rather than the
natural, inherent variation of the process.
The process must be stable and in control. If the current process is not stable, then the capability indices cannot be
reliably used to assess the future, ongoing capability of the process. Use the control charts in the capability sixpack
output to determine whether the process is stable and in control.
The data should follow a normal distribution. The process capability estimates for this analysis are based on the
normal distribution. If the data are not normally distributed, the capability estimates will not be accurate for your
process. If your data are nonnormal, you can transform them using the Box-Cox transformation or the Johnson
transformation, which are included in the Transform options of this analysis. Use the normal probability plot and
the histogram in the capability sixpack output to determine whether the data (or the transformed data) follow a
normal distribution.
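
As a sketch of what the Box-Cox option does, the transformation can also be run outside Minitab. The snippet below uses scipy with made-up right-skewed data; the data and the use of the Anderson-Darling statistic as the normality check are assumptions for illustration.

    import numpy as np
    from scipy import stats

    # Made-up right-skewed measurements; Box-Cox requires strictly positive data.
    rng = np.random.default_rng(0)
    x = rng.lognormal(mean=1.0, sigma=0.4, size=100)

    transformed, lam = stats.boxcox(x)   # lam is the lambda that best normalizes x
    print("estimated Box-Cox lambda:", lam)

    # Normality check before and after (a smaller statistic is closer to normal).
    print("A-D before:", stats.anderson(x, dist='norm').statistic)
    print("A-D after: ", stats.anderson(transformed, dist='norm').statistic)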
Now we find the normal process capability of the volume of water inside the product. The data collected are shown below.

Interpret the results


The Xbar and R-charts indicate that the process is stable, with no points beyond the control limits. The Last
25 Subgroups plot indicates the data are randomly and symmetrically distributed around the process mean. The
normal probability plot indicates that data are normally distributed. Therefore, the assumptions for normal
capability analysis are satisfied and the capability of the process can be analyzed.
The histogram and capability indices indicate that the process is approximately centered on the target and the
measurements are within the specification limits. The capability indices Cpk, Ppk, and Cpm are all greater than
1.33, which is a generally accepted minimum value for a capable process. Therefore, the engineers conclude that
the bottling process satisfies customer requirements for the size of the product.
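The capability indices reported by the sixpack reduce to a few formulas. Here is a rough Python sketch using simulated fill volumes; the specification limits (497-503 ml, target 500 ml) and subgroup structure are assumptions for illustration, not the plant's actual values. The within-subgroup sigma is estimated from the average range (Rbar/d2), one of the estimators Minitab offers.

    import numpy as np

    # Simulated fill volumes: 25 subgroups of size 4 (assumed, for illustration).
    rng = np.random.default_rng(1)
    data = rng.normal(loc=500.0, scale=0.5, size=(25, 4))
    lsl, usl, target = 497.0, 503.0, 500.0     # assumed specification limits (ml)

    # Within-subgroup sigma from the average range; d2 = 2.059 for n = 4.
    rbar = (data.max(axis=1) - data.min(axis=1)).mean()
    sigma_within = rbar / 2.059
    sigma_overall = data.std(ddof=1)           # overall (long-term) sigma
    mu = data.mean()

    cp  = (usl - lsl) / (6 * sigma_within)     # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma_within)
    pp  = (usl - lsl) / (6 * sigma_overall)    # overall capability
    ppk = min(usl - mu, mu - lsl) / (3 * sigma_overall)
    cpm = (usl - lsl) / (6 * np.sqrt(sigma_overall**2 + (mu - target)**2))
    print(f"Cp={cp:.2f} Cpk={cpk:.2f} Pp={pp:.2f} Ppk={ppk:.2f} Cpm={cpm:.2f}")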
3.8 Binomial Capability Analysis
Use Binomial Capability Analysis to determine whether the percentage of defective items meets customer
requirements. Use when each item is classified into one of two categories, such as pass or fail. Using this analysis,
you can do the following:
1. Determine whether the process is in control.
2. Estimate the percentage of defective items for each sample and across all samples (%defective).
3. Assess whether the percentage of defectives is stable.
For example, the manager of a customer call center wants to analyze how capable the representatives are in
answering phone calls before the calls are redirected to voice mail. The manager uses binomial capability analysis
to determine whether the rate of calls that are unanswered then redirected is stable and below 20%.
Where to find this analysis
To perform binomial capability analysis, choose Stat > Quality Tools > Capability Analysis > Binomial.
When to use an alternate analysis
If you have counts of the number of defects on each item, use Poisson Capability Analysis to evaluate the defects
per unit.
To ensure that your results are valid, consider the following guidelines when you collect data, perform the analysis, and interpret your results.
1. The data must be counts of defective items. Each item in a subgroup must be classified as either acceptable or not acceptable (defective). If your data are not counts of defectives, you cannot estimate the capability of your process using the binomial distribution. If you have counts of the number of defects on each item, use Poisson Capability Analysis to evaluate the defects per unit.
2. Collect data in subgroups. A subgroup is a collection of similar items that are representative of the output from the process you want to evaluate. The items in each subgroup should be collected under the same process conditions, such as personnel, equipment, suppliers, or environment. If you do not collect data in subgroups under the same process conditions, the variation in the subgroups may reflect special causes rather than the natural, inherent variation of the process.
3. Collect enough subgroups to obtain reliable estimates of process capability. Try to collect at least 25 subgroups. If you do not collect a sufficient amount of data over a long enough period of time, the data may not accurately represent different sources of process variation and the estimates may not indicate the true capability of your process.
4. The subgroups must be large enough. The average proportion of defectives times the subgroup size should be at least 0.5 for all subgroups. If the subgroup size is not large enough, the control limits may not be reliable when they are estimated from the data.
5. The subgroup sizes can be unequal. Subgroups can vary in size. For example, if a call center tracks 100 incoming calls each hour and counts the number of unsatisfactory wait times, then all the subgroup sizes are 100. However, if the call center tracks all the incoming calls during a randomly selected hour of the day, then the number of calls is likely to vary and cause the subgroup sizes to be unequal.
6. The process must be stable and in control. If the current process is not stable, then the capability indices cannot be reliably used to assess the future, ongoing capability of the process. Use the P chart from the binomial capability analysis output to determine whether the process is stable and in control. Investigate out-of-control points and eliminate any special-cause variation in your process before you evaluate the process capability.
The supervisor for a call center wants to evaluate the process for answering customer phone calls. The supervisor
records the total number of incoming calls and the number of unanswered calls for 21 days.
The supervisor performs binomial capability analysis to evaluate how well the process for answering calls meets
specifications.
Open the sample data, product_defect.MTW.
Choose Stat > Quality Tools > Capability Analysis > Binomial.
In Defectives, enter Unanswered Calls.
Under Sample size, select Use sizes in and enter Total Calls.
Click OK.
Interpret the results
On the Rate of Defectives plot, the points appear to be randomly distributed across the various sample sizes, so the
supervisor can assume that sample size does not affect the rate of defectives. The P chart and the Cumulative
%Defective plot indicate that the %defective is fairly stable for this process. Therefore, the assumptions for the
capability analysis appear to be satisfied.
In the Summary Stats table, the parts per million defective (PPM Def) indicates that 95,657 calls out of 1,000,000 are expected to be unanswered (defective). This PPM value corresponds to a %defective of approximately 9.57%. The upper and lower confidence limits (CI) indicate that the supervisor can be 95% confident that the %defective for the process is contained within the interval 8.79% to 10.39%. The Process Z value of 1.3 is lower than 2, which is often considered the minimum required value for a capable process. Together, these summary statistics indicate that the call center is not capable of meeting specifications. A high percentage of calls are unanswered. The supervisor needs to determine why so many calls are unanswered and how to improve the process.
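
These summary statistics can be approximated in a few lines of Python. The call counts below are invented to roughly match the reported 9.57% defective rate, and the confidence interval uses a normal approximation, whereas Minitab reports an exact binomial interval, so the numbers will differ slightly.

    import numpy as np
    from scipy import stats

    # Invented totals over 21 days, chosen to give roughly 9.57% defective.
    total_calls = 10_500
    unanswered = 1_005

    p_bar = unanswered / total_calls
    print("PPM Def:", round(p_bar * 1e6))

    # Approximate 95% CI for the %defective (normal approximation).
    se = np.sqrt(p_bar * (1 - p_bar) / total_calls)
    print("95% CI: [{:.2%}, {:.2%}]".format(p_bar - 1.96 * se, p_bar + 1.96 * se))

    # Process Z: the standard-normal quantile of the yield (1 - p_bar);
    # values below 2 indicate a process that is not capable.
    print("Process Z:", stats.norm.ppf(1 - p_bar))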

3.9 Between/Within Capability Analysis


Use Between/Within Capability Analysis to evaluate the capability of your process based on a normal
distribution when your process naturally produces systemic variation between subgroups, such as a batch process.
Using this analysis, you can do the following:
1. Determine whether the process is capable of producing output that meets customer requirements.
2. Compare the overall capability of the process with its between/within capability to assess opportunity for
improvement.
To perform the analysis, you must specify a lower or upper specification limit (or both) to define your
process requirements. The analysis evaluates the spread of the process data in relation to the specification limits.
When a process is capable, the process spread is smaller than the specification spread. The analysis can also
indicate whether your process is centered and on target. In addition, it estimates the proportion of product that does
not meet specifications.
For example, an engineer wants to assess the process capability of a paper coating process considering both
within-subgroup variation and between-subgroup variation. The within-subgroup variation is the variation of
coating thickness within one roll. The between-subgroup variation is the variation between rolls that may occur
during machine resets. The engineer performs a between/within capability analysis to evaluate how well the
coating thickness meets the customer requirements of 50 ± 3 microns.
An engineer wants to assess the capability of a process that coats large rolls of paper with a thin film. The
engineer collects three measurements of coating thickness from 25 consecutive rolls. Because the machine is reset
for each new roll, the engineer needs to consider variation between rolls in addition to variation within a roll. The
film thickness must be 50 ± 3 microns to meet engineering specifications.
The engineer performs between/within capability analysis to evaluate how well the coating thickness meets
the customer requirements of 50 ± 3 microns.

Open the sample data, lab.MTW.


Choose Stat > Quality Tools > Capability Analysis > Between/Within.
In Single column, enter Coating.
In Subgroup size, enter Roll.
In Lower spec, enter 47.
In Upper spec, enter 53.
Click Options.
In Target (adds Cpm to table), enter 50.
Click OK in each dialog box.

Interpret the results


The process is approximately centered and all observed measurements are within the specification limits. For between/within capability, Cp is 1.26, which indicates that the specification spread is 1.26 times greater than the 6-σ spread in the process. Cp (1.26) and Cpk (1.21) are very close to each other, indicating that the process is approximately centered. For overall capability, Pp (1.19), Ppk (1.15), and Cpm (1.18) are very close to each other, indicating that the process is centered and on target. However, Ppk is slightly less than 1.33, which is a generally accepted minimum value for a capable process. The engineer concludes that the process is almost capable of applying coating that conforms to specifications; however, its capability could be improved.
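
A rough Python sketch of the between/within sigma idea, using simulated coating data in place of lab.MTW: the within-roll component comes from the pooled subgroup variance, and the between-roll component from the variance of the roll means (a method-of-moments estimate; Minitab offers several estimation methods, so the results will not match its output exactly).

    import numpy as np

    # Simulated coating thickness: 25 rolls (subgroups) x 3 readings per roll.
    rng = np.random.default_rng(7)
    roll_means = rng.normal(50.0, 0.5, size=25)            # shifts from machine resets
    data = roll_means[:, None] + rng.normal(0.0, 0.6, size=(25, 3))
    lsl, usl = 47.0, 53.0

    n = data.shape[1]
    s2_within = data.var(axis=1, ddof=1).mean()            # pooled within-roll variance
    s2_means = data.mean(axis=1).var(ddof=1)               # variance of roll means
    s2_between = max(s2_means - s2_within / n, 0.0)        # between-roll component
    sigma_bw = np.sqrt(s2_between + s2_within)             # between/within sigma

    mu = data.mean()
    cp = (usl - lsl) / (6 * sigma_bw)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma_bw)
    print(f"between/within Cp={cp:.2f} Cpk={cpk:.2f}")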

3.10 Problem Solving Techniques


The Pareto chart analysis is a statistical graphical technique used to map and rank business process problems
starting from the most frequent to the least frequent with the ultimate goal of focusing efforts on the factors that
produce the greatest impact overall. To do this effectively, it utilizes the Pareto Principle, which is most
predominantly known as the 80/20 rule.
The Pareto chart is sometimes also referred to as the Pareto analysis or Pareto diagram. If you hear those terms anywhere else, just know that they are almost interchangeable. In this section, we show how and when to use a Pareto chart analysis using Microsoft Excel and Minitab. Mastering Pareto diagrams will help you maximize the efficiency of your business processes.
The Pareto chart is derived from the Pareto principle, which was suggested by a Romanian-born American management consultant, Joseph Juran, during WWII. The name of the principle, however, is derived from the Italian economist Vilfredo Pareto.
In a nutshell, Vilfredo Pareto noticed throughout extensive observations
and research that during the late 1890s in Italy, 80% of the wealth and land
belonged to roughly 20% of the population.
Such a relationship is so universal that it can be applied to almost any field. As the figure below shows, with a Pareto chart there are always two variables at play, which share the same ratio (80/20) in all cases.

One of the greatest uses for the Pareto chart analysis is total quality control. It is used as a tool within the Six
Sigma framework, a mathematical method for tracking company performance. Pareto chart analysis visually
displays the data so as to make it easier to judge whether the Pareto Principle can be applied to the data.
Pareto ordering is used to guide corrective action and to help the project team take steps to fix the problems
that are causing the greatest number of defects first.
3.10.1 Pareto Analysis
Here are eight steps to identifying the principal causes you should focus on, using Pareto Analysis:
Step 1. Create a vertical bar chart with causes on the x-axis and count (number of occurrences) on the y-axis.
Step 2. Arrange the bar chart in descending order of cause importance; that is, the cause with the highest count comes first.
Step 3. Calculate the cumulative count for each cause in descending order.
Step 4. Calculate the cumulative count percentage for each cause in descending order. Percentage calculation: {Cumulative Count} / {Total Count} * 100.
Step 5. Create a second y-axis with percentages descending in increments of 10 from 100% to 0%.
Step 6. Plot the cumulative count percentage of each cause against the second y-axis.
Step 7. Join the points to form a curve.
Step 8. Draw a line at 80% on the y-axis running parallel to the x-axis. Then drop a vertical line from the point where it intersects the curve down to the x-axis. This point on the x-axis separates the important causes on the left (vital few) from the less important causes on the right (trivial many).
Here is a Pareto diagram of some problems that occurred in Arwa Mineral Water Co., using sample data showing the relative frequency of causes of errors. The table shows the reasons for returns and their frequencies. Minitab was used to create the Pareto chart shown in the figure. The Pareto chart enables us to see which 20% of causes are creating 80% of the problems and where efforts should be focused to achieve the greatest improvement. In this case, we can see that broken links, spelling errors, and missing title tags should be the focus.
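
For readers without Minitab, the eight steps above can also be reproduced in a few lines of Python with matplotlib; the return-reason labels and counts below are invented for illustration only.

    import numpy as np
    import matplotlib.pyplot as plt

    # Invented return-reason counts (the report's actual frequencies are in its table).
    causes = {"Broken seal": 42, "Label misprint": 18, "Underfill": 11,
              "Dented bottle": 6, "Loose cap": 4, "Other": 3}
    labels = list(causes)
    counts = np.array([causes[c] for c in labels])

    order = np.argsort(counts)[::-1]                 # Step 2: descending order
    counts, labels = counts[order], [labels[i] for i in order]
    cum_pct = 100 * counts.cumsum() / counts.sum()   # Steps 3-4: cumulative %

    fig, ax = plt.subplots()
    ax.bar(labels, counts)                           # Step 1: bar chart of counts
    ax2 = ax.twinx()                                 # Step 5: second y-axis (percent)
    ax2.plot(labels, cum_pct, marker="o")            # Steps 6-7: cumulative curve
    ax2.axhline(80, linestyle="--")                  # Step 8: the 80% reference line
    ax2.set_ylim(0, 100)
    plt.show()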

The value of the Pareto Principle for a project manager is that it reminds you to focus on the 20% of things that matter. Of the things you do for your project, only 20% are crucial. That 20% produces 80% of your results. Identify and focus on those things first, but don't entirely ignore the remaining 80% of the causes.
