Attribute Control Chart
Dr C K Biswas
Prof.
Dept of ME
1 PCAS
Content
Variables vs attributes
p chart
np chart
c chart
u chart
Summary
Types of control charts
Variables vs attributes
Variables Attribute
Characteristics measurable countable
continuous discrete units or occurrences
May derive from counting good/bad
p chart - Problem statement
An optical CD has many quality attributes that must be up to the mark before the CD can be used by the customer. The CD manufacturer is plagued with a lot of nonconforming CDs during the manufacturing process. It is suggested to use control charts to statistically control the process. Data for defective CDs from 20 samples (sample size n = 100) are shown in the next slide.
Use an appropriate control chart and recommend control limits.
Data sheet (sample no. i, defectives D_i, fraction defective p_i = D_i/n)

 i  D_i  p_i     i  D_i  p_i
 1   4  .04     11   6  .06
 2   3  .03     12   5  .05
 3   3  .03     13   4  .04
 4   5  .05     14   5  .05
 5   6  .06     15   4  .04
 6   5  .05     16   7  .07
 7   2  .02     17   6  .06
 8   3  .03     18   8  .08
 9   5  .05     19   6  .06
10   6  .06     20   8  .08
UCL & LCL calculation
The fraction nonconforming, p_i, follows a binomial distribution with
mean = p̄ and SD = √(p̄(1 − p̄)/n)
The data show an increasing trend in the average proportion defective beyond sample no. 15; the data also show a cyclic pattern. The process appears to be out of control, and there is strong evidence that the data are not from an independent source.
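The p-chart limits for the data sheet above can be reproduced numerically; a minimal sketch using the 20 samples (n = 100 each):

```python
import math

# Defectives D_i from the 20 samples (n = 100 each)
D = [4, 3, 3, 5, 6, 5, 2, 3, 5, 6, 6, 5, 4, 5, 4, 7, 6, 8, 6, 8]
n = 100

p_bar = sum(D) / (len(D) * n)                 # average fraction nonconforming
sigma_p = math.sqrt(p_bar * (1 - p_bar) / n)  # binomial SD of p_i

UCL = p_bar + 3 * sigma_p
LCL = max(0.0, p_bar - 3 * sigma_p)           # LCL cannot be negative

print(f"p-bar = {p_bar:.4f}, UCL = {UCL:.4f}, LCL = {LCL:.4f}")
# p-bar = 0.0505, UCL = 0.1162, LCL = 0.0000
```

Note that although all 20 points fall inside these limits, the trend and cyclic pattern noted above still indicate an out-of-control process.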
Revised control charts
np control chart

The data collected from a production unit of ICs is given in the table. If sample size n = 42 was used for all subgroups, compute the control limits for the np chart.

Subgroup no, Defectives D_i:
 1 10     6 11    11  9    16 17    21 12
 2  9     7  9    12 14    17 13    22 10
 3 10     8  8    13  8    18  5    23  8
 4 14     9 12    14  9    19 10    24 13
 5  4    10 12    15 11    20  9    25 11

The distribution of defectives, np, is binomial with
mean = n p̄ and SD = √(n p̄ (1 − p̄))
Mean defectives, n p̄ = (1/25) Σ D_i = 10.32
So, p̄ = 10.32/42 = 0.2457
UCL = n p̄ + 3√(n p̄ (1 − p̄)) = 18.69
Center line = n p̄ = 10.32
LCL = n p̄ − 3√(n p̄ (1 − p̄)) = 1.95 (≥ 0)
The process is stable
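The np-chart limits for the IC data (25 subgroups, n = 42) can be checked with a short script:

```python
import math

D = [10, 9, 10, 14, 4, 11, 9, 8, 12, 12, 9, 14, 8, 9, 11,
     17, 13, 5, 10, 9, 12, 10, 8, 13, 11]  # defectives per subgroup
n = 42

np_bar = sum(D) / len(D)                 # mean defectives = 258/25 = 10.32
p_bar = np_bar / n                       # = 0.2457
sigma = math.sqrt(np_bar * (1 - p_bar))  # binomial SD of D_i

UCL = np_bar + 3 * sigma                 # 18.69
CL = np_bar                              # 10.32
LCL = max(0.0, np_bar - 3 * sigma)       # 1.95

# Any subgroup outside the limits?
out_of_control = [i + 1 for i, d in enumerate(D) if d > UCL or d < LCL]
print(UCL, CL, LCL, out_of_control)      # empty list: the process is stable
```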
c control chart
c-chart is used to monitor nonconformities per unit.
It is also occasionally used to monitor the total number
of events occurring in a given unit of time.
Example
Monitoring the number of voids per inspection unit
in injection molding or casting processes
Monitoring the number of discrete components that must be
re-soldered per printed circuit board
Monitoring the number of product returns per day
Problem statement
The number of defects on each of 25 wafers is given in the table (column sums 205 and 195, i.e. 400 defects in total, so c̄ = 400/25 = 16).
UCL = c̄ + 3√c̄ = 16 + 3√16 = 28
CL = c̄ = 16
LCL = c̄ − 3√c̄ = 16 − 3√16 = 4
The 24th point is out of control (OOC), and the 3rd point is very near the UCL.
Revised control limits
It is assumed that all assignable causes were detected and removed.
Dropping the 24th point, revised control limits were calculated; it was then found that the 3rd point is OOC.
Revised control limits were recalculated by dropping both the 24th and 3rd points, as given below:
UCL = 26.3775
CL = 14.8261
LCL = 3.2747
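Both sets of c-chart limits follow from the same formula, c̄ ± 3√c̄; a sketch using the totals given above (initial c̄ = 400/25 = 16, revised c̄ = 14.8261 after dropping two points):

```python
import math

def c_chart_limits(c_bar):
    """Control limits for a c chart: c_bar +/- 3*sqrt(c_bar), LCL floored at 0."""
    ucl = c_bar + 3 * math.sqrt(c_bar)
    lcl = max(0.0, c_bar - 3 * math.sqrt(c_bar))
    return ucl, c_bar, lcl

# Initial limits: 25 wafers, 400 defects in total
ucl0, cl0, lcl0 = c_chart_limits(400 / 25)   # (28.0, 16.0, 4.0)

# Revised limits after dropping the 24th and 3rd points (c_bar from the slide)
ucl1, cl1, lcl1 = c_chart_limits(14.8261)
print(round(ucl1, 4), round(lcl1, 4))        # ~26.3775 and ~3.2747
```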
u control chart
Average no. of defects per unit, u_i = c_i / n_i, for each sample no. i.
The distribution of u is Poisson with mean
ū = (total number of defects) / (total number of units inspected) = Σ c_i / Σ n_i
The standard deviation varies with each sample no.: SD_i = √(ū / n_i)
Problem statement
A paper mill uses a control chart to monitor the imperfections on finished rolls of paper. The results of 25 days of inspection are shown in the table. Comment on the stability of the process.

Center line:
ū = Σ c_i / Σ n_i = 715/1780 = 0.4017

For each sample no. i, calculate
u_i = c_i / n_i
UCL_i = ū + 3√(ū/n_i)
LCL_i = ū − 3√(ū/n_i)

 i   n_i  c_i   u_i    LCL_i  UCL_i
 1    42   10  0.238  0.108  0.695
 2    87   32  0.368  0.198  0.606
 3    53   10  0.189  0.141  0.663
 4    70   14  0.200  0.174  0.629
 5   101   37  0.366  0.212  0.591
 6    91   29  0.319  0.202  0.601
 7    36   11  0.306  0.085  0.719
 8    83   38  0.458  0.193  0.610
 9    64   23  0.359  0.164  0.639
10    53   12  0.226  0.141  0.663
11    83   46  0.554  0.193  0.610
12    93   54  0.581  0.205  0.599
13    78   21  0.269  0.186  0.617
14    41   15  0.366  0.105  0.699
15    84   45  0.536  0.194  0.609
16    78   35  0.449  0.186  0.617
17    91   51  0.560  0.202  0.601
18    69   25  0.362  0.173  0.631
19    36   13  0.361  0.085  0.719
20    48    9  0.188  0.127  0.676
21    98   46  0.469  0.210  0.594
22    80   39  0.488  0.189  0.614
23    74   44  0.595  0.181  0.623
24    60   27  0.450  0.156  0.647
25    87   29  0.333  0.198  0.606
sum 1780  715
The process seems to be statistically stable.
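The variable-width u-chart limits can be regenerated from the raw daily counts:

```python
import math

n = [42, 87, 53, 70, 101, 91, 36, 83, 64, 53, 83, 93, 78,
     41, 84, 78, 91, 69, 36, 48, 98, 80, 74, 60, 87]   # units inspected per day
c = [10, 32, 10, 14, 37, 29, 11, 38, 23, 12, 46, 54, 21,
     15, 45, 35, 51, 25, 13, 9, 46, 39, 44, 27, 29]    # defects per day

u_bar = sum(c) / sum(n)                  # 715 / 1780 = 0.4017

stable = True
for ni, ci in zip(n, c):
    u_i = ci / ni
    ucl_i = u_bar + 3 * math.sqrt(u_bar / ni)   # limits vary with n_i
    lcl_i = max(0.0, u_bar - 3 * math.sqrt(u_bar / ni))
    if not (lcl_i <= u_i <= ucl_i):
        stable = False

print(f"u-bar = {u_bar:.4f}, stable = {stable}")
```

Every u_i falls inside its own limits, which is why the process is judged statistically stable.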
Choose a control chart
[Decision flowchart: data type (continuous vs attribute) → whether data are collected in subgroups → what you are counting (defectives vs defects) → p chart or u chart]
D. Chart-Collecting the data
Collect data in rational subgroups.
D. Chart-Collecting the data (contd.)
Collect enough data to obtain precise control limits.
When you do not have known values for the control limits, they must
be estimated from the data. To ensure that the control limits are
precise, you need an appropriate amount of data. The appropriate
amount of data depends on the subgroup size.
If the subgroup size n ≤ 2, collect at least 100 observations.
If the subgroup size n = 3, collect at least 80 observations.
If the subgroup size n = 4 or 5, collect at least 70 observations.
If the subgroup size n ≥ 6, collect at least 60 observations.
Using the D. chart
Check the R chart before you use the F chart.
When you first establish the control limits based on estimates from your data, use Test 1, Test 2, and Test 7 to assess process control (with the D. chart). Once the control limits are established, you should use the known values of those limits.
Using the D. chart (contd.)
Estimate new control limits only when the process changes.
Control limits are typically established early in a project, during the baseline analysis. If you change the process during the project, the baseline control limits are no longer valid, and you should establish new control limits by re-estimating them from the data.
Once the control limits are established, they should remain fixed to monitor the process or to assess shift in process control. The control limits should be re-estimated only when the process changes, not every time you collect new data.
Test 1 (a point outside the control limits) detects a single out-of-control point. Test 2 (9 points in a row on one side of the center line) detects a possible shift in the process mean. Test 7 (too many points in a row within 1 standard deviation of the center line) detects control limits that are too wide. Overly wide control limits are often caused by stratified data, which occur when you have a systematic source of variation within each subgroup.
P chart -Collecting the data
Collect data in rational subgroups (samples, lots, etc).
P chart -Collecting the data (contd.)
Collect enough subgroups.
When you don't have known values for the control limits, they must be estimated from the data. To estimate precise control limits, you must collect enough subgroups. The number of subgroups required depends on the average number of defectives and the subgroup size. Check that the number of subgroups is large enough based on your data.
Subgroups must be large enough.
When you don't have known values for the control limits, they must be estimated from the data. To ensure that the control limits are accurate, the average proportion of defective items times the subgroup size must be at least 0.5 for all subgroups (n p̄ ≥ 0.5). Check that the subgroup size is large enough based on your data.
Subgroup sizes can be unequal.
Subgroups can vary in size. For example, if a call center tracks 100 incoming calls each hour and counts the number of unsatisfactory wait times, the subgroup sizes are all 100. However, if the call center tracks all of the incoming calls during a randomly selected hour of the day, the number of calls is likely to vary and result in unequal subgroup sizes.
The center line of the chart is not affected by unequal subgroup sizes. However, control limits shift in response to changes in subgroup size. If the subgroup sizes do not vary by more than 20%, you can use the average subgroup size to make the control limits constant.
P chart -Collecting the data (contd.)
(same as D. chart)
U chart -Collecting the data
Collect data in subgroups (samples, lots).
U chart -Collecting the data (Contd.)
Count the total number of defects for each unit in a subgroup.
Use this chart when you can count the number of defects on each unit. Instead of classifying a unit as defective when you observe a defect, continue to count the total number of defects on the unit. If you can only determine whether each unit is defective or nondefective, use a P chart.
Subgroups must be large enough.
When you don't have known values for the control limits, they must be estimated from the data. To obtain accurate control limits, the average defect rate per unit times the subgroup size must be at least 0.5 (n ū ≥ 0.5).
Subgroup sizes can be unequal.
A subgroup can be either a single unit or a collection of similarly sized units. In either case the subgroup size can vary and may be defined by a length of time, an area, or a number of items. For example, inspectors for a textile manufacturer sample fabric pieces from rolls of fabric and inspect them for defects:
• If inspectors always sample fabric pieces that are the same length (for example, 2 meters), each subgroup can contain the same number of fabric pieces or a different number of fabric pieces. The subgroup size can be either the number of fabric pieces sampled in each subgroup or the total number of meters of all fabric pieces sampled in each subgroup.
• If inspectors sample fabric pieces that are different lengths, each subgroup can contain the same number of fabric pieces or a different number of fabric pieces. Because the length of the fabric pieces differs, the subgroup size should be defined by the total length of the fabric in each subgroup.
The center line of the chart is not affected by unequal subgroup sizes. However, control limits shift in response to changes in subgroup size. If the subgroup sizes do not vary by more than 20%, use the average subgroup size to make the control limits constant.
U chart -Using the chart
Test for special causes.
Application of attribute control charts
Defectives Defects
(non conforming) (non conformities)
Summary
p chart (fraction nonconforming):
  CL = p̄; UCL = p̄ + 3√(p̄(1 − p̄)/n); LCL = p̄ − 3√(p̄(1 − p̄)/n) (≥ 0)
np chart (nonconforming):
  CL = n p̄; UCL = n p̄ + 3√(n p̄(1 − p̄)); LCL = n p̄ − 3√(n p̄(1 − p̄)) (≥ 0)
c chart (nonconformities):
  CL = c̄; UCL = c̄ + 3√c̄; LCL = c̄ − 3√c̄ (≥ 0)
u chart (nonconformities per unit):
  CL = ū; UCL_i = ū + 3√(ū/n_i); LCL_i = ū − 3√(ū/n_i) (≥ 0)
Dr C K Biswas
Asso Prof.
Dept of ME
Content
Introduction
Double Sampling Plan (DSP)
Probability of acceptance, Pa
Operating characteristics (OC) curve
Average Outgoing Quality (AOQ)
Average Outgoing Quality Limit (AOQL)
Average Sample Number (ASN)
Average Total Inspection(ATI)
Double Sampling Plan (DSP)
[Flowchart] Inspect a 1st sample of size n1; d1 = no. of defectives in the 1st sample, d2 = no. of defectives in the 2nd sample.
• If d1 ≤ c1: accept the lot.
• If d1 > c2: reject the lot (rejected lots go for 100 % inspection).
• Otherwise: inspect a 2nd sample of size n2.
  – If d1 + d2 ≤ c2: accept the lot.
  – If d1 + d2 > c2: reject the lot.
Usually, n2 = 2 n1.
Probability of acceptance, Pa
Probability of acceptance on the 1st sample:
P(a1) = P(d1 = 0) + P(d1 = 1) + P(d1 = 2) + … + P(d1 = c1)
Use the cumulative Poisson distribution table:
row x = c1, column λt = n1 p
Probability of acceptance, Pa (contd.)
e.g. DSP: n1 = 50, c1 = 1, n2 = 100, c2 = 3
For p = 0.05, read the cumulative Poisson distribution table at the λt = n1 p = 2.5 column (1st sample) and the λt = n2 p = 5.0 column (2nd sample).
Use the table or calculate using Poisson's formula:
P(d) = e^(−λt) (λt)^d / d!
P(a1) = P(d1 ≤ 1) = 0.2873
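The Poisson table lookups can be replaced by direct computation; a sketch for the example DSP n1 = 50, c1 = 1, n2 = 100, c2 = 3 at p = 0.05 (the second-sample term sums over the d1 values that lead to a 2nd sample):

```python
import math

def pois_pmf(d, lam):
    """Poisson probability of exactly d defectives."""
    return math.exp(-lam) * lam**d / math.factorial(d)

def pois_cdf(x, lam):
    """Cumulative Poisson probability P(d <= x)."""
    return sum(pois_pmf(d, lam) for d in range(x + 1))

n1, c1, n2, c2, p = 50, 1, 100, 3, 0.05
lam1, lam2 = n1 * p, n2 * p              # 2.5 and 5.0

pa1 = pois_cdf(c1, lam1)                 # accept on the 1st sample: 0.2873

# Accept on the 2nd sample: c1 < d1 <= c2 and then d1 + d2 <= c2
pa2 = sum(pois_pmf(d1, lam1) * pois_cdf(c2 - d1, lam2)
          for d1 in range(c1 + 1, c2 + 1))

pa = pa1 + pa2
print(f"Pa1 = {pa1:.4f}, Pa2 = {pa2:.4f}, Pa = {pa:.4f}")
```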
Probability of acceptance, Pa (contd.)
P(a2) combines the probability of drawing the 2nd sample with the probability of the combined number of defectives satisfying d1 + d2 ≤ c2.
Probability of acceptance, Pa (contd.)
Pa is a function of p
Operating characteristics (OC) curve
e.g. DSP : 50, 1, 100, 3
Average Outgoing Quality (AOQ)
Incoming quality p; no. of defectives in the lot = pN.
[Flowchart as in the DSP slide: defectives in accepted lots pass on to the customer; rejected lots undergo 100 % inspection.]
AOQ = p [P(a1) (N − n1) + P(a2) (N − n1 − n2)] / N
Average Outgoing Quality Limit (AOQL)
Average Total Inspection (ATI)
Incoming quality p; no. of defectives in the lot = pN. Rejected lots undergo 100 % inspection.
ATI = n1 P(a1) + (n1 + n2) P(a2) + N (1 − P(a1) − P(a2))
Average Sample Number (ASN)
P1 = P{lot is accepted on the 1st sample} + P{lot is rejected on the 1st sample}
ASN = n1 P1 + (n1 + n2)(1 − P1) = n1 + n2 (1 − P1)
DSP vs SSP
Advantages of DSP
• Psychological advantage: people may feel more secure with the idea of giving the lot of material a 2nd chance before it is rejected.
• If a lot is very bad or very good, it can be rejected or accepted with the smaller 1st sample.
• Suppose the 1st sample in DSP is smaller than that of the SSP:
  – If the lot is accepted or rejected on the 1st sample, the cost of inspection is lower.
  – It is also possible to reject a lot without completing the inspection of the 2nd sample.
DSP vs SSP (contd.)
Disadvantages of DSP
• DSP needs more administrative effort than SSP.
• If the incoming lot quality p is close to the AQL and consistent, SSP is more economical than DSP.
Problem statement
A vendor ships components in lots of size 5000. A double sampling plan with n1 = 50, c1 = 2, n2 = 100, c2 = 6 has been decided as the acceptance inspection procedure to be adopted. If the incoming quality has fraction defective p = 0.05, calculate the following:
i. the probability of acceptance after the 1st sample is drawn;
ii. the probability of acceptance after the 2nd sample is drawn;
iii. the probability of rejection after the 1st sample is drawn;
iv. the average sample number.
from Poisson’s cumulative from Poisson’s cumulative
PCAS 17
Solution (contd.)
i. Probability of acceptance after the 1st sample, P(a1) = P(d1 ≤ 2) = 0.5438
Solution (contd.)
iii. Probability of rejection after the 1st sample,
P(r1) = P(d1 ≥ 7) = 1 − P(d1 ≤ 6) = 1 − 0.9858 = 0.0142
iv. ASN = n1 + n2 (1 − P1),
where P1 = P(d1 ≤ 2) + P(d1 ≥ 7) = 0.5438 + 0.0142 = 0.558
So, ASN = 50 + 100 (1 − 0.558) = 94.2
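The figures in parts i, iii and iv can be verified with the same Poisson machinery (n1 = 50, c1 = 2, n2 = 100, c2 = 6, p = 0.05):

```python
import math

def pois_cdf(x, lam):
    """Cumulative Poisson probability P(d <= x)."""
    return sum(math.exp(-lam) * lam**d / math.factorial(d) for d in range(x + 1))

n1, c1, n2, c2, p = 50, 2, 100, 6, 0.05
lam1 = n1 * p                      # 2.5

pa1 = pois_cdf(c1, lam1)           # i.   P(d1 <= 2) = 0.5438
pr1 = 1 - pois_cdf(c2, lam1)       # iii. P(d1 >= 7) = 0.0142
p1 = pa1 + pr1                     # lot decided on the 1st sample: 0.558
asn = n1 + n2 * (1 - p1)           # iv.  94.2
print(round(pa1, 4), round(pr1, 4), round(asn, 1))
# 0.5438 0.0142 94.2
```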
Dr C K Biswas
Prof.
Dept of ME
Content
Introduction
Types of sampling plans
Differences Between Control Charts and Sampling Plans
Single Sampling Plan (SSP)
Probability of acceptance, Pa
Operating characteristics (OC) curve
Average Outgoing Quality (AOQ)
Average Outgoing Quality Limit (AOQL)
Average Total Inspection(ATI)
Problem statement
[Flowchart] Inspect the random sample; if the no. of defectives in the sample ≤ c, accept the lot, otherwise reject it.
Acceptance sampling
Acceptance sampling is useful when:
• the cost of testing is high relative to the cost of passing a defective unit;
• the testing is destructive;
• 100 % inspection is not possible due to time constraints;
• the inspection error rate is high;
• the vendor has a good quality record and a high process capability ratio;
• you want to continuously monitor the product quality of a vendor with a good reputation.
Advantages of sampling plan
Less expensive due to less inspection
Less material handling
Applicable to destructive testing
Fewer personnel required for inspection
Less inspection error
Rejection of whole lot often provides motivation to
vendor to improve quality
Disadvantages of sampling plan
Risk of accepting “bad” lots and rejecting “good” lots
Less information is generated about the product or
about the process that manufactures the products
It requires planning and documentation whereas 100%
inspection doesn’t
Lot formation
1. Lots should be homogeneous
2. Larger lots are preferred over smaller ones
3. Lots should be conformable to the materials-
handling systems used in both the vendor and
consumer facilities.
Random Sampling
The units selected for inspection from the lot should be chosen at random; they should be representative of all the items in the lot.
Without random sampling, bias will be introduced into the results.
Types of sampling plans
I. By Attribute
1. Single sampling plan (SSP)
2. Double sampling plan (DSP)
3. Multiple sampling plan
4. Sequential sampling plan
5. Chain sampling plan (ChSP)
6. Continuous sampling plan (CSP)
II. By Variable
1. Variability unknown
– single sided specification limit, k method
– single sided specification limit, M method
– Double sided specification limit, M method
2. Variability known
– single sided specification limit, k method
– single sided specification limit, M method
– Double sided specification limit, M method
Differences Between Control Charts
and Sampling Plans
Single Sampling Plan (SSP)
Incoming quality p; no. of defectives in the lot = pN.
[Flowchart] Draw a random sample of size n from the lot; if the no. of defectives in the sample d ≤ c, accept the lot, otherwise reject it.
Probability of finding defectives
Hypergeometric distribution
Probability of getting d defectives in a sample of size n drawn from a lot of size N containing D defectives:
P(d) = [C(D, d) · C(N − D, n − d)] / C(N, n)
where C(a, b) is the binomial coefficient.
Approximation of distribution
Hypergeometric → Binomial (n/N small):
P(d) = C(n, d) p^d (1 − p)^(n − d)
Binomial → Poisson (n large, p small, np = λ ≤ 10):
P(d) = e^(−λ) λ^d / d!
Poisson → Normal (λ large):
P(d) ≈ normal density with mean λ and variance λ
Definitions
AQL: The Acceptable Quality Level is the poorest level of quality from a supplier's process that would be considered acceptable as a process average. (α: producer's risk)
LTPD: The Lot Tolerance Percent Defective is a designated high defective level that would be unacceptable to the consumer. (β: consumer's risk)
Operating characteristics (OC) curve
Ideal OC curve
α = 0.05 (producer's risk)
β = 0.1 (consumer's risk)
Type – A & B curves
Type- A curve when Hypergeometric distribution is
used
Type- B curve when Binomial or other distributions
are used
Average Outgoing Quality (AOQ)
Incoming quality p; no. of defectives in the lot = pN.
Accepted lots pass their remaining pN − pn defectives on to the customer; rejected lots undergo 100 % inspection.
AOQ = Pa (pN − pn) / N = Pa p (N − n) / N
Average Outgoing Quality Limit (AOQL)
AOQL = 2.35
When n increases, AOQL decreases.
Average Total Inspection (ATI)
Draw a random sample of size n from a lot of size N. If the no. of defectives in the sample d ≤ c, accept the lot; otherwise reject it and 100 % inspect the remainder.
ATI = n + (1 − Pa)(N − n)
ATI vs p
Problem statement
A product is shipped in lots of size N=5000. The
receiving inspection procedure used SSP with sample
size n=100 and acceptance number c=2.
i. Draw the OC curve.
ii. Find AQL and LTPD.
iii. Draw the AOQ curve.
iv. Find AOQL
v. Draw the ATI curve vs incoming quality.
Solution
From the cumulative Poisson distribution table, the following table of Pa vs p is calculated for n = 100, c = 2.
Solution (contd.)
i. OC curve: plot of Pa (0 to 1.0) vs p (0.00 to 0.06).
ii. AQL = 0.0082, LTPD = 0.0532
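AQL and LTPD correspond to Pa = 0.95 and Pa = 0.10 on the OC curve; a sketch that recovers them by bisection (n = 100, c = 2, Poisson approximation):

```python
import math

def pa(p, n=100, c=2):
    """Poisson approximation to the OC curve of an SSP."""
    lam = n * p
    return sum(math.exp(-lam) * lam**d / math.factorial(d) for d in range(c + 1))

def p_for_pa(target, lo=1e-6, hi=0.2):
    """Find incoming quality p with Pa(p) = target (Pa decreases as p grows)."""
    for _ in range(60):
        mid = (lo + hi) / 2
        if pa(mid) > target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

aql = p_for_pa(0.95)    # ~0.0082
ltpd = p_for_pa(0.10)   # ~0.0532
print(round(aql, 4), round(ltpd, 4))
# 0.0082 0.0532
```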
iii–iv. AOQ curve: AOQL = 0.0135
v. ATI curve vs incoming quality p.
Class test 16-3-2023
A company uses the following acceptance sampling procedure. A batch is accepted if not more than two items are found to be defective in a sample of 50 items taken randomly from the batch; otherwise, the batch is rejected. If 500 batches obtained from a process that manufactures 2% defective items are submitted to this plan,
1. how many batches are expected to be accepted? (Ans: Pa = 0.9197, i.e. about 460 of the 500 batches)
2. what is the probability of getting 2 defectives in the sample? (Ans: 0.1839)
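Both answers follow from the Poisson approximation with λ = np = 50 × 0.02 = 1:

```python
import math

lam = 50 * 0.02                                   # λ = np = 1
pa = sum(math.exp(-lam) * lam**d / math.factorial(d) for d in range(3))
accepted = 500 * pa                               # expected accepted batches
p2 = math.exp(-lam) * lam**2 / math.factorial(2)  # exactly 2 defectives

print(round(pa, 4), round(accepted), round(p2, 4))
# 0.9197 460 0.1839
```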
MIL-STD-105E
Dr C K Biswas
Prof.
Dept of ME
1 PCAS 4/18/2023
Content
Standard Acceptance sampling plans

Objective | Attribute | Variable
Assure quality levels for consumer/producer | Select plan for specific OC | Select plan for specific OC
Maintain quality at a target | AQL plans: MIL-STD-105E | AQL plans: MIL-STD-414
Assure outgoing quality level | AOQL plans: Dodge-Romig plans | AOQL plans
Reduced inspection, with small sample size & good quality history | Chain sampling | Narrow limit gaging
Reduced inspection, with good quality history | Skip lot sampling, double sampling | Skip lot sampling, double sampling
Assure quality no worse than target | LTPD plans: Dodge-Romig plans | LTPD plans: Hypothesis test
Standard plans | What is it for?
Dodge-Romig Single Sampling AOQL | AOQL-based rectifying plan for attribute data
Dodge-Romig Single Sampling LTPD | LTPD-based rectifying plan for attribute data
MIL-STD-1916 for Attributes | Accept-on-Zero (c=0) sampling plans for attribute data
MIL-STD-1916 for Variables | Accept-on-Zero (c=0) sampling plans for measurement data
MIL-STD-1916 for Continuous Sampling | Accept-on-Zero (c=0) sampling plans for continuous production
Switching Procedures
Switching Rules for normal, tightened and reduced inspection
Start with normal inspection.
Normal → tightened: 2 out of 5 consecutive lots rejected.
Tightened → normal: 5 consecutive lots accepted.
Normal → reduced, "and" conditions (all must hold):
• production steady
• 10 consecutive lots accepted
• approved by the responsible authority
Reduced → normal, "or" conditions (any one suffices):
• a lot is rejected
• irregular production
Example
Suppose a product comes from a vendor in lots of size N = 2000 units. The acceptable quality level (AQL) is 0.65%. Determine the MIL STD 105E acceptance-sampling system.
Sample Size Code Letters (MIL STD 105E, Table 1)
General Inspection
Special Inspection Levels Levels
Lot or Batch Size S-1 S-2 S-3 S-4 I II III
2 to 8 A A A A A A B
9 to 15 A A A A A B C
16 to 25 A A B B B C D
26 to 50 A B B C C D E
51 to 90 B B C C C E F
91 to 150 B B C D D F G
151 to 280 B C D E E G H
281 to 500 B C D E F H J
501 to 1200 C C E F G J K
1201 to 3200 C D E G H K L
3201 to 10000 C D F G J L M
10001 to 35000 C D F H K M N
35001 to 150000 D E G J L N P
150001 to 500000 D E G J M P Q
500001 and over D E H K N Q R
Normal Inspection
(Master table columns: nonconforming or defectives (%), nonconformities or defects per 100 units)
AQL, Plan K
Normal Inspection:
Sample 125 units.
Ac = 2: accept the lot if the no. of defectives ≤ 2.
Re = 3: reject the entire lot if the no. of defectives ≥ 3.
Tightened inspection
AQL, Plan K
Tightened Inspection:
Sample 125 units.
Ac = 1: accept the lot if defects ≤ 1.
Re = 2: reject the entire lot if defects ≥ 2.
Reduced Inspection
AQL, Plan K
Reduced Inspection:
Sample 50 units.
Ac = 1: accept if defects ≤ 1.
Re = 3: reject the entire lot if defects ≥ 3.
Single sampling Plan
                         Reduced     Normal     Tightened
                        Inspection  Inspection  Inspection
Sample size, n              50         125         125
Acceptance number, Ac        1           2           1
Rejection number, Re         3           3           2
During reduced inspection, what happens if the no. of defectives in the sample is 2?
Reliability
Dr C K Biswas
Prof.
Dept of ME
Content
Intro
Bathtub curve
Reliability measurement
System reliability
Series system
Parallel system
Standby Redundancy Models
M out of N Models (Identical Items)
Series-Parallel Systems
Maintainability & availability
Reliability improvement
Reliability vs Quality
Reliability definition
• Reduction of things gone wrong (Johnson and Nilsson 2003).
• An attribute of a product that describes whether the product does what the user wants it to do, when the user wants it to do so (Condra 2001).
• The capability of a product to meet customer expectations of product performance over time (Stracener 1997).
• The probability that a device, product, or system will not fail for a given period of time under specified operating conditions (Shishko 1995).
Reliability
Definition
e.g.: a reliability of 95% for 1,000 hrs for a bulb means that the probability of the bulb surviving 1,000 hrs under specified operating conditions is 95%.
Failure
An item is considered to have failed under one of the three
conditions:
When it becomes completely inoperable
When it is still operable but no longer able to perform its
intended function
When serious deterioration has made it unreliable or unsafe.
Failure
Failure Modes in Mechanical Systems
The failure modes for mechanical components are classified into three categories:
1. Failures due to operating load
2. Failures due to environment
3. Failures due to poor manufacturing quality
Causes include:
• many aspects of design and material selection,
• material imperfections,
• fabrication and processing, reworking,
• assembly, inspection, testing, quality control,
• storage and shipment,
• unanticipated exposure to overloads or stresses in service.
Hazard Rate
The failures of a population of fielded products can arise from inherent design weaknesses, manufacturing and quality control-related problems, variability due to customer usage, the maintenance policies of the customer, and improper use or abuse of the product.
The hazard rate, h(t), is the number of failures per unit time per number of non-failed products remaining at time t.
It is the "instantaneous" failure rate.
Life characteristics pattern: Bathtub curve
[Figure: hazard rate vs time t, three periods]
I. Early failure period
II. Constant failure rate period (useful life)
III. Wearout failure period
Modes of failures
• Catastrophic failure: sudden and complete, e.g. a fuse blowing.
• Degradation (or creeping) failure: e.g. resistor failure.
• Independent failure: e.g. failure of a tire of a motor car.
• Secondary failure: e.g. a resistor getting shorted might cause extensive damage to other components.
Life Expectancy of Some Objects
Object | Life Expectancy*
Mosquito | 1 day
Light bulb | 8–10 months
Washing machine | 6–7 years
Dog | 8–10 years
Motor car | 10–12 years
Undersea cables | 40 years
Man | 70–80 years
House | 100 years
Integrated circuits | 1000 years
Plastic | 1000–2000 years
Failure Rates
Component/Part | Failure Rate per 1,000 hr
Resistors | 0.2210
Capacitors | 0.1517
Transformers | 0.2837
Connector | 0.3083
Coils | 0.3893
Crystal diodes | 0.0750
Motors | 0.1410
Tubes | 3.3000
Mean Time Between Failures
Equipment | MTBF*
Elevator | 40,000 hr
Computers | 4000–5000 hr
TV sets | 5000–10,000 hr
Washing machines | 5000–10,000 hr
Satellites | 10–12 years
Military ground equipment | 1000–2000 hr
Missiles | 1200 hr
*The figures indicated are approximate.
Reliability Measurement
Failure rate
Four transformers failed after the following test time periods:
Transformer no | Fails after hrs
1 | 50
2 | 150
3 | 250
4 | 450
Failure rate, λ = no. of failures / total test time = 4 / (50 + 150 + 250 + 450) = 4/900 = 0.00444 per hr
MTBF = 1/λ = 1/0.00444 = 225 hrs
Failure rate (contd.)
A life test was conducted on 12 electrical pumps for 1000 hrs, of which 5 pumps failed. The test results are as follows:
Pump no | Failed after hrs
2 | 620
3 | 390
4 | 780
8 | 860
9 | 950
Total test time = (620 + 390 + 780 + 860 + 950) + 7 × 1000 = 10,600 hrs
λ = 5/10,600 = 0.0004717 per hr
MTBF = 1/λ = 2,120 hrs
Reliability function
For a constant failure rate λ, R(t) = e^(−λt).
What is the reliability of the pump (previous slide) if it has to operate 10,000 hrs?
R(10,000) = e^(−0.0004717 × 10,000) = e^(−4.717) ≈ 0.009
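Both failure-rate examples and the reliability question can be reproduced in a few lines (constant failure rate assumed, so R(t) = e^(−λt)):

```python
import math

# Transformers: all 4 units failed during the test
t_transformer = [50, 150, 250, 450]
lam_t = 4 / sum(t_transformer)          # 4/900 = 0.00444 per hr
mtbf_t = 1 / lam_t                      # 225 hr

# Pumps: 5 of 12 failed; the other 7 survived the full 1000 hr
t_failed = [620, 390, 780, 860, 950]
total_time = sum(t_failed) + 7 * 1000   # 10,600 hr
lam_p = 5 / total_time                  # 0.0004717 per hr
mtbf_p = 1 / lam_p                      # 2,120 hr

r_10000 = math.exp(-lam_p * 10000)      # pump reliability over 10,000 hr
print(round(mtbf_t), round(mtbf_p), round(r_10000, 4))
# 225 2120 0.0089
```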
Problem statement
To ensure proper illumination in control rooms, high reliability of electric lamps is necessary. Let us consider that the failure times (in hours) of a population of 30 electric lamps from a control room are given in the following table. Calculate the failure density, reliability and hazard functions.

Lamp no | failure time hrs
 1   5495.05     16    324.61
 2   8817.71     17    866.69
 3    539.66     18   6311.47
 4   2253.02     19   3095.62
 5  18887.00     20    927.41
 6   2435.62     21   4037.11
 7     99.33     22    933.79
 8   3716.24     23   1485.66
 9  12155.56     24   4158.11
10    552.75     25   6513.43
11   3511.42     26   8367.92
12   6893.81     27   1912.24
13   1853.83     28  13576.97
14   3458.40     29   1843.38
15   7710.78     30   4653.99
Problem statement (soln)
Failure times sorted in ascending order:

Lamp no | failure time hrs
 7     99.33     11   3511.42
16    324.61      8   3716.24
 3    539.66     21   4037.11
10    552.75     24   4158.11
17    866.69     30   4653.99
20    927.41      1   5495.05
22    933.79     18   6311.47
23   1485.66     25   6513.43
29   1843.38     12   6893.81
13   1853.83     15   7710.78
27   1912.24     26   8367.92
 4   2253.02      2   8817.71
 6   2435.62      9  12155.56
19   3095.62     28  13576.97
14   3458.40      5  18887.00
Problem statement (soln)
interval     no. survived at    no. of failures   reliability  unreliability
low    high  beginning ns(t)    in interval nf(t)    R(t)          F(t)
    0   3150        30                14            1.000         0.000
 3150   6300        16                 7            0.533         0.467
 6300   9450         9                 6            0.300         0.700
 9450  12600         3                 1            0.100         0.900
12600  15750         2                 1            0.067         0.933
15750  18900         1                 1            0.033         0.967
Problem statement (soln)
interval      ns(t)  nf(t)   R(t)   F(t)   failure density f(t)   hazard rate h(t)
    0   3150    30    14    1.000  0.000       1.481E-04              1.481E-04
 3150   6300    16     7    0.533  0.467       7.407E-05              1.389E-04
 6300   9450     9     6    0.300  0.700       6.349E-05              2.116E-04
 9450  12600     3     1    0.100  0.900       1.058E-05              1.058E-04
12600  15750     2     1    0.067  0.933       1.058E-05              1.587E-04
15750  18900     1     1    0.033  0.967       1.058E-05              3.175E-04
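The table can be generated from the 30 raw failure times with six equal intervals of 3150 hr; here f(t) = nf/(N·Δt) and h(t) = nf/(ns·Δt):

```python
# 30 lamp failure times (hrs) from the problem statement
times = [5495.05, 8817.71, 539.66, 2253.02, 18887.00, 2435.62, 99.33,
         3716.24, 12155.56, 552.75, 3511.42, 6893.81, 1853.83, 3458.40,
         7710.78, 324.61, 866.69, 6311.47, 3095.62, 927.41, 4037.11,
         933.79, 1485.66, 4158.11, 6513.43, 8367.92, 1912.24, 13576.97,
         1843.38, 4653.99]

N, width = len(times), 3150.0
rows = []
survivors = N
for k in range(6):                           # six intervals up to 18,900 hr
    lo, hi = k * width, (k + 1) * width
    nf = sum(lo <= t < hi for t in times)    # failures in this interval
    R = survivors / N                        # reliability at interval start
    f = nf / (N * width)                     # failure density function
    h = nf / (survivors * width)             # hazard rate
    rows.append((survivors, nf, R, f, h))
    survivors -= nf
```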
System reliability and Modeling
The nature of the reliability models to be used for quantitative evaluation depends on the type of system. There are basically two types of system models:
Non-repairable systems: e.g. fuses, missiles, bulbs
Repairable systems:
i. Continuously operating systems: e.g. furnaces, satellites, etc.
ii. On-and-off operating systems: e.g. computers, communication equipment.
iii. Intermittently operating systems: e.g. telephone, radar, etc.
System reliability: Series
[Blocks 1 → 2 → … → n in series]
RS = R1 · R2 · … · Rn
System reliability: Parallel
Parallel redundancy is often used under the following considerations:
i. To minimize the effect of chance failures.
ii. Faulty components are difficult to detect.
iii. Assembly/disassembly of components is not easy.
iv. Components of tactical importance.
v. When the system needs high reliability due to strategic importance.
Standby Redundancy Systems
Standby units are of five different types:
i. Cold standby: the standby units don't fail when not in operation.
ii. Hot standby: the standby units deteriorate even when not in use, e.g. batteries.
iii. Tepid standby: the reliability of the standby units decreases with time.
iv. Sliding standby: the sliding standby unit will function when any of the n components of the system fails.
v. Sliding standby with AFL: the system has an Automatic Fault Location (AFL) device to detect the fault, and automatic switching connects the standby unit, e.g. UPS.
Standby Redundancy Models
In the figure, item A is the online active item, and item B is standing by, waiting to be switched on to replace A when the latter fails.
With the following assumptions:
a) when operating, both items have a constant failure rate k and have zero failure rate in standby mode;
b) the switch is perfect;
c) the switching-over time is negligible; and
d) the standby unit does not fail while in standby mode.
Standby Redundancy Models
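Under assumptions (a)–(d), the standard result for one active unit backed by one cold standby is R(t) = e^(−kt)(1 + kt), with k the constant failure rate; a sketch where the rate and mission time (`lam = 0.001`/hr, `t = 1000` hr) are illustrative values, not taken from the slides:

```python
import math

def standby_reliability(lam, t):
    """One active unit + one cold standby with perfect, instantaneous switching:
    the system survives if it sees at most one failure in (0, t)."""
    return math.exp(-lam * t) * (1 + lam * t)

lam, t = 0.001, 1000.0                 # illustrative (assumed) values
single = math.exp(-lam * t)            # one unit alone: 0.3679
system = standby_reliability(lam, t)   # with cold standby: 0.7358
print(round(single, 4), round(system, 4))
# 0.3679 0.7358
```

The comparison shows why standby redundancy is attractive: for the same mission time the survival probability roughly doubles here.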
Problem statement
A typical UPS (Uninterrupted Power Supply) circuit is shown in the
figure. Normally, the power is supplied to the load by the AC supply
and the battery is charged by the rectifier. In case of power failure,
which is detected by Automatic Fault Location (AFL), the rectifier is
disconnected and the power is supplied to the load by the battery via
inverter. Given the failure rates of components (as shown in Table),
calculate the UPS reliability for 1000 hr.
Reliability Block Diagram
Standby Redundancy Models with AFL
M out of N Models (Identical Items)
Series-Parallel Systems
[Figure: blocks A — B — (C in parallel with C) — D, with reliabilities RA, RB, RC, RD]
Convert the parallel pair to an equivalent series element C′:
RC′ = 1 − (1 − RC)(1 − RC)
so the system reduces to the equivalent series chain A — B — C′ — D.
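The series-parallel reduction can be checked numerically; the block reliabilities below are illustrative values, not from the slides:

```python
def series(*rs):
    """Reliability of blocks in series: product of block reliabilities."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(*rs):
    """Reliability of blocks in parallel: 1 - product of unreliabilities."""
    out = 1.0
    for r in rs:
        out *= (1 - r)
    return 1 - out

RA, RB, RC, RD = 0.95, 0.99, 0.90, 0.98   # illustrative block reliabilities
RC_eq = parallel(RC, RC)                  # 1 - (1-RC)^2 = 0.99
RS = series(RA, RB, RC_eq, RD)            # equivalent series system
print(round(RC_eq, 4), round(RS, 4))
# 0.99 0.9125
```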
Problem statement
Maintainability
Maintainability is the totality of design factors that allow maintenance to be accomplished easily.
Preventive maintenance reduces the risk of failure.
Corrective maintenance is the response to failures.
Availability
Operational availability (as seen by the user):
A_o = MTBM / (MTBM + MDT)
Inherent availability (as seen by maintenance personnel):
A_i = MTBF / (MTBF + MTTR)
MTBM = mean time between maintenance
MDT = mean down time
MTBF = mean time between failures
MTTR = mean time to repair
Reliability improvement
Standardization
Redundancy
Physics of failure
Reliability testing
Burn-in
Failure mode and effects analysis
Fault tree analysis
Reliability testing
To determine whether the equipment under test meets its
reliability requirement
The criterion for acceptability is minimum acceptable life
Three types of reliability tests
I. Tests terminated upon occurrence of a number of
preassigned failures
a. With Replacement of failed items
b. Without Replacement of failed items
Tests terminated upon occurrence of a number of preassigned failures: With Replacement of failed items
Soln (contd.)
For a probability of 90%, α = 0.10 and r = 6; find k from the table.
Soln (contd.)
The choice of the sample size n and the number of preassigned failures r to terminate the life test is decided based on the cost of the equipment and test-time considerations.
Reliability vs Quality
Reliability | Quality
It is a function of time | No time element
Probability of performance | Degree of conformance with standards, specifications & design requirements
Note: high quality does not necessarily mean high reliability.
Reliability and safety
The term safety means absolute safety, in the sense that the system will never fail during operation.
In reality, such a thing is not possible to achieve. Hence, for all design purposes a safety factor is taken.
The choice of the factor of safety (FOS) depends upon the acceptable level of risk on the basis of the economic and/or social consequences associated with such failures in the field.
Risk
Risk is generally defined as "the chance of injury or loss resulting from exposure to a source of danger", while safety means "freedom from danger or hazard".
Risk = probability × consequences
Risk
x       100     500     750    1000    1500    2500    5000    7000    8000   10,000
P(X>x)  0.1869  0.1769  0.1569  0.1269  0.0869  0.0069  0.0059  0.0009  0.0008  0.0005
End of lecture
Thank you for your kind attention.