CA 1012134–AGC-A01-A01 MSA GRR – Process: Bracket Assembly


Location: Aguascalientes

Standard: CA 1012134–AGC-A01, Measurement System Analysis Management (Administración de análisis del sistema de medición)

Annex 01: Attributive GRR

Document History: CA 1012134–AGC-A01-A01

Version | Responsible | Function | Details | Effective
1 | N/A | N/A | N/A |
2 | Jose Palomera | QMS Leader | - Created Annex A01 for attributive GRRs; updated the RASI lines to add a reference to the use of this template. - Updated the corporate annex numbers of procedure CA 1012134 due to changes in the superior rule. | March 4, 2021
ATTRIBUTIVE GRR CA 1012134–AGC-A01-A01

Instructions:
The following spreadsheet is used to calculate an attribute MSA, in which 50 samples are evaluated by 3 operators, with 3 trials per sample.

1. Fill out the general information of the attributive MSA as follows:

- Product information: the product name, part number, the characteristic that the operators are checking, and the name of the person performing the MSA with the operators.
- Gauge/equipment information: the gauge name, its identification, the product tolerance for the PASS/FAIL decision, and the date on which the MSA was conducted.
- Operators: the complete operator name and his/her employee identification number.

2. Enter the result of each inspection:

- Record each operator's result per trial. Only 0 and 1 values shall be captured.
- Record the true (reference) value of each unit.
Internal CA XXXXXXX
© Continental AG. 2018 Version 1
- 0 = Not OK part; 1 = OK part.
- The Code column can be used to identify not OK parts (-), OK parts (+), and parts that are at the limit (x).
- Optionally, add the measurement value of each unit.
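The 0/1 capture rule above can be checked programmatically before the analysis runs. A minimal sketch, assuming a nested-list layout of operator → part → trial (the dict layout and function name are illustrative, not part of the workbook):

```python
# Illustrative layout of the captured attribute data:
# one 0/1 result per operator, per part, per trial.
trials = {
    "A": [[1, 1, 1], [0, 0, 1]],  # operator A: two parts, three trials each
    "B": [[1, 1, 1], [0, 0, 0]],
    "C": [[1, 1, 0], [0, 0, 0]],
}

def validate_entries(trials):
    """Confirm only 0 (Not OK) and 1 (OK) values were captured."""
    for op, parts in trials.items():
        for part_no, results in enumerate(parts, start=1):
            for trial_no, r in enumerate(results, start=1):
                if r not in (0, 1):
                    raise ValueError(
                        f"Operator {op}, part {part_no}, trial {trial_no}: {r!r}"
                    )
    return True
```

In the real workbook this check is typically enforced by cell validation; the function is only a programmatic equivalent.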

3. Evaluate the MSA results (kappa and operator effectiveness) against the acceptance criteria:

Troubleshooting:

Attribute Agreement Analysis consists of creating cross tabulation tables that display the observed frequency
distribution of potential outcomes (Accept or Reject) using tables of counts.

The Kappa statistic is the main metric used to measure how good or bad an attribute measurement system is. It summarizes the level of agreement between raters after agreement by chance has been removed, and it tests how well raters agree with themselves (repeatability) and with each other (reproducibility).

The attribute agreement analysis uses the following cross tabulations:

a) Pairwise Agreement Between Appraisers – Per Evaluation (Reproducibility)
b) Agreement Between Each Appraiser versus the Reference Standard – Per Evaluation (Bias)
c) Agreement Within Appraisers – Across Trials (Repeatability)
d) Agreement Between Each Appraiser versus the Reference Standard – Across Trials (Bias)
e) Agreement Between Appraisers – Across Trials (Reproducibility)
f) Agreement Across All Appraisers versus the Reference Standard (Bias)
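The kappa calculation behind these cross tabulations can be sketched directly from its definition, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance. A minimal sketch for two raters' 0/1 calls (the function name and sample data are illustrative):

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' 0/1 calls on the same parts:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e the agreement expected by chance from each
    rater's marginal 0/1 proportions."""
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n
    p_e = sum((r1.count(v) / n) * (r2.count(v) / n) for v in (0, 1))
    return (p_o - p_e) / (1 - p_e)

# Two appraisers' accept/reject calls on eight parts (illustrative data)
a = [1, 1, 0, 1, 0, 1, 1, 0]
b = [1, 1, 0, 1, 1, 1, 1, 0]
```

The same formula applies to every pairing in the list above; only the pair of 0/1 series changes (appraiser vs. appraiser, or appraiser vs. the reference standard).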

Sources of Variability

The AIAG MSA reference manual and VDA 5 list potential sources of variation in the measurement system; these can be used for problem solving and risk analysis.

VDA 5 approach: Section 4.1, Influences causing the uncertainty of the measurement result, page 26

AIAG approach: Section B, Sources of variation, page 17

ATTRIBUTIVE GRR CA 1012134–AGC-A01-A01

Product name: Rodolfo Maya Gauge/equipment name:
Product part number: Gauge/equipment ID:
Characteristic inspected: Tolerance:
Operation/process: Date:
Performed by: Area:
Operator A name: ID number:
Operator B name: ID number:
Operator C name: ID number:

                   OPERATOR A                 OPERATOR B                 OPERATOR C           Ref.
Part #  Attribute  Trial 1  Trial 2  Trial 3  Trial 1  Trial 2  Trial 3  Trial 1  Trial 2  Trial 3  Value  N  mm
1 1 740.410 208.680
2 1 716.18 208.7
3 0 781.95 208.76
4 1 754.73 208.75
5 0 786.68 208.76
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
31
32
33
34
35
36
37
38
39
40
41
42
43
44
45
46
47
48

Internal CA 0608454
© Continental AG. 2018 Version 1
49
50

Numerical Analysis

A * B Cross Tabulation A * Ref Cross Tabulation


B Ref
0.00 1.00 Total 0.00 1.00 Total
A 00 Count 0 0 0 A 00 Count 0 0 0
Expected count 0.0 0.0 0.00 Expected count 0.0 0.0 0.00
1.00 Count 0 0 0 1.00 Count 0 0 0
Expected count 0.0 0.0 0.00 Expected count 0.0 0.0 0.00
Total Count 0 0 0 Total Count 0 0 0
Expected count 0.00 0.00 0.00 Expected count 0.00 0.00 0.00
B * C Cross Tabulation B * Ref Cross Tabulation
C Total Ref Total
0.00 1.00 0.00 1.00
B 00 Count 0 0 0 B 00 Count 0 0 0
Expected count 0.0 0.0 0.00 Expected count 0.0 0.0 0.00
1.00 Count 0 0 0 1.00 Count 0 0 0
Expected count 0.0 0.0 0.00 Expected count 0.0 0.0 0.00
Total Count 0 0 0 Total Count 0 0 0
Expected count 0.00 0.00 0.00 Expected count 0.00 0.00 0.00
A * C Cross Tabulation C * Ref Cross Tabulation
C Total Ref Total
0.00 1.00 0.00 1.00
A 00 Count 0 0 0 C 00 Count 0 0 0
Expected count 0.0 0.0 0.00 Expected count 0.0 0.0 0.00
1.00 Count 0 0 0 1.00 Count 0 0 0
Expected count 0.0 0.0 0.00 Expected count 0.0 0.0 0.00
Total Count 0 0 0 Total Count 0 0 0
Expected count 0.00 0.00 0.00 Expected count 0.00 0.00 0.00
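The "Expected count" rows in these cross tabulations follow the standard independence formula, E[i][j] = row_total_i × col_total_j / grand_total. A minimal sketch (the counts shown are illustrative, not from this workbook):

```python
def expected_counts(observed):
    """Expected count for each cell of a cross tabulation under
    independence: E[i][j] = row_total_i * col_total_j / grand_total."""
    row_totals = [sum(row) for row in observed]
    col_totals = [sum(col) for col in zip(*observed)]
    grand = sum(row_totals)
    return [[r * c / grand for c in col_totals] for r in row_totals]

# Illustrative A * B cross tabulation of 0/1 decisions on 50 parts
observed = [[12, 3],   # A = 0.00: (B = 0.00, B = 1.00)
            [5, 30]]   # A = 1.00: (B = 0.00, B = 1.00)
```

Comparing observed against expected counts shows how far the two appraisers' agreement departs from what chance alone would produce, which is exactly what the kappa statistic condenses into one number.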

a) Pairwise Agreement Between Appraisers – Per Evaluation (Reproducibility)

Kappa     A      B      C
A        ----
B               ----
C                      ----

b) Agreement Between Each Appraiser versus the Reference Standard – Per Evaluation (Bias)

          A      B      C
Kappa

c) Agreement Within Appraisers – Across Trials (Repeatability)

Source:                    A   B   C
Number Evaluated:
Number Matched:
Number Not Matched:
False Reject:
False Accept:
Mixed:
Proportion Matched (p):
Upper 95% Conf. Bound:     1   1   1
Lower 95% Conf. Bound:

d) Agreement Between Each Appraiser versus the Reference Standard – Across Trials (Bias)

          A      B      C
Kappa

e) Agreement Between Appraisers – Across Trials (Reproducibility)

Number Evaluated:       5
Number Matched:         0
Number Not Matched:     0
Proportion Matched:     0.0000
Upper 95% Conf. Bound:  0.52182
Lower 95% Conf. Bound:  0
Kappa:

f) Agreement Across All Appraisers versus the Reference Standard (Bias)

Number Evaluated:       5
Number Matched:         0
Number Not Matched:     0
Proportion Matched:     0.0000
Upper 95% Conf. Bound:  0.521824
Lower 95% Conf. Bound:  0
Kappa:

g) Effectiveness

     Effectiveness   Miss Rate   False Alarm Rate
A

Acceptance Criteria

Decision (Measurement system)                                    Effectiveness  Miss Rate  False Alarm Rate
Acceptable for the Appraiser                                     ≥ 90%          ≤ 2%       ≤ 5%
Marginally Acceptable for the Appraiser – May need improvement   ≥ 80%          ≤ 5%       ≤ 10%
Unacceptable for the Appraiser – Needs improvement               < 80%          > 5%       > 10%

Kappa                 Decision                                  Comments
0.75 < Kappa ≤ 1      Acceptable                                Indicates good to excellent agreement.
0.40 ≤ Kappa ≤ 0.75   May be acceptable for some applications   Decision should be based upon importance of application measurement, cost of measurement device, and cost of rework or repair. Acceptance shall be approved by the customer.
0 < Kappa < 0.40      Unacceptable                              Indicates poor agreement.
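The effectiveness, miss-rate, and false-alarm figures judged against the criteria above can be computed from an appraiser's calls versus the reference values. A minimal sketch using the common AIAG-style definitions (function and variable names are illustrative; the real workbook aggregates across all trials):

```python
def appraiser_metrics(calls, reference):
    """Effectiveness = correct calls / total calls.
    Miss rate = Not OK parts (ref = 0) accepted as OK.
    False alarm rate = OK parts (ref = 1) rejected as Not OK."""
    correct = sum(c == r for c, r in zip(calls, reference))
    on_bad = [c for c, r in zip(calls, reference) if r == 0]
    on_good = [c for c, r in zip(calls, reference) if r == 1]
    return {
        "effectiveness": correct / len(calls),
        "miss_rate": sum(on_bad) / len(on_bad),           # 1-calls on ref-0 parts
        "false_alarm_rate": on_good.count(0) / len(on_good),
    }

# Illustrative: five parts, one appraiser's calls vs. the reference
reference = [1, 1, 0, 1, 0]
calls     = [1, 1, 0, 0, 1]
```

With these illustrative numbers the appraiser would fall in the "Unacceptable" band of the criteria above: a miss (a bad part accepted) weighs most heavily, which is why its threshold (≤ 2%) is the tightest of the three.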

