
53-1106-006-4

53-1106-020-5

The analysis reports four levels of agreement:
1. Within Appraisers
2. Each Appraiser vs Standard
3. Between Appraisers
4. All Appraisers vs Standard

Attribute data, binary (Accept/Reject), 2 appraisers, 2 trials, 30 parts.

- The 30 parts were presented in a randomized run order generated with Minitab's Run Order Work Sheet.
- Each appraiser rated every part as Accept (A) or Reject (R) following that run order, and the ratings were entered into Minitab for analysis.
- Two appraisers took part in the study: Appraiser I and Appraiser II.
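As a rough illustration of the randomization step (not Minitab's actual routine), here is a Python sketch that builds a run-order worksheet for 30 parts, 2 trials, and 2 appraisers; all names and the seed are arbitrary:

```python
import random

PARTS = list(range(1, 31))            # 30 parts, numbered 1..30
APPRAISERS = ["Appraiser I", "Appraiser II"]
TRIALS = 2

# One randomized presentation order per appraiser per trial,
# mimicking the worksheet Minitab generates for the study.
random.seed(1)                        # fixed seed so the sheet is reproducible
worksheet = []
for appraiser in APPRAISERS:
    for trial in range(1, TRIALS + 1):
        order = PARTS[:]
        random.shuffle(order)         # parts arrive in random order
        for run, part in enumerate(order, start=1):
            worksheet.append((appraiser, trial, run, part))

for row in worksheet[:5]:             # preview the first few runs
    print(row)
```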

[Minitab graph: Assessment Agreement (header fields: Date of study, Reported by, Name of product, Misc). Two panels, Within Appraisers and Appraiser vs Standard, plotting Percent matched with 95.0% CI for Appraisers 1 and 2.]
Attribute Agreement Analysis for Assessment

Within Appraisers

Assessment Agreement

Appraiser  # Inspected  # Matched  Percent           95% CI
1                   30         30   100.00  (90.50, 100.00)
2                   30         24    80.00   (61.43, 92.29)

# Matched: Appraiser agrees with him/herself across trials.

Fleiss' Kappa Statistics

Appraiser  Response  Kappa  SE Kappa        Z  P(vs > 0)
1          A           1.0  0.182574  5.47723     0.0000
1          R           1.0  0.182574  5.47723     0.0000
2          A           0.6  0.182574  3.28634     0.0005
2          R           0.6  0.182574  3.28634     0.0005

Within Appraisers: Appraiser 1 agrees with himself perfectly across the two trials (Kappa = 1). Appraiser 2 matches himself on only 24 of 30 parts, and Kappa = 0.6 < 0.7, so his repeatability is not acceptable.
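To see where these figures come from, here is a minimal Python sketch computing percent matched and Fleiss' kappa for one appraiser's two trials, using statsmodels' fleiss_kappa; the trial ratings below are hypothetical stand-ins, not the study's raw data.

```python
import numpy as np
from statsmodels.stats.inter_rater import fleiss_kappa

# Hypothetical two-trial ratings for one appraiser (A = Accept, R = Reject);
# the study's raw ratings are not reproduced here.
trial1 = list("AARRAARRAARRAARRAARRAARRAARRAA")
trial2 = list("AARRAARRAARRAARRAARRAARRAARRAA")

matched = sum(t1 == t2 for t1, t2 in zip(trial1, trial2))
print(f"# Matched: {matched}/30 = {100 * matched / 30:.2f}%")

# Fleiss' kappa takes a subjects x categories table of rating counts:
# for each part, how many of the 2 trials said A and how many said R.
table = np.array([[(t1, t2).count("A"), (t1, t2).count("R")]
                  for t1, t2 in zip(trial1, trial2)])
print("Fleiss' kappa:", fleiss_kappa(table, method="fleiss"))
```

With two identical trials, as here, this prints 100% matched and Kappa = 1, mirroring Appraiser 1's row.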

Each Appraiser vs Standard

Assessment Agreement

Appraiser  # Inspected  # Matched  Percent          95% CI
1                   30         18    60.00  (40.60, 77.34)
2                   30         10    33.33  (17.29, 52.81)

# Matched: Appraiser's assessment across trials agrees with the known standard.

Assessment Disagreement

Appraiser  # R / A  Percent  # A / R  Percent  # Mixed  Percent
1                5    31.25        7    50.00        0     0.00
2                7    43.75        7    50.00        6    20.00

# R / A: Assessments across trials = R / standard = A.
# A / R: Assessments across trials = A / standard = R.
# Mixed: Assessments across trials are not identical.

Fleiss' Kappa Statistics

Appraiser  Response      Kappa  SE Kappa         Z  P(vs > 0)
1          A          0.185520  0.129099   1.43703     0.0754
1          R          0.185520  0.129099   1.43703     0.0754
2          A         -0.134594  0.129099  -1.04256     0.8514
2          R         -0.134594  0.129099  -1.04256     0.8514

Each Appraiser vs Standard: Appraiser 1 agrees with the standard on 18 of 30 parts (60%). He consistently rejected 5 good parts (R/A, Type I error) and consistently accepted 7 bad parts (A/R, Type II error), giving Kappa = 0.185. Appraiser 2 agrees with the standard on only 10 parts (33.33%), with 7 Type I errors, 7 Type II errors, and 6 mixed ratings, giving Kappa = -0.134.
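The R/A, A/R, and Mixed counts follow directly from the footnote definitions above. A short Python sketch, again on hypothetical ratings:

```python
# Hypothetical data: the known standard plus one appraiser's two trials.
standard = list("AARRAARRAARRAARRAARRAARRAARRAA")
trial1   = list("ARRRAARRAARRAARRAARRAARRAARRAA")
trial2   = list("ARRAAARRAARRAARRAARRAARRAARRAA")

r_over_a = a_over_r = mixed = 0
for std, t1, t2 in zip(standard, trial1, trial2):
    if t1 != t2:
        mixed += 1                 # the two trials disagree with each other
    elif t1 == "R" and std == "A":
        r_over_a += 1              # consistently rejected a good part (Type I)
    elif t1 == "A" and std == "R":
        a_over_r += 1              # consistently accepted a bad part (Type II)

print(f"# R / A: {r_over_a}   # A / R: {a_over_r}   # Mixed: {mixed}")
```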

Between Appraisers

Assessment Agreement

# Inspected  # Matched  Percent          95% CI
         30         12    40.00  (22.66, 59.40)

# Matched: All appraisers' assessments agree with each other.

Fleiss' Kappa Statistics

Response     Kappa   SE Kappa        Z  P(vs > 0)
A         0.259259  0.0745356  3.47833     0.0003
R         0.259259  0.0745356  3.47833     0.0003

Between Appraisers: the two appraisers gave identical ratings on only 12 of 30 parts (40%). Kappa = 0.259 < 0.7, so agreement between the appraisers is not acceptable.
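For the between-appraisers statistic, each part carries four ratings (2 appraisers x 2 trials), which can all be treated as raters of the same part when computing Fleiss' kappa. A sketch under that assumption, on random hypothetical data:

```python
import numpy as np
from statsmodels.stats.inter_rater import fleiss_kappa

# Hypothetical ratings: rows = 30 parts, columns = 4 ratings per part
# (appraiser 1 trials 1-2, appraiser 2 trials 1-2).
rng = np.random.default_rng(0)
ratings = rng.choice(["A", "R"], size=(30, 4))

# Count A's and R's per part, then feed the 30 x 2 table to Fleiss' kappa.
table = np.column_stack([(ratings == "A").sum(axis=1),
                         (ratings == "R").sum(axis=1)])
print("Between-appraisers Fleiss' kappa:", fleiss_kappa(table, method="fleiss"))
```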

All Appraisers vs Standard

Assessment Agreement

# Inspected  # Matched  Percent         95% CI
         30          6    20.00  (7.71, 38.57)

# Matched: All appraisers' assessments agree with the known standard.

Fleiss' Kappa Statistics

Response      Kappa   SE Kappa         Z  P(vs > 0)
A         0.0254632  0.0912871  0.278935     0.3901
R         0.0254632  0.0912871  0.278935     0.3901

All Appraisers vs Standard: the two appraisers agreed with the standard on only 6 of 30 parts (20%). Kappa = 0.025 < 0.7, so the measurement system as a whole does not reliably reproduce the standard.
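The 95% CI columns are consistent with exact (Clopper-Pearson) binomial intervals; the sketch below reproduces the (7.71, 38.57) interval for 6 matches out of 30, though the convention Minitab uses at 0% or 100% matched may differ slightly.

```python
from scipy.stats import beta

def clopper_pearson(x, n, alpha=0.05):
    """Exact two-sided binomial CI for x successes in n trials."""
    lo = beta.ppf(alpha / 2, x, n - x + 1) if x > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, x + 1, n - x) if x < n else 1.0
    return lo, hi

lo, hi = clopper_pearson(6, 30)      # 6 of 30 parts matched the standard
print(f"95% CI: ({100 * lo:.2f}, {100 * hi:.2f})")   # -> (7.71, 38.57)
```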

Summary of Assessment Disagreement with Standard

                    Appraiser 1       Appraiser 2
Sample  Standard   Count  Percent    Count  Percent
     1         A       0     0.00        2   100.00
     2         A       0     0.00        2   100.00
     3         R       0     0.00        0     0.00
     4         R       2   100.00        2   100.00
     5         A       2   100.00        0     0.00
     6         A       2   100.00        2   100.00
     7         R       0     0.00        0     0.00
     8         R       0     0.00        0     0.00
     9         A       0     0.00        2   100.00
    10         A       2   100.00        0     0.00
    11         R       0     0.00        2   100.00
    12         R       0     0.00        2   100.00
    13         A       0     0.00        2   100.00
    14         A       0     0.00        2   100.00
    15         R       2   100.00        2   100.00
    16         R       2   100.00        2   100.00
    17         A       2   100.00        2   100.00
    18         A       0     0.00        1    50.00
    19         R       2   100.00        0     0.00
    20         R       2   100.00        2   100.00
    21         A       0     0.00        0     0.00
    22         A       0     0.00        1    50.00
    23         R       2   100.00        1    50.00
    24         R       2   100.00        1    50.00
    25         A       2   100.00        0     0.00
    26         A       0     0.00        1    50.00
    27         R       0     0.00        0     0.00
    28         R       0     0.00        2   100.00
    29         A       0     0.00        0     0.00
    30         A       0     0.00        1    50.00

In this summary, Count is the number of an appraiser's two trials that disagreed with the standard, and Percent = Count / 2 x 100%: 0% means both trials matched the standard, 100% means both trials disagreed (e.g., samples 4 and 6 for Appraiser 1), and 50% means the two trials split.
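A last Python sketch (hypothetical trial data again) that tabulates Count and Percent per sample the way the summary above does:

```python
# Hypothetical data: the known standard plus one appraiser's two trials.
standard = list("AARRAARRAARRAARRAARRAARRAARRAA")
trial1   = list("AARAAARRAARRAARRAARRAARRAARRAA")
trial2   = list("AARRAARRAARRAARRAARRAARRAARRAR")

print("Sample  Standard  Count  Percent")
for i, (std, t1, t2) in enumerate(zip(standard, trial1, trial2), start=1):
    count = (t1 != std) + (t2 != std)   # trials disagreeing with the standard
    print(f"{i:6d}  {std:>8}  {count:5d}  {100 * count / 2:7.2f}")
```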
