
Republic of the Philippines

NUEVA VIZCAYA STATE UNIVERSITY


Bayombong, Nueva Vizcaya
INSTRUCTIONAL MODULE
IM NO.: IM-PROFED6-B-1STSEM-2023-2024

COLLEGE OF TEACHER EDUCATION


Bayombong Campus

DEGREE PROGRAMS BPE and BTLE* COURSE NO. PROF ED 6-B


SPECIALIZATION *IA, ICT, HE, AFA COURSE TITLE Assessment in Learning 1
YEAR LEVEL 3 and 2 TIME FRAME 3 Hrs. WK NO. 12 IM NO. 10

I. UNIT 2: TEST CONSTRUCTION

II. LESSON TITLE: ITEM ANALYSIS

III. LESSON OVERVIEW

There is a need to determine how good a test item is, especially when developing a standardized
test. The difficulty (facility) and discrimination indexes of each item need to be established, along with
the reliability of the test.

In this lesson, after the students construct their own tests based on their instructional
objectives/learning outcomes and table of specifications, they administer the test and subject the
multiple-choice items to item analysis.

Thus, the students experience how to determine the difficulty and discrimination indexes
of the test items and decide whether to retain, revise, or reject them.

IV. DESIRED LEARNING OUTCOMES

After constructing and administering their own tests, the students must be able to:
1. conduct item analysis;
2. analyze results; and
3. recommend appropriate actions.

V. LESSON CONTENT

ITEM ANALYSIS
Item analysis is the process of examining the pupils’ responses to each test item. One method
that can be used for item analysis is the U-L Method (Stocklein, 1957).

The steps are:


1. Score the papers and rank them from highest to lowest according to total score.
2. Separate the top 27% and the bottom 27% of the papers.
3. Tally the responses made to each test item by each individual in the upper 27% group.
4. Tally the responses made to each test item by each individual in the lower 27% group.
5. Compute the percentage of the upper group that got the item right and call it U.
6. Compute the percentage of the lower group that got the item right and call it L.
7. Average the U and L percentages; the result is the difficulty index of the item.
8. Subtract the L percentage from the U percentage; the result is the discrimination index.
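The steps above can be sketched in code. This is a minimal illustration, not part of the module: the function name and the shape of the inputs (a list of total scores and a parallel list telling whether each pupil answered the item under analysis correctly) are assumptions made for the example.

```python
# Sketch of the U-L method (Stocklein, 1957). Names and input shapes are
# illustrative assumptions, not prescribed by the module.

def ul_item_analysis(scores, correct):
    """Return (difficulty_index, discrimination_index) for one item.

    scores  -- each pupil's total test score
    correct -- whether each pupil got this item right (same order as scores)
    """
    n = len(scores)
    k = round(n * 0.27)                       # size of each 27% group
    # Rank pupils from highest to lowest total score
    ranked = sorted(range(n), key=lambda i: scores[i], reverse=True)
    upper, lower = ranked[:k], ranked[-k:]
    u = sum(correct[i] for i in upper) / k    # U: proportion right, upper group
    l = sum(correct[i] for i in lower) / k    # L: proportion right, lower group
    return (u + l) / 2, u - l                 # difficulty, discrimination
```

With invented data where the three highest scorers all get the item right and everyone else gets it wrong, `ul_item_analysis` returns a difficulty index of .50 and a discrimination index of 1.0.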

After the item analysis, this table of equivalents can be used in interpreting the difficulty index:
.00 - .20 = Very difficult
.21 - .80 = Moderately Difficult
.81 - 1.00 = Very Easy
“In accordance with Section 185, Fair Use of Copyrighted Work of Republic Act 8293, the copyrighted works included in this material may be reproduced for educational purposes
only and not for commercial distribution.”
NVSU-FR-ICD-05-00 (081220)
Page 1 of 4

Other Tables of Equivalents:

A. Difficulty Index

Index           Descriptive Index       Recommendation/Remarks

.91 and above   Very Easy               not acceptable; revise or discard
.76 - .90       Easy                    limited acceptability
.26 - .75       Moderately Difficult    very acceptable; include all
.11 - .25       Difficult               limited acceptability
.10 & below     Very Difficult          include only one or two; revise/discard others

B. Discrimination Index

.41 & above     High        very acceptable; include all
.20 - .40       Moderate    acceptable; include all
.19 & below     Low         limited acceptability; include only one or two,
                            revise or discard others

Other Terminologies:

Difficulty Index – the percentage of the pupils who got the item right. It can also be interpreted as how
easy or how difficult an item is.

Discrimination Index – the degree to which an item separates the bright pupils from the poor ones. Thus,
a good test item discriminates between the bright and the poor pupils.

Example:
Item    Upper   Lower   U       L       Diff.   Disc.   Remark
No.     27%     27%                     Index   Index
1       14      12      .88     .75     .81     .13     Revise
2       10      6       .63     .38     .50     .25     Retain
3       6       14      .38     .88     .63     -.50    Reject
4       11      7
5       9       2
6       12      6
7       13      4
8       3       13
9       13      12
10      8       6

No. of pupils tested: 60


27% of 60 = 16.2 ≈ 16

U = Upper correct/group size = 14/16 = .875 ≈ .88

L = Lower correct/group size = 12/16 = .75
Difficulty Index = (.875 + .75)/2 = .8125 ≈ .81
Discrimination Index = .875 - .75 = .125 ≈ .13
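The arithmetic for item 1 can be checked in a few lines. The variable names below are illustrative; the numbers come from the worked example above.

```python
# Verifying the worked example for item 1: 14 of the upper group and
# 12 of the lower group (16 pupils each, 27% of 60) got the item right.
n_group = 16
u = 14 / n_group            # 0.875, reported in the module as .88
l = 12 / n_group            # 0.75
difficulty = (u + l) / 2    # average of U and L
discrimination = u - l      # U minus L
print(difficulty, discrimination)   # 0.8125 0.125, rounded to .81 and .13
```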

Analyzing and Using Test Item Data


Purposes of Item Analysis
• To select the best available items for the final draft of the test
• To identify structural or content defects in the items
• To detect learning difficulties of the class as a whole
• To identify the areas of weaknesses of students in need of remediation

Main Elements in an Item Analysis


• Examination of the difficulty level of the items
• Determination of the discriminating power of each item
• Examination of the effectiveness of distracters in multiple-choice or matching items

Examining Distractor Effectiveness


An ideal item is one that all students in the upper group answer correctly and all students in the
lower group answer incorrectly. Moreover, the responses of the lower group should be evenly distributed
among the incorrect alternatives.
Two procedures in examining the effectiveness of the distracters in a multiple-choice item:

• Count the answers to the different distracters for the upper and lower groups.
✓ Good distracters are those chosen more frequently by students from the lower group.
✓ When a particular distracter is selected more frequently by those from the upper group, the
teacher has to revise it.
✓ The analysis of distracters may also reveal some chosen by no one in either of the two
groups of students. Such distracters should therefore be revised to make them more useful.
✓ An illustration of the foregoing analysis is shown below:

Item No.    Group       Responses
                        A   B   C   D   E
1           Upper       0   3   1   12  5
            Lower       0   7   0   7   6
D = correct answer
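The tallies in the illustration can be run through the rules just stated: flag a distracter for revision if nobody in either group chooses it, or if the upper group chooses it at least as often as the lower group. This is a sketch of that procedure, not part of the module; the dictionary layout is an assumption.

```python
# Distracter check applied to the illustration above (key = D).
key = "D"
upper = {"A": 0, "B": 3, "C": 1, "D": 12, "E": 5}   # upper-group tallies
lower = {"A": 0, "B": 7, "C": 0, "D": 7, "E": 6}    # lower-group tallies

remarks = {}
for option in upper:
    if option == key:
        continue                                     # skip the correct answer
    if upper[option] == 0 and lower[option] == 0:
        remarks[option] = "revise (chosen by no one)"
    elif upper[option] >= lower[option]:
        remarks[option] = "revise (attracts the upper group)"
    else:
        remarks[option] = "keep (functioning distracter)"
print(remarks)
```

Here A is flagged because no one chose it, C because the upper group chose it more often than the lower group, while B and E are functioning distracters.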

• Another method of analyzing the effectiveness of distracters is by determining the mean score of
the students who respond to each distracter, as well as that of those who choose the correct
answer. As pointed out by Downie and Heath (1984), good distracters have lower mean scores
than those related to the correct response.

VI. LEARNING ACTIVITIES / ASSIGNMENT

Individual Activity: Compute the difficulty/facility and discrimination indexes and give your
remark on whether to retain, revise, or reject each of the items.

Item    Upper   Lower   U       L       Diff.   Disc.   Remark
No.     27%     27%                     Index   Index
1       20      18
2       19      12
3       17      11
4       10      20
5       21      11

No. of students tested: 100


27% of 100 = 27

VII. EVALUATION

Classroom Participation and/or Quiz

Portfolio/Project Entry. A major output is required of the students, the content of which starts from the
formulation of instructional objectives to the making of the TOS and test, then administration, item
analysis, scoring, and application of the statistical tools, up to the computation of grades.

The test constructed by the students will be submitted for the teacher’s input/editing. After it is
returned, the students will revise/improve the test, administer it, score it, gather the needed data,
and do the item analysis.

VIII. REFERENCES

Teaching Guide in Assessment in Learning 1


Other references indicated in the Course Outline

