Task 3

RYAN A. PANGALUAN

1. Stages of Test Construction


a. Planning
The first step in constructing a test is to determine the content and scope of the
test, as well as the manner in which test items should be developed. This process is
referred to as "planning a test". In the planning stage, one should determine the
materials upon which test items are to be based. For example, to construct a
grammar test, all grammatical elements that could potentially be included in the
test should be listed.
b. Preparing Test Items
Preparing test items is a very delicate activity that requires a great deal of care and
expertise. It is a dangerously oversimplified belief that every teacher is able to write
reasonably acceptable items.
c. Reviewing Test Items
It is a generally accepted principle that test construction is a collaborative activity.
An individual, no matter how expert he may be, is potentially subject to making
mistakes. Therefore, to minimize the pitfalls, test items should be reviewed by a
team of experts. These experts would critically examine the correspondence
between test content and the table of specifications. They would also consider
the form, level of difficulty, and the appropriateness of the items. After reviewing
the items, the experts would offer some subjective comments for modifications of
these items. After the test developer made necessary modifications, the first draft
of the test would be ready to go under the scrutiny of the pre-testing step.
d. Revising Test Items
This is the step where the test items are revised before testing in the pre-testing
step. In this step, the item which has already been corrected or reviewed, are re-
write. So that, in this step, the test item is called as semi final form.

2. Criteria of a Good Test


1. Relevance – extent to which it is necessary that students are able to perform the task.
2. Representativity – extent to which the task represents a real situation.
3. Authenticity – extent to which the situation and the interaction are meaningful and
representative in the world of the individual user.
4. Balance – extent to which each relevant topic/ability receives an equal amount of
attention.
5. Validity – extent to which the test effectively measures what it is intended to
measure.
6. Reliability – refers to the consistency and stability with which a test measures
performance.
3. Importance of Quantitative Analysis
QUANTITATIVE ANALYSIS - the process of discovering useful information by evaluating
data; quantitative data analysis can be defined as the process of analyzing data
that is number-based or that can easily be converted into numbers.
a. Finds out the proficiency level in the English language.
b. Finds out the effectiveness of different types of approaches, methods, and techniques.
c. Finds out the impact of examination on classroom teaching and learning.
d. It tells us what percent of any group responded to a particular feature or
intervention as compared to groups that did not receive the particular feature or
intervention.
e. It addresses the test validity to some extent which is the core concept of testing and
assessment.
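Point (d) above amounts to a simple percentage comparison between groups. A minimal sketch in Python, using invented response data for illustration:

```python
# 1 = responded to the feature/intervention, 0 = did not (hypothetical data)
treatment_group = [1, 1, 0, 1, 1, 0, 1, 1]
control_group = [0, 1, 0, 0, 1, 0, 0, 1]

def percent_responded(group):
    """Percentage of the group that responded."""
    return 100 * sum(group) / len(group)

# Compare the two groups' response rates
print(percent_responded(treatment_group))  # 75.0
print(percent_responded(control_group))    # 37.5
```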

4. Common Statistical Formulas


A. Descriptive statistics - are intended to offer a general idea about the test scores.
B. Correlations - illustrated by scatter plots which are similar to line graphs in that they
use horizontal and vertical axes to plot data points; serving a very specific purpose.
C. Item reliability - indicates the discriminatory potential of a test item.
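The discriminatory potential mentioned in (C) is often quantified with a discrimination index: the difference in proportion correct between high- and low-scoring students. A sketch with hypothetical response data and a top/bottom-third split:

```python
# 1 if the student answered this item correctly, 0 otherwise.
# Students are already sorted from highest to lowest total test score.
item_responses = [1, 1, 1, 1, 0, 1, 0, 0, 1, 0]

k = len(item_responses) // 3   # size of the upper and lower groups (top/bottom third)
upper = item_responses[:k]     # highest-scoring students
lower = item_responses[-k:]    # lowest-scoring students

# Discrimination index D: values near +1 mean the item separates strong
# from weak students well; values near 0 mean it does not discriminate.
d_index = (sum(upper) - sum(lower)) / k
print(d_index)
```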

Mean                x̄ = Σx / n
Median              If n is odd, M = the ((n + 1) / 2)th term.
                    If n is even, M = the average of the (n / 2)th and (n / 2 + 1)th terms.
Mode                The value which occurs most frequently.
Variance            σ² = Σ(x − x̄)² / n
Standard Deviation  S = σ = √( Σ(x − x̄)² / n )
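The formulas above can be sketched in Python with the standard-library `statistics` module; the test scores below are made up for illustration:

```python
import statistics

# Hypothetical test scores
scores = [70, 85, 85, 90, 60]
n = len(scores)

mean = sum(scores) / n              # x̄ = Σx / n
median = statistics.median(scores)  # middle value after sorting
mode = statistics.mode(scores)      # most frequent value

# Population variance: σ² = Σ(x − x̄)² / n
variance = sum((x - mean) ** 2 for x in scores) / n
std_dev = variance ** 0.5           # σ = square root of the variance

print(mean, median, mode, variance, std_dev)
```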

5. Purpose and Categories of Approaches of Qualitative Analysis


Purpose
- Not all topics in language or literature can be measured statistically.
Viewpoints, actions, and characteristics cannot always be represented numerically
and so need a qualitative approach.

Categories of Approaches
- Reflection - This approach aims at gaining insights into the thinking processes and
opinions of the test taker.
- Verbal Reports - Verbal reports or verbal protocols are a way of collecting data.
They offer an insight into the thought processes of informants.
- Diary Studies - Informants keep a diary, which allows researchers to gain insight
into their thoughts. Diaries are not often used in test validation research, but they
have proven their worth in research into learning processes.

6. Techniques for Monitoring Student Progress


1. Curriculum-Based Monitoring Tests
Each child is tested briefly each week. The tests generally last from 1 to 5
minutes. The teacher counts the number of correct and incorrect responses made in the
time allotted to find the child's score. For example, in reading, the child may be asked to
read aloud for one minute.
2. Observation and Interaction
Teacher observations should be direct, intentional and systematic. Besides, it is
essential to capture the events of the classroom as accurately and objectively as
possible and not only to make a record of impressions.
3. Frequent evaluations
When you only give one test at the end of the year or semester and the student
gets bad grades, he will be demotivated and think that he just doesn't get "it". By
giving more evaluations, students have the opportunity to grow. One bad test will not
have that much of an impact on motivation when the others are better and when the
student knows he can still catch up.
4. Formative assessment
Refers to a wide variety of methods that teachers use to conduct in-process
evaluations of student comprehension, learning needs, and academic progress during a
lesson, unit, or course.
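The scoring step in the first technique above, counting correct and incorrect responses in a timed probe, can be sketched as follows (the one-minute reading probe data are hypothetical):

```python
# One-minute oral reading probe: each word the child read, flagged correct or not.
probe = [
    ("the", True), ("cat", True), ("sat", True),
    ("on", False), ("the", True), ("mat", True),
]

correct = sum(1 for _, ok in probe if ok)
incorrect = len(probe) - correct

minutes = 1                # length of the timed probe
wcpm = correct / minutes   # words correct per minute: the child's score

print(correct, incorrect, wcpm)
```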
