The Assessment of Learning
Compiled & Prepared by: Sir Elmer Cuevas
CHAPTER I
EVALUATION, MEASUREMENT, TEST AND TESTING, ASSESSMENT
Evaluation
- involves a qualitative judgment to
see whether the outcomes of a
program are satisfactory with
reference to its objectives
Measurement
– process of determining the extent,
dimensions or quantity of something
(quantifying)
Test and Testing
Test – device or tool used to obtain data in
measurement
Testing – the administration of the assessment
tool and the use of the test results to determine
whether learners can be promoted or will be retained
and undergo a restudy of the same lesson.
In summary...
*The score from a test represents the
measurement; the interpretation of the
meaning and value of that score
represents evaluation.*
Assessment
– process of documenting, usually in measurable
terms
- collecting data on performance, then analyzing
and interpreting the data (e.g., rubric evaluations)
using statistical techniques to arrive at valid
results.
TYPES/PHASES/ROLES OF ASSESSMENT:
1. PLACEMENT ASSESSMENT – to determine the level of
the student
2. DIAGNOSTIC – conducted before instruction to
determine the entry performance
3. FORMATIVE (METACOGNITIVE) – conducted during
instruction to obtain ongoing feedback
4. SUMMATIVE – conducted after instruction to assess
attainment of objectives
What is Assessment?
The word ‘assess’ comes from the Latin
verb ‘assidere’ meaning ‘to sit with’.
ACTIVITY:
(1) Where am I going?
(2) Where am I now?
(3) How can I close the
gap?
These three types of assessment comprise the
Teaching/ Learning Road Map:
• Diagnostic assessment shows us where to begin
the journey.
• Formative assessment tells us if we’re on the right
path.
• Summative assessment shows us how far we’ve
come.
GUIDING PRINCIPLES
in the
Assessment of Learning
1. Assessment of
Learning is an integral
part of the teaching
and learning process
• Shall we do corrective measures,
remedial?
• Shall we proceed to the next
competency?
• Ultimate purpose: Check the
students’ understanding
What is the Assessment
Process?
[Diagram: the assessment cycle – Aims, Assessment, Adjustments, Action]
2. The assessment tool
should match the
performance objective
Performance Objective = Assessment Tool = Valid
Three domains of the
Instructional Objectives:
1. Cognitive
2. Psychomotor
3. Affective
3. The results of
assessment must be
fed back to the
learners
• We let our students know whether
they have attained the set learning
objectives by giving them the results
of the assessment
4. In assessing learning,
teachers must consider
learners’ learning styles and
multiple intelligences and so
must come up with a variety
of ways of assessing learning
5. To contribute to the building
of the culture of success in the
school, it is pedagogically
sound that in our assessment
techniques we give some
positive points along with not
so good ones
• Starting your critical evaluation of a
performance or project by accentuating the
positive points, and giving as suggestions the
not-so-good points that definitely need
improvement, will cushion the impact of the
critical evaluation
• Putting comments like “nicely put,
well done, fine idea, good point” on
students’ papers boosts their ego and
adds to their level of confidence
6. Emphasize
self-assessment
• “Assessment should not force
students to compete against one
another; any competition should be
between students and their own prior
performance.”
• If learning is a personal process, then
the pupil or student is in the best
position to measure his or her
performance or progress against the
benchmark.
• Their self-assessment, coupled with
our assessment of them, may give a
more complete or adequate picture of
how far or close they are to the
established criterion of success
7. If we believe that our task as
teachers is to teach all
pupils/students, and that it is
possible that all students, even those
from limited backgrounds, will have
access to opportunities and therefore
can achieve, then the bell curve
mentality must be abandoned
“bell curve
mentality must be
abandoned!”
8. Assessment of
Learning should never
be used as
punishment or as a
disciplinary measure
• Don’t give quizzes only because
students are noisy or misbehaving
• If we do those, students may find
assessment as punishment
9. Results of learning
assessment must be
communicated
regularly and clearly
to parents
• Let the parents know their children’s
performances
10. Emphasize real-world
application that favors
realistic performance over
out-of-context drill items.
11. To ensure learning,
do formative
assessment
• Formative examinations are given
before, during or after the lesson, but
generally DURING instruction.
• Summative examinations are given at
the end of each unit or certain period
of time.
12. To ensure reliability
of assessment results,
make use of multiple
sources
• Make use of written tests,
performance tests, portfolios and
observation
How to make a good
assessment?
QUIZ
Directions: Answer the following with 2-3
sentences only. Your answer will be graded as follows:
• Clarity and organization of thoughts – 30%
• Content and accuracy of the answers against the
expected answers – 50%
• Quality of ideas and grammar – 20%
A total of 5 points per item will be given as the highest
possible score.
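One way to read a weighted rubric like the one above is to rate each criterion out of the 5-point ceiling and take the weighted average. The sketch below is only an illustrative interpretation (the criterion names, the 0-5 rating scale, and the averaging rule are assumptions, not stated in the rubric itself):

```python
# Illustrative weighted scoring for the quiz rubric above.
# Assumption: each criterion is rated on a 0-5 scale, and the item score
# is the weighted average of the three ratings (weights sum to 1.0).

WEIGHTS = {"clarity": 0.30, "content": 0.50, "quality": 0.20}

def item_score(ratings):
    """ratings: dict of criterion -> rating on a 0-5 scale."""
    return sum(WEIGHTS[c] * ratings[c] for c in WEIGHTS)

# Example: strong content, average clarity and quality of ideas/grammar.
print(item_score({"clarity": 3, "content": 5, "quality": 3}))  # 4.0 out of 5
```

Under this reading, a perfect answer (5 on every criterion) still tops out at the stated 5-point maximum per item.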
Self-evaluation
Where would you place your assessment practice on the
following continuum?

The main focus is on:
• Quantity of work/presentation ……… Quality of learning
• Marking/grading ……… Advice for improvement
• Comparing students ……… Identifying individual progress
How to make a good
assessment?
FACTORS INHIBITING ASSESSMENT
• A tendency for teachers to assess quantity and
presentation of work rather than quality of learning.
• Greater attention given to marking and grading,
much of it tending to lower the self-esteem of students,
rather than providing advice for improvement.
• A strong emphasis on comparing students with each
other, which demoralizes the less successful learners.
ASSESSMENT
AS, OF, FOR
ASSESSMENT FOR LEARNING
- involves teachers using evidence about students'
knowledge, understanding and skills to inform
their teaching.
- Sometimes referred to as ‘formative
assessment', it usually occurs throughout the
teaching and learning process to clarify student
learning and understanding.
• reflects a view of learning in which assessment
helps students learn better, rather than just
achieve a better mark.
• involves formal and informal assessment
activities as part of learning and to inform the
planning of future learning.
• includes clear goals for the learning activity
• provides effective feedback that motivates the
learner and can lead to improvement
• reflects a belief that all students can improve
• encourages self-assessment and peer
assessment as part of the regular classroom
routines
• involves teachers, students and parents
reflecting on evidence
• is inclusive of all learners.
• Assessment for Learning happens during the
learning, often more than once, rather than at the
end.
• Students understand exactly what they are to
learn, what is expected of them and are given
feedback and advice on how to improve their
work.
ASSESSMENT AS LEARNING
- sometimes referred to as ‘metacognitive
assessment’
- occurs when students are their own assessors
- students monitor their own learning, ask
questions and use a range of strategies to
decide what they know and can do, and how to
use assessment information for new learning.
• encourages students to take responsibility for their
own learning
• requires students to ask questions about their learning
• involves teachers and students creating learning goals
to encourage growth and development
• provides ways for students to use formal and informal
feedback and self-assessment to help them
understand the next steps in learning
• encourages peer assessment, self-assessment and
reflection.
• Through this process students are able to learn about
themselves as learners and become aware of how
they learn – become metacognitive (knowledge of
one’s own thought processes).
• Assessment as learning helps students to take more
responsibility for their own learning and monitoring
future directions.
ASSESSMENT OF LEARNING
- assists teachers in using evidence of student learning
to assess achievement against outcomes and
standards.
- Sometimes referred to as ‘summative assessment',
it usually occurs at defined key points during a
unit of work or at the end of a unit, term or semester,
and may be used to rank or grade students.
• is used to plan future learning goals and
pathways for students
• provides evidence of achievement to the wider
community, including parents, educators, the
students themselves and outside groups
• provides a transparent interpretation across all
audiences.
• “Assessment of Learning is the assessment that
becomes public and results in statements or
symbols about how well students are learning.
It often contributes to pivotal decisions that will
affect students’ futures. It is important, then, that
the underlying logic and measurement of
assessment of learning be credible and
defensible.”
FOR / AS / OF

Nature of assessment:
• FOR – Diagnostic and/or Formative
• AS – Formative/Metacognitive
• OF – Summative

Used by:
• FOR – Teachers & Students
• AS – Students
• OF – Teachers

Use of information:
• FOR – (1) to determine what students already know and can do;
(2) to monitor students’ progress towards the expectations;
(3) to scaffold next steps for instruction and assessment
• AS – (1) self-assessment; (2) peer-assessment; (3) self-reflection
• OF – (1) to provide evidence of achievement to students, parents and educators
When it usually happens:
• FOR – Before and during instruction
• AS – During instruction
• OF – After instruction

Example assessment tools:
• FOR – Pre-test, diagnostic test, readiness test; recitations, short quiz, short-response essay
• AS – Self-assessment, peer-assessment, self-reflections
• OF – Long quiz, unit test, periodical exams, achievement test, standardized test
TRADITIONAL AND
NON-TRADITIONAL
ASSESSMENT
ASSESSMENT: TRADITIONAL vs. NON-TRADITIONAL
• Testing (having a test) – Alternative assessment
• Paper-and-pencil test – Performance
• Multiple-choice type – Supply
• Single correct answer (questions are convergent in nature) – Multiple or many correct answers (questions are divergent in nature)
• Summative – Formative
• Outcomes only – Process AND outcome
• Skill-focused (lots of restrictions) – Task-based (promotes self-paced activity)
• Isolated facts (only some are put into practice or practical use) – Application of knowledge
• External evaluation – Student self-evaluation
The
GARDEN
ANALOGY
SCOPE OF ASSESSMENT
1. Curricular offerings
2. School programs
3. Instructional materials
4. Instructional facilities
5. Graduates
6. School managers
7. Research
8. Students
9. Teachers
10. Extensions
Functions of Assessment:
Main – gauge learner performance/improvement (betterment)
Others:
- to maintain standards
- to select students
- to motivate learning
- to guide learning
- to furnish instruction
- to appraise instrumentalities
Functions of Assessment:
1. Measures students’ achievements
2. Evaluates instruction and teaching strategies
3. Assesses lessons to be re-taught
4. Evaluates school’s programs
5. Motivates learning
Functions of Assessment:
6. Predicts success or failure
7. Evaluates school’s facilities and resources
8. Evaluates school managers’ performance
9. Diagnoses the nature of difficulties
10. Evaluates teacher’s performance
FUNCTIONS OF
ASSESSMENT
1. MEASURES STUDENTS’
ACHIEVEMENTS
• By giving tests and assessing the results,
we can determine whether students have
attained the learning goals or not.
2. EVALUATES INSTRUCTION
AND TEACHING STRATEGIES
• Results of the test determine whether the
instruction and teaching strategies are
effective or not.
• If not, revisiting and considering
changes in them follows.
3. ASSESSES LESSONS TO BE
RE-TAUGHT
• If teaching is ineffective, as evidenced by
poor test results and supported by the
difficulty and discrimination indices of the
items on a particular topic, then that lesson
must be re-taught.
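The difficulty and discrimination indices mentioned above are conventionally computed as the proportion of examinees who answer an item correctly (difficulty) and the difference in that proportion between the upper- and lower-scoring groups (discrimination). A minimal sketch follows; the function name, the sample data, and the 27% grouping rule are illustrative assumptions, not part of the original material:

```python
# Item analysis sketch: difficulty and discrimination indices.
# `responses` is a list of (total_score, item_correct) pairs, one per
# examinee, where item_correct is 1 if this item was answered correctly.

def item_indices(responses, group_fraction=0.27):
    # Rank examinees by total test score, highest first.
    ranked = sorted(responses, key=lambda r: r[0], reverse=True)
    n = len(ranked)
    k = max(1, int(n * group_fraction))  # size of upper/lower groups

    # Difficulty index: proportion of ALL examinees who got the item right.
    difficulty = sum(item for _, item in ranked) / n

    # Discrimination index: proportion correct in the upper group
    # minus proportion correct in the lower group.
    upper = sum(item for _, item in ranked[:k]) / k
    lower = sum(item for _, item in ranked[-k:]) / k
    return difficulty, upper - lower

# Example: 10 examinees, (total score, correct on this item)
data = [(48, 1), (45, 1), (44, 1), (40, 1), (38, 0),
        (35, 1), (30, 0), (28, 0), (25, 0), (20, 0)]
p, d = item_indices(data)
print(f"Difficulty index: {p:.2f}, Discrimination index: {d:.2f}")
```

In general, an item with moderate difficulty and a positive discrimination index is working as intended; items with near-zero or negative discrimination are the ones that flag a lesson for re-teaching or an item for revision.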
4. EVALUATES SCHOOL’S
PROGRAMS
• Whether they are relevant, realistic, and
responsive to the needs of society.
• A matter of employability.
5. MOTIVATES LEARNING
• Individuals make use of the assessment
results, regardless of whether they are low
or high, to improve themselves.
6. PREDICTS SUCCESS OR
FAILURE
• As early as the very beginning of the
educative process, if a student seems to be
struggling and barely gets a passing mark,
he or she is most likely at risk of failing.
7. DIAGNOSES THE NATURE OF
DIFFICULTIES
• It helps in determining in what part or
subject the student is weak or having
difficulties.
8. EVALUATES TEACHERS’
PERFORMANCE
• Teachers’ effectiveness is reflected in the
performance of their students.
9. EVALUATES SCHOOLS’
FACILITIES AND RESOURCES
• The adequacy or inadequacy of a school’s
facilities and resources contributes greatly
to the achievement of its students and
graduates.
10. EVALUATES SCHOOL
MANAGERS’ PERFORMANCE
CHAPTER II
CHARACTERISTICS OF ASSESSMENT METHODS:
VALIDITY / RELIABILITY / PRACTICABILITY / JUSTNESS / MORALITY IN ASSESSMENT

CHARACTERISTICS OF AN
ASSESSMENT TOOL
VALIDITY
-Means the degree to which a test
measures what it intends to measure or
the truthfulness of the response.
-The validity of a test concerns what the test
measures and how well it does so.
VALIDITY
-A valid test is always valid
TYPES OF
VALIDITY
1. CONTENT VALIDITY
• The content or topic is truly
representative of the course.
• A well-constructed test must cover
the objectives of instruction, not just the
subject matter.
2. CRITERION VALIDITY
The degree to which the test agrees or
correlates with a criterion set up as an
acceptable measure.
Measures how well one measure predicts an
outcome for another measure. A test has this
type of validity if it is useful for predicting
performance or behavior in another situation
(past, present, or future).
Example:
• A job applicant takes a performance test
during the interview process. If this test
accurately predicts how well the employee
will perform on the job, the test is said to
have criterion validity.
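Criterion-related validity is usually quantified as the correlation between the test scores and the criterion measure (here, later ratings of job performance). A minimal sketch, assuming two equal-length lists of scores; the data values are invented for illustration:

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation between two equal-length score lists."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: performance-test scores at hiring vs.
# supervisor ratings of job performance six months later.
test_scores = [72, 85, 90, 60, 78, 95, 66, 88]
job_ratings = [3.1, 4.0, 4.5, 2.8, 3.6, 4.7, 3.0, 4.2]

print(f"Validity coefficient (Pearson r): {pearson_r(test_scores, job_ratings):.2f}")
```

The closer the coefficient is to 1.0, the stronger the evidence that the test predicts the criterion (predictive validity); the same computation applies to concurrent and postdictive validity, only the timing of the criterion measure changes.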
2.1. TYPES OF CRITERION
RELATED VALIDITY
• Postdictive Validity
• Concurrent Validity
• Predictive Validity
3. CONSTRUCT VALIDITY
• Extent to which the test measures a
theoretical trait. It measures what it
intends to measure.
4. FACE VALIDITY
• also called logical validity
• easiest form of validity
• often criticized as the weakest form of
validity
RELIABILITY
-Means the extent to which a test is
consistent and dependable. In other
words, the test agrees with itself.
-Concerned with consistency of
responses from moment to moment.
RELIABILITY
-A reliable test may not always
be valid.
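Reliability, as consistency of responses, can be estimated in several ways (test-retest, split-half, internal consistency). One common internal-consistency estimate is Cronbach's alpha, computed from the item variances and the variance of the total scores. A minimal sketch, with invented data; the function name and the 0/1 scoring are assumptions for illustration only:

```python
import statistics

def cronbach_alpha(item_scores):
    """item_scores: list of items, each a list of scores (one per examinee)."""
    k = len(item_scores)                 # number of items
    n = len(item_scores[0])              # number of examinees
    # Variance of each item across examinees.
    item_vars = [statistics.pvariance(item) for item in item_scores]
    # Variance of examinees' total scores.
    totals = [sum(item[j] for item in item_scores) for j in range(n)]
    total_var = statistics.pvariance(totals)
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical 4-item quiz answered by 6 examinees (1 = correct, 0 = wrong).
items = [
    [1, 1, 0, 1, 0, 1],
    [1, 1, 0, 1, 1, 1],
    [1, 0, 0, 1, 0, 1],
    [1, 1, 1, 1, 0, 1],
]
print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")
```

Higher alpha values indicate that the items "agree with themselves", i.e., the test responds consistently; a reliable test in this sense may still fail to be valid, as the slide above notes.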
CLASSIFICATION
OF ASSESSMENT
TOOLS
ACCORDING TO
TYPE OF LANGUAGE/SYSTEM USED
• VERBAL – uses words and may be oral or written
• NON-VERBAL – uses symbols, figures, numbers
and may be oral or written; ex. abstract reasoning
ACCORDING TO ORIGIN
• ORAL TEST – requires verbal answers
• WRITTEN TEST – paper-and-pencil test;
requires written answers
• PERFORMANCE TEST – non-verbal, non-
written; measures motor skills
ACCORDING TO
MANNER OF SCORING
• OBJECTIVE – answers are chosen from given
options and scored uniformly regardless of who the
scorer may be; ex. T-F, multiple choice, matching type
• SUBJECTIVE – scorer brings in his personal
judgment since the question is not specific; ex. Essay
ACCORDING TO
MANNER OF CONSTRUCTION
• Structured – questions are given within a
framework; ex. T-F, Multiple Choice
• Unstructured – does not follow prescribed
framework; ex. Essay, projective test
ACCORDING TO MANNER OF
ADMINISTRATION
• Individual test – given to one examinee at a
time
• Group test – given to many examinees at a
time
ACCORDING TO MANNER OF
ADMINISTRATION
• Timed test – under time pressure
• Untimed test – no time limit
ACCORDING TO
MANNER OF DESCRIBING
PERFORMANCE OF EXAMINEES
• Criterion-referenced test – describes the performance
of an examinee directly, without referring to the performance
of others, but compared to a given criterion
ACCORDING TO
MANNER OF DESCRIBING
PERFORMANCE OF EXAMINEES
• Norm-referenced test – measures performance of
examinee in comparison to a standard or norm (others’)
• Ipsative - is the practice of assessing present
performance against the prior performance of the
person being assessed.
ACCORDING TO
FUNCTION OR PURPOSE
• Achievement test – measures outcomes of teaching,
accomplishment of student’s school work in a given
period of time; measures general educational standing
• Aptitude test - measures future success in a given
area
ACCORDING TO
FUNCTION OR PURPOSE
• Intelligence/mental ability test – measures verbal,
numerical and abstract ability of a person in
comparison to another of the same sex, age or grade
• Personality test – measures
traits/behavior/interest/attitudes of a person
ACCORDING TO
FUNCTION OR PURPOSE
• Diagnostic test – measures strengths and
weaknesses in particular area
• Physical test – demands manual dexterity and skill
• Readiness test – determines readiness to learn and
level of preparation
ACCORDING TO
FUNCTION OR PURPOSE
• Power test – measures level of maximum ability
without any time limit
• Speed test – measures level of maximum ability with
time limit
• Scale test – items are arranged according to degree of
difficulty
ACCORDING TO
FUNCTION OR PURPOSE
• Projective test – indirect measure of personality traits
and innermost thoughts and feelings; ex. Ink-blot test
• Psychological test – measures person’s ability or
personality as developed by general experience
• Simulated test – makes a pretense or appearance of a
certain activity
ACCORDING TO
FUNCTION OR PURPOSE
• Formative test – evaluates student comprehension of
a particular lesson/topic; measures student progress
• Summative test – given at the end of instruction
• Placement test – determines the level of a child
CHAPTER III
TABLE OF SPECIFICATIONS
GUIDE QUESTIONS:
• What is a TOS or Table of Specifications?
• What is its importance?
• Is it always needed?
• How is a TOS constructed?
TABLE OF SPECIFICATIONS
• is a test plan or a test blueprint.
TABLE OF SPECIFICATIONS
• includes the objectives/topics,
number of test items, and the
number of days a topic was taught.
IMPORTANCE OF TABLE OF
SPECIFICATIONS
• It is used to help teachers
construct a test easily and ensure
validity as well.
TABLE OF SPECIFICATIONS
• Not all schools require a TOS, but
public schools always do.
HOW TO MAKE A TABLE OF
SPECIFICATIONS?
SAMPLE FORMAT OF TOS (columns):
1. TOPIC
2. LEARNING COMPETENCIES/OBJECTIVES
3. TAXONOMY LEVEL
4. NO. OF DAYS TAUGHT
5. NO. OF ITEMS FOR EACH TOPIC/OBJECTIVE
6. PLACEMENT/SPECIFIC NUMBER IN THE TEST
REMEMBER:
• Provide the topics/objectives you had for a
specific grading period.
• Identify at which level of Bloom’s Taxonomy of
Objectives each objective falls.
BLOOM’S TAXONOMY OF THE COGNITIVE DOMAIN
(from lowest to highest level):
1. Knowledge
2. Comprehension
3. Application
4. Analysis
5. Synthesis
6. Evaluation
REMEMBER:
• Specify the number of days each topic was
taught.
• The number of items for each topic/objective
refers to how many questions about that
topic/objective should be put in the test.
REMEMBER:
• To determine it, divide the number of days
each topic was taught by the total number of
days all the topics were taught, then multiply
by the total number of test items.
SAMPLE FORMAT OF TOS:
• Topic: Figures of speech
• Learning competency/objective: To identify what hyperbole and personification are
• Taxonomy level: Knowledge and recall
• No. of days taught: 1
• No. of items for the topic/objective: 3
(1 day ÷ 17 days ≈ 0.06; 0.06 × 50 items = 3 items)
• TOTAL: 17 days (total number of days all the topics were taught);
50 items (total number of test items)
REMEMBER:
• In case the total number of test items should be 50
but adding the numbers in column 5 gives 51, you
have to adjust the number of items for one of the
topics/objectives. For example, instead of 3 items,
you can make it 2 items only.
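The proportional allocation and the rounding adjustment described above can be expressed compactly. A minimal sketch, assuming topics are given as (name, days taught) pairs; the helper name, the sample topics, and the rule of trimming items from the largest topic when rounding overshoots are illustrative choices, not prescribed by the original:

```python
def allocate_items(topics, total_items):
    """topics: list of (topic_name, days_taught); returns {topic: no_of_items}."""
    total_days = sum(days for _, days in topics)
    # Proportional allocation: days / total_days * total_items, rounded.
    items = {name: round(days / total_days * total_items) for name, days in topics}

    # If rounding makes the column total overshoot (e.g., 51 instead of 50),
    # trim one item at a time from the topic currently holding the most items.
    while sum(items.values()) > total_items:
        biggest = max(items, key=items.get)
        items[biggest] -= 1
    # If it undershoots, add items to the topic taught for the most days.
    while sum(items.values()) < total_items:
        longest = max(topics, key=lambda t: t[1])[0]
        items[longest] += 1
    return items

# Example mirroring the sample TOS: 17 teaching days in all, 50 test items.
topics = [("Figures of speech", 1), ("Topic B", 6), ("Topic C", 5), ("Topic D", 5)]
print(allocate_items(topics, 50))   # "Figures of speech" gets about 3 items
```

With these made-up numbers the raw rounding yields 51 items, so one item is trimmed from the largest topic, exactly the kind of manual adjustment the slide describes.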
REMEMBER:
• The placement/specific number in the test refers
to the specific item numbers in the examination
to which the questions for each topic or objective
are assigned.

For example:
TOPIC LEARNING TAXONOMY NO. OF DAYS NO. OF ITEMS FOR PLACEMENT/
COMPETENCIES/O LEVEL TAUGHT EACH SPECIFIC
BJECTIVES TOPIC/OBJECTIVE NUMBER IN
THE TEST

Figure of To identify what Knowledge and 1 3 1-2


speech hyperbole and recall 25
personification are

TOTAL 17 (total number 50


of days all the
topics were (total number of test
taught) items)
REMEMBER:
• The placement/specific number in the test
column should match the number of items
for each topic/objective column.
ANSWER:
• What is a TOS or Table of Specifications?
• What is its importance?
• Is it always needed?
• How is a TOS constructed?
-END-
Let’s have some practice!
Ethics in Assessment
• confidentiality of assessment data
• sensitive questions
• harm
• deception
GENERAL CLASSIFICATIONS
OF ASSESSMENT:
1. traditional assessment
2. alternative assessment
• performance-based
• project-based
• portfolio
Assessing
Learning Outcomes
Goals
vs.
Objectives
GOALS: broad, general intention, intangible, abstract, long-term, hard to put in a timeline
OBJECTIVES: narrow, precise, tangible, concrete, short-term, with a timeline
Purposes of Instructional
Goals and Objectives
1. It provides directions for the
instructional process by
clarifying the intended learning
outcomes.
Purposes of Instructional
Goals and Objectives
2. It conveys instructional intent
to other stakeholders such as
students, parents, school
officials, and the public.
4 Main Things That
Objectives Should Specify
1. audience
2. observable behaviour
3. special conditions
4. criterion level
