
ASSESSMENT EDUCATION 2.

Measurement – quantification; numbers
Assessment – gathered information
Evaluation – judgments
Test – tool; instrument
----------------------------------------------------------------------
TYPES OF EVALUATION
1. Diagnostic evaluation (Pre-assessment) – given before instruction to assess prior knowledge of a particular topic or lesson.
2. Formative evaluation – given during the instructional process to provide feedback to students and teachers on how well students are learning the lesson being taught; pretests, homework, seatwork, and classroom questions. Results are neither recorded nor graded but are used for modifying or adjusting instruction.
3. Summative evaluation – given after instruction; measures achievement for grading purposes; mostly based on cognitive knowledge, as expressed through test scores and written outputs. Ex: chapter/unit/periodical/achievement tests and graded homework/projects. Results are used not only for judging student achievement but also for judging the effectiveness of the teacher and the curriculum.
----------------------------------------------------------------------
3 GENERAL TYPES OF CLASSROOM ASSESSMENT
1. Official assessment – graded; mainly cognitive
2. Sizing-up assessment – social, academic, and behavioral; personality profile; done at the beginning
3. Instructional assessment – teaching and learning; daily; lessons to teach, strategy, and instructional materials
----------------------------------------------------------------------
2 BASIC METHODS OF COLLECTING ASSESSMENT DATA
1. Paper-and-pencil technique
a. Supply type – students provide the answer; book report, essay question, class project, and journal entry
b. Selection type – students choose the correct answer; multiple choice, matching test, alternate response test
2. Observation – students perform
Test – a formal and systematic procedure for measuring knowledge, skills, and values.

SOURCES OF EVALUATIVE INFORMATION
1. Cumulative Record – all info collected on students over the years; stored in the principal's or guidance office; vital statistics, academic records, health records, family data; aptitude, intelligence, and achievement tests; anecdotal and behavioral comments from previous teachers.
2. Personal Contact – teacher's daily interactions, observation, conversation; oral reading; answering questions; following directions; seatwork; interest in the subject.
3. Analysis – student errors; learning difficulties; file samples of students' work for discussion during parent-teacher conferences.
4. Open-ended Themes and Diaries – students write about their ideas, concerns, and feelings; analysis of students' diaries gives valuable evaluative information.
5. Conferences – w/ parents, previous teachers, and guidance counselors; info on why students are experiencing academic problems and difficulties, and techniques for correcting them.
6. Testing – measures cognitive achievement, attitudes, values, feelings, and motor skills; most common measurement technique.

2 APPROACHES TO EVALUATION
1. Norm-referenced evaluation – performance is compared w/ the performance of other students who took the same examination.
2. Criterion-referenced evaluation – performance is compared against a standard/criteria.

USES OF TESTS
1. School administrators – promotion or retention of students; curriculum improvement.
2. Supervisors – identifying teachers' weaknesses and learning competencies not mastered by students; baseline data on curriculum revision.
3. Teachers – effectiveness of instruction; giving feedback to students about their progress; and assigning grades.
4. Parents – how well their children are faring in school and how well the school is doing its share in educating their children.

TYPES OF TESTS
1. As to mode of response
a. Oral Test
b. Written Test
c. Performance Test
2. As to ease of quantification of response
a. Objective Test — paper-and-pencil test; requires a specific response; convergent.
b. Subjective Test — paper-and-pencil test; not easily quantified; students have the freedom to write their answer, such as in an essay test; divergent.
3. As to mode of administration
a. Individual Test — one student at a time.
b. Group Test — a group of students simultaneously.
4. As to test constructor
a. Standardized Test — prepared by an expert or specialist; uniform procedures/time limits. Results are interpreted based on specified norms or standards.
b. Unstandardized Test — prepared by teachers for use in the classroom.
5. As to mode of interpreting results
a. Norm-referenced Test — compares performance with that of a group on the same test.
b. Criterion-referenced Test — measures performance against an agreed-upon or pre-established level of performance.
6. As to the nature of the answer
a. Personality Test — emotional and social adjustment; dominance and submission; value orientation; disposition; emotional stability; frustration level; and degree of introversion or extroversion.
b. Intelligence Test — mental ability.
c. Aptitude Test — likelihood of an individual's success in a learning area or field.
d. Achievement Test — what has been learned.
e. Diagnostic Test — specific strengths and weaknesses in past and present learning.
f. Formative Test — given to improve teaching and learning while it is going on; ex: a test after teaching the lesson for the day.
g. Summative Test — given at the end of instruction to determine learning and assign grades.
h. Socio-metric Test — likes and dislikes, preferences, social acceptance, social relationships existing in a group.
i. Trade Test — skill or competence in an occupation or vocation.

ASSESSING COGNITIVE LEARNING

A. OBJECTIVE TEST – only one answer to each item.

1. Supply Types of Objective Test – the student constructs his/her own answer to each question.
a. Completion Drawing Type – an incomplete drawing to be completed.
b. Completion Statement Type – an incomplete sentence to be completed by filling in the blank.
c. Correction Type – a sentence w/ an underlined word or phrase that has to be replaced to make it right.
d. Identification Type – a brief description; the student has to identify what it is.
e. Simple Recall Type – a direct question; answered using a word or phrase.
f. Short Explanation Type – similar to an essay test but requires a shorter answer.

2. Selection Types of Objective Test
a. Arrangement Type — terms or objects to be arranged in a specified order.
b. Matching Type — a list of numbered items related to a list of lettered choices.
c. Multiple Choice Type — a question, problem, or unfinished sentence followed by several responses.
d. Alternate Response Type — only two possible answers to the question; true-false, agree-disagree.
e. Key List Test — concepts based on a specified set of criteria.
f. Interpretive Exercise — a form of multiple choice type; provides some information or data followed by a series of questions on that information; students have to analyze, interpret, or apply the material provided, like a map, excerpt of a story, passage of a poem, data matrix, table, or cartoon.
B. ESSAY TEST – presents a problem or question; the student has to compose a response in paragraph form, using his/her own words and ideas.
a. Brief or Restricted Essay Test — requires limited writing; the given problem is to be solved in a few sentences.
b. Extended Essay Test — requires the student to present the answer in several paragraphs or pages of writing; more freedom to express ideas and opinions.

ASSESSMENT OF LEARNING IN THE COGNITIVE DOMAIN

Benjamin Bloom's Taxonomy
1. Knowledge: recognizing and remembering – Lowest
2. Comprehension: understanding
3. Application Level: do or apply what has been learned
4. Analysis Level: critical thinking; breaking material into its parts (pagsusuri)
5. Synthesis Level: creative thinking; combining; building
6. Evaluation Level: judging the value or worth – Highest

PREPARING FOR ASSESSMENT OF COGNITIVE LEARNING
1. What Should Be Tested.
2. How to Gather Information About What to Test.
3. How Long the Test Should Be.
4. How Best to Prepare Students for Testing.

ATTRIBUTES OF A GOOD TEST AS AN ASSESSMENT TOOL
1. Validity — a test measures what it seeks to measure.
2. Reliability — accuracy/consistency; a test is reliable if it produces similar results when used repeatedly. A test may be reliable but not necessarily valid. On the other hand, a valid test is always a reliable one.
3. Objectivity — extent to w/c personal biases or subjective judgment is eliminated.
4. Scorability — easy to score or check, as an answer key and answer sheet are provided.
5. Administrability — easy to administer; clear instructions are provided to students, proctors, and scorers.
6. Relevance — test items should be directly related to course objectives and actual instruction. Relevance is considered a major contributor to test validity.
7. Balance — degree to w/c the proportion of items testing particular outcomes corresponds to the ideal test. The framework of the test is outlined by a table of specifications.
8. Efficiency — number of meaningful responses per unit of time. A compromise has to be made among the available time for testing, scoring, and relevance.
9. Difficulty — appropriate level for the group being tested; norm-referenced – reliable if each item is passed by half of the students; criterion-referenced – difficulty can be judged relative to the percentage passing before and after instruction.
10. Discrimination — norm-referenced – the ability of an item to discriminate is generally indexed by the difference between the proportion of good and poor students who respond correctly; criterion-referenced – pretest and posttest differences show the ability of the test or item to distinguish competent from less competent students.
11. Fairness — equal chance to demonstrate their knowledge or skills.

COMPLETION ITEMS
contains a blank, w/c must be filled in correctly with one word or a short phrase; the blank should be placed near or at the end of the sentence.

ARRANGEMENT ITEMS
used for testing knowledge of sequence and order; alphabetical arrangement, chronological events, steps, jumbled words, letters in a word, etc.

COMPLETION-DRAWING ITEMS
an incomplete drawing is presented w/c has to be completed.

CORRECTION ITEMS
similar to the completion item, except that some words or phrases have to be changed to make the sentence correct.

IDENTIFICATION ITEMS
an unknown specimen is to be identified by name or other criterion.

ENUMERATION ITEMS
the student has to list down parts or elements/components of a given concept or topic.

ANALOGY ITEMS
consists of a pair of words which are related to each other; measures the skill in sensing the association between paired words or concepts.

INTERPRETIVE TEST ITEM
often used in testing higher cognitive behavior; may involve analysis of maps, figures, or charts or even comprehension of written passages.

TABLE OF SPECIFICATIONS (TOS) – the teacher's blueprint in constructing a test; valuable to teachers for 2 reasons: First, it helps teachers decide on what to include/leave out in a test. Second, it helps them determine how much weight to give to each topic covered and objective to be tested.
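To make the TOS weighting idea above concrete, here is a minimal Python sketch (not from the source) that allocates test items to topics in proportion to the class hours spent on each; the topic names, hours, and item counts are purely hypothetical.

```python
# Hypothetical sketch: allocating items for a Table of Specifications (TOS)
# in proportion to instructional time. Topic names and hours are made up.

def allocate_items(topic_hours, total_items):
    """Return {topic: number_of_items} proportional to hours taught."""
    total_hours = sum(topic_hours.values())
    raw = {t: total_items * h / total_hours for t, h in topic_hours.items()}
    items = {t: int(r) for t, r in raw.items()}          # round down first
    # hand out any leftover items to the topics with the largest remainders
    leftover = total_items - sum(items.values())
    for t in sorted(raw, key=lambda t: raw[t] - items[t], reverse=True)[:leftover]:
        items[t] += 1
    return items

if __name__ == "__main__":
    hours = {"Measurement concepts": 4, "Item writing": 6, "Item analysis": 3, "Grading": 2}
    print(allocate_items(hours, total_items=50))
```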
TEST ITEMS

MULTIPLE-CHOICE ITEMS
most widely used form of test item; composed of:
stem – the problem or question
alternative responses – the options; one is the correct answer, the other alternatives are distractors or foils.

ALTERNATE-RESPONSE ITEMS
only two possible answers to the stem; True-False, Yes-No, Right-Wrong, and Agree-Disagree items.

MATCHING ITEMS
pairs of matching phrases, words, or other related facts from separate lists; an arrangement of a set of multiple-choice items with all stems, called premises, having the same set of possible alternative answers.

SHORT EXPLANATION ITEMS
similar to an essay test but requires a short response, usually a sentence or two.

GENERAL TYPES OF ESSAY ITEMS
1. Extended Response Essay Item – allows in-depth sampling of students' knowledge, thinking processes, and problem-solving behavior relative to a specific fact.
2. Restricted Response Essay Item – the student is required to provide a limited response based on a specified criterion for answering the question.
2 WAYS OF SCORING ESSAY TESTS
1. Holistic Scoring
graded based on the teacher's general impression or over-all assessment; classified into: outstanding; very satisfactory; fair; and poor.
2. Analytic Scoring
scored in terms of each component; separate points for organization of ideas, grammar and spelling, and supporting arguments or proofs.

ADMINISTERING THE TEST
Test administration is concerned w/ the physical and psychological setting in which students take the test.
1. Provide a quiet and comfortable setting.
2. Anticipate questions that students may ask.
3. Set a proper atmosphere for testing.
4. Discourage cheating.

3 MAIN ELEMENTS IN AN ITEM ANALYSIS
1. examination of the difficulty level of the items,
2. determination of the discriminating power of each item,
3. examination of the effectiveness of distractors in multiple choice or matching items.

index of difficulty – difficulty level of an item; percentage of students answering each item in the test correctly.
index of discrimination – percentage of high-scoring individuals responding correctly versus the number of low-scoring individuals responding correctly to an item. The numerical index indicates how effectively an item differentiates between the students who did well and those who did poorly on the test.

ITEM ANALYSIS
Involves counting the number of students in the high- and low-scoring groups responding correctly; can be done by asking students to raise their hands on items they correctly answered.

The difficulty index represents the percentage of the total number of students answering an item correctly. The higher the value of the difficulty index, the easier the item. For an item to be considered a good item, its difficulty index should be 50%. An item with a 50% difficulty index is neither easy nor difficult.

Range – Difficulty Level – Decision
20 & below – Very Difficult – Rejected
21–40 – Difficult – Revise
41–60 – Average – Retain
61–80 – Easy – Revise
81 & above – Very Easy – Rejected

Using Information About Index of Discrimination
An item has a positive discriminating power when more students from the upper group got the right answer than those from the lower group. When more students from the lower group got the correct answer on an item than those from the upper group, the item has a negative discriminating power. Zero discriminating power happens when an equal number from the upper and lower groups got the right answer to an item. Items w/ negative and zero discriminating powers are discarded.
Average difficulty + positive discriminating power = Retained
Easy / Difficult = Revise/modify
Very Easy / Very Difficult = Rejected
Negative or zero discriminating power = Rejected

Examining Distractor Effectiveness
Ideal item – all students in the upper group answer correctly and all students in the lower group answer wrongly; responses of the lower group have to be evenly distributed among the incorrect alternatives.
Good distractors are chosen more frequently by the lower group. When a particular distractor is selected more frequently by those from the upper group, the teacher has to revise it; good distractors have lower mean scores than those related to the correct response. Distractors chosen by no one in either of the two groups of students should be revised to make them more useful.

PURPOSES OF GRADING TEST SCORES
1. Informational – students' subject matter achievement
2. Administrative – decisions regarding class standing
3. Motivational – encourage students to exert academic effort
4. Guidance – identify who needs special services, tutoring, and remedial instruction

METHODS OF DETERMINING GRADES
1. Relative Grading (norm-referenced) – graded based on the mean or average of the test scores; often used in colleges and universities; letters A–F, or numbers 1–5. The major assumption for the use of this method is that the majority of students are typically of average ability; conversely, only a few are above average and below average.
2. Absolute Grading – also called criterion-referenced grading; scores are converted to a percentage rating based on minimum requirements or a predetermined standard, based on the teacher's judgment. A transmutation table is prepared for converting scores into marks or grades. This type of grading is premised on the assumption that a student's performance is independent of the performance of the other students in a class.
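A minimal Python sketch of the upper-/lower-group item analysis described above; the response data are hypothetical, and the cutoffs simply follow the difficulty ranges and retain/revise/reject rules listed in this section.

```python
# Hypothetical sketch of upper/lower-group item analysis as described above.
# upper/lower are lists of 1 (correct) / 0 (wrong) for one item.

def difficulty_index(upper, lower):
    """Percentage of students (both groups) answering the item correctly."""
    correct = sum(upper) + sum(lower)
    return 100.0 * correct / (len(upper) + len(lower))

def discrimination_index(upper, lower):
    """Proportion correct in the upper group minus proportion correct in the lower group."""
    return sum(upper) / len(upper) - sum(lower) / len(lower)

def decision(diff, disc):
    """Apply the retain/revise/reject rules from the section above."""
    if disc <= 0:
        return "Reject (zero or negative discriminating power)"
    if diff <= 20 or diff > 80:
        return "Reject (very difficult or very easy)"
    if 41 <= diff <= 60:
        return "Retain (average difficulty, positive discrimination)"
    return "Revise (easy or difficult)"

if __name__ == "__main__":
    upper = [1, 1, 1, 0, 1, 1, 1, 0, 1, 1]   # top-scoring group
    lower = [0, 1, 0, 0, 1, 0, 1, 0, 0, 1]   # bottom-scoring group
    d = difficulty_index(upper, lower)
    p = discrimination_index(upper, lower)
    print(f"difficulty = {d:.0f}%, discrimination = {p:.2f}, decision: {decision(d, p)}")
```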
FORM OF GRADES
1. Letter grade – A, B, C, D, F
2. Qualitative rating – excellent, very good, good, fair, and needs improvement.
3. Numerical grade – 1, 2, 3, 4, 5; commonly used in colleges and universities.
4. Percentage rating – grades of 70%, 75%, 85%; based on criteria for grading set by the teacher.

This coefficient can be identified by either the letter r or the Greek letter rho, or other symbols, depending on the manner in which the coefficient has been computed. It can range from -1.00 to +1.00, with values toward zero indicating a weaker relationship. The (+/-) sign indicates the direction of the relationship, and the numerical value indicates its strength:
Correlation Coefficient – Degree of Relationship
.00 – .20 – Negligible
.21 – .40 – Low
.41 – .60 – Moderate
.61 – .80 – Substantial
.81 – 1.00 – High to Very High
Positive correlation means that high scores in one variable (X) are associated with high scores in another variable.
Negative correlation means that high scores in one variable are associated with low scores in another variable, or vice versa.

MEASURE OF RELATIONSHIP
PEARSON'S PRODUCT-MOMENT CORRELATION
a measure of relationship used when the factors to be correlated are both metric data; metric data – measurements w/c can be subjected to the four fundamental operations.
SPEARMAN RHO
used when test scores are ordinal or rank-ordered.

ORGANIZING TEST SCORES
1. Ordering – numerical arrangement of numerical observations or measurements.
a. ascending order – lowest to highest.
b. descending order – highest to lowest.

Organizing Test Scores By Ranking
2. Ranking is a process of determining the relative position of scores, measures, or values based on magnitude, worth, quality, or importance.
3. Stem-and-leaf plot is a method of graphically sorting and arranging data to reveal its distribution.
4. Frequency Distribution – a table showing the number of times a score occurs.

CONVERTING SCORES TO PERCENTAGE RATING
There are three formulas that can be used in converting raw scores into a percentage rating.

Formula 1: R = 50 (TS/TI) + 50
Where: R = rating in percentage; TS = total score; TI = total number of items
The student has to answer correctly 50% of the total number of test items; the highest possible rating that can be obtained is 100%.

Formula 2: R = 45 (TS/TI) + 55
used when the teacher sets the lowest rating at 55% instead of 50%; the student has to answer correctly at least 45% of the test items in order to get a passing grade.

Formula 3: R = 40 (TS/TI) + 60
the student has to answer correctly 40% of the test items.

Factors to consider in deciding on the lowest base grade in converting scores to a percentage rating:
• Difficulty of the test;
• The standard of the teacher; and
• The standard of the school

ROUGH METHOD OF GRADING TEST SCORES
an alternative method of grading based on test scores, used when many students' scores are not within the desired percentage of correct answers (50%, 45%, or 40%). Get the highest and lowest scores; determine the grade you will give the highest score and the levels of ratings to be used.
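A minimal Python sketch of the three transmutation formulas above; the lowest-base values 50, 55, and 60 correspond to Formulas 1, 2, and 3, and the sample scores are hypothetical.

```python
# Hypothetical sketch of the three score-to-percentage transmutation formulas above.
# TS = total score obtained, TI = total number of items.

def percentage_rating(ts, ti, lowest_base=50):
    """R = (100 - lowest_base) * (TS / TI) + lowest_base."""
    return (100 - lowest_base) * (ts / ti) + lowest_base

if __name__ == "__main__":
    ts, ti = 30, 60
    for base in (50, 55, 60):          # Formula 1, Formula 2, Formula 3
        print(f"lowest base {base}: R = {percentage_rating(ts, ti, base):.1f}%")
```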
CORRELATING TEST SCORES
single value frequency distribution - number of scores (N) is 30
Correlation - relationship between two or more paired or less; grouped frequency distribution - N is more than 30.
factors or two or more sets of test scores.
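A minimal Python sketch (not from the source) that computes Pearson's product-moment correlation for two sets of test scores and labels it using the degree-of-relationship ranges given earlier; the score lists are hypothetical.

```python
# Hypothetical sketch: Pearson's product-moment correlation between two sets
# of test scores, interpreted with the ranges given in the section above.
from math import sqrt

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

def degree_of_relationship(r):
    a = abs(r)
    if a <= 0.20: return "Negligible"
    if a <= 0.40: return "Low"
    if a <= 0.60: return "Moderate"
    if a <= 0.80: return "Substantial"
    return "High to Very High"

if __name__ == "__main__":
    math_scores    = [38, 25, 42, 30, 47, 35, 28, 44]
    science_scores = [35, 28, 40, 27, 45, 38, 30, 41]
    r = pearson_r(math_scores, science_scores)
    print(f"r = {r:.2f} ({degree_of_relationship(r)})")
```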
Setting Class Boundaries and Class Marks
Correlation coefficient - numerical measure of
correlation class boundary - integral limit of a class. These integral
limits can either be apparent or real.
a. apparent limits of a class are comprised of an upper 2. VARIANCE - square of standard deviation or simply
and lower limit; in the class 36-40, the lower limit is mean square.
36 and the upper limit is 40.
3. STANDARD DEVIATION - square root of variance or
b. real or exact limits of scores in a class extend from
s2; most reliable measure of estimating variability in test
one-half of the smallest unit of measurement below
scores, when average used is the mean.
the value of the score to one-half unit above.
4. QUARTILE DEVIATION or semi-interquartile range
Class mark is the midpoint of a class in a grouped
- specifically used when median is used as a measure of
frequency distribution. It is used when the potential score
central tendency
is to be represented by one value if other measures are to
be calculated. MEASURES OF RELATIVE STANDING
CM = (LL + UL) / 2, where LL = lower limit and UL = upper limit of the class
descriptive measures that locate the relative position of a
test score in relation to the other scores obtained by a
group of students or test takers.
DERIVED FREQUENCIES FROM GROUPED
FREQUENCY DISTRIBUTION 1. QUARTILE (Q) - point measure that divides
distribution into four equal parts.
1. Relative frequency distribution indicates what percent
of scores falls within each of the classes. Relative Q1 - The first or lower quartile separates the bottom 25%
frequency is calculated following the formula: RF = (F/N) × 100, where F is the class frequency and N is the total number of scores.
from the top 75% of the scores. Thus, 25% of scores fall below Q1.
Q2 - is the equivalent to the median,
Q3 - the third or upper quartile separates the top 25% from
2. Cumulative frequency distribution indicates the the bottom 75% of the scores. It follows that 25% of
number of scores that lie above or below a class boundary. scores fall above Q3 and 75% below it.

a. Less than cumulative frequencies - adding the 2. DECILE (D) is a point measure that divides a
successive frequencies bottom to the top of the distribution of into 10 equal parts with about 10% of
distribution. scores into each part. Conceptually, the median is
b. greater than cumulative frequencies are calculated equivalent to D5
by adding the successive frequencies from the top to the bottom of the distribution.
the locator of the nth decile has to be changed. For D6 the locator is 6N/10.
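A minimal Python sketch (scores and class width are hypothetical) that builds a grouped frequency distribution with apparent and real limits, class marks (CM = (LL + UL)/2), relative frequencies (RF = (F/N) × 100), and less-than cumulative frequencies, as described above.

```python
# Hypothetical sketch of a grouped frequency distribution built from raw scores:
# apparent limits, real limits, class marks, relative and cumulative frequencies.

def grouped_distribution(scores, lowest, width):
    n = len(scores)
    classes = []
    lower = lowest
    while lower <= max(scores):
        upper = lower + width - 1                      # apparent limits
        f = sum(lower <= s <= upper for s in scores)
        classes.append({
            "apparent": (lower, upper),
            "real": (lower - 0.5, upper + 0.5),        # real (exact) limits
            "class_mark": (lower + upper) / 2,         # CM = (LL + UL) / 2
            "f": f,
            "rf": 100 * f / n,                         # RF = (F / N) * 100
        })
        lower += width
    cf = 0
    for c in classes:                                  # "less than" cumulative frequency
        cf += c["f"]
        c["less_than_cf"] = cf
    return classes

if __name__ == "__main__":
    scores = [36, 41, 44, 47, 38, 52, 55, 49, 43, 40, 58, 46, 51, 39, 45]
    for c in grouped_distribution(scores, lowest=36, width=5):
        print(c)
```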
MEASURES OF CENTRAL TENDENCY
provide a single summary figure that best describes the 3. PERCENTILE (P) - divides a distribution of scores into
central location of an entire distribution of test scores 100 equal parts. The median of test scores corresponds to
P50 while the lower and upper quartiles are equal to P25
1. MEAN – arithmetic average. M and P75 respectively. DS which is equivalent to the
2. MEDIAN – middle value; not affected by extremes Md median is also equal to P50.

3. MODE - most common/recurring Mo The steps in computing are applicable in calculating other
percentiles. Only the locator has to be changed; ex: when
crude mode is the most frequently occurring score in an ungrouped array.
computing P78, the locator to be used is 78N/100.
True mode of both grouped and ungrouped test scores is
obtained by using: Mo = 3Md – 2M 4. PERCENTILE RANK (PR) - percentage of scores at
Where: Mo = mode Md = median M = mean and below a given score point .
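A minimal Python sketch of the measures of central tendency above, including the crude mode and the estimate Mo = 3Md - 2M; the score list is hypothetical.

```python
# Hypothetical sketch of the measures of central tendency above,
# including the estimate Mo = 3Md - 2M for the (true) mode.
from statistics import mean, median, multimode

def central_tendency(scores):
    m = mean(scores)
    md = median(scores)
    crude_mode = multimode(scores)          # most frequently occurring score(s)
    true_mode = 3 * md - 2 * m              # Mo = 3Md - 2M
    return m, md, crude_mode, true_mode

if __name__ == "__main__":
    scores = [35, 40, 42, 42, 45, 47, 50, 52, 42, 38]
    m, md, crude, mo = central_tendency(scores)
    print(f"mean = {m:.2f}, median = {md}, crude mode = {crude}, estimated mode = {mo:.2f}")
```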

MEASURES OF VARIATION To describe a student's performance in relation to how


other students performed in the same test can be done
1. RANGE - distance between highest and lowest score in through his percentile rank
a distribution; it can either be total or absolute; most unreliable measure
of variation because it is affected by extreme scores; ideal 5. Z SCORE or standard score - used in differentiating
when precision is not an important requirement. between typical and unusual scores/ values.
In determining whether an individual score is within the 3. Predictive Validity - scores on a test can predict later
realm of the typical or is unusual, we have to consider the behavior or test scores.
difference between the score (x) and the mean (M). The 4. Concurrent Validity - relationship between scores on a
difference has to be divided by the standard deviation (s). test or scale and scores on another measure of established
validity given at the same time.
This is what the z score is: z = (x − M) / s
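A minimal Python sketch combining the measures of variation above (range, variance, standard deviation) with the z score z = (x − M)/s and the ±2.00 rule for typical versus unusual scores discussed in this section; the scores are hypothetical.

```python
# Hypothetical sketch: range, variance, standard deviation, and z scores,
# flagging scores as typical (|z| <= 2) or unusual, as described above.
from statistics import mean, pstdev, pvariance

def z_scores(scores):
    m, s = mean(scores), pstdev(scores)      # population SD of the class scores
    return [(x, (x - m) / s) for x in scores]

if __name__ == "__main__":
    scores = [35, 40, 42, 45, 47, 50, 52, 38, 44, 70]
    print("range =", max(scores) - min(scores))
    print("variance =", round(pvariance(scores), 2), "SD =", round(pstdev(scores), 2))
    for x, z in z_scores(scores):
        label = "typical" if abs(z) <= 2 else "unusual"
        print(f"score {x}: z = {z:+.2f} ({label})")
```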
LEVELS OF LEARNING IN THE PSYCHOMOTOR DOMAIN
Values with standard scores between z = - 2.00 and z =
2.00 are typical. 3 LEVELS OF PSYCHOMOTOR LEARNING:

values with standard scores less than z = - 2.00 or greater 1. Imitation is the ability to carry out a basic rudiment of
than z = 2.00 are unusual. a skill when given directions and under supervision. At
this level the total act is not performed skillfully. Timing
Standard scores are also useful for comparing the scores and coordination of the act are not yet refined.
of different groups with different means and standard
deviations. 2. Manipulation is the ability to perform entire skill in
sequence. Conscious effort is no longer needed to perform
the skill, but complete accuracy has not been achieved yet.
3. Precision is the ability to perform an act accurately,
efficiently, and harmoniously. Complete coordination of
DEFINING AND ASSESSING TEST
the skill has been acquired; the skill has been internalized
Reliability – consistency; test will yield similar results for to such extent that it can be performed unconsciously.
the same students when administered at different times.
Based on the foregoing list of objectives, it can be noted
Reliability coefficient - measure reliability of a test w/c that these objectives range from simple reflex reactions to
can range from 0 to 1.00. complex actions, which communicate ideas or emotions to
others.
0.80 indicates high reliability;
0.40 to 0.79 fair reliability; MEASURING THE ACQUISITION OF MOTOR AND ORAL
less than 0.40 low reliability SKILLS

1. Observation of Student Performance


METHODS OF DETERMINING TEST RELIABILITY
a. Holistic observation - score or feedback based on
1. Test-Retest Method – repetition of the same test. pre-established prototypes of how an outstanding,
2. Parallel Form Method – 2 equivalent forms; modified average, or deficient performance looks; A student
3. Split-Half Method – divided into two equivalent whose presentation closely matches the ideal described
halves by the teacher would receive a perfect mark.
b. Atomistic or analytic - requires that a task analysis
IMPROVING TEST RELIABILITY be conducted in order to identify the major subtasks
1. Increasing the number of test items. involved in the student performance; criteria/checklist
2. Making the test of average difficulty.
3. Scoring a test objectively. 2. Evaluation of Student Products - projects
4. Setting time limits for answering the test.
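A minimal Python sketch of the split-half approach listed above: the two half-test scores are correlated and then stepped up with the Spearman-Brown formula, a standard companion step that the source does not name; the response matrix is hypothetical. Test-retest and parallel-form reliability can be estimated the same way by correlating the two sets of scores directly.

```python
# Hypothetical sketch of the split-half reliability method listed above.
# The half-test correlation is adjusted with the Spearman-Brown formula
# (a standard step, not named in the source).
from math import sqrt

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return sxy / sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))

def split_half_reliability(item_matrix):
    """item_matrix[s][i] = 1/0 for student s on item i; odd vs. even item halves."""
    odd = [sum(row[0::2]) for row in item_matrix]
    even = [sum(row[1::2]) for row in item_matrix]
    r_half = pearson_r(odd, even)
    return 2 * r_half / (1 + r_half)        # Spearman-Brown correction

if __name__ == "__main__":
    responses = [
        [1, 1, 1, 0, 1, 1, 0, 1],
        [1, 0, 1, 0, 1, 0, 0, 1],
        [0, 0, 1, 0, 0, 1, 0, 0],
        [1, 1, 1, 1, 1, 1, 1, 1],
        [0, 1, 0, 0, 1, 0, 0, 1],
    ]
    print(f"split-half reliability = {split_half_reliability(responses):.2f}")
```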
TOOLS FOR MEASURING ACQUISITION OF SKILLS

VALIDITY - a test measures what it is supposed to 1. Rating scale - series of categories that is arranged in
measure; most important characteristic of a test. order of quality.
A test may be reliable, but it does not mean that it 2. Checklist - indicates the presence or absence of
measures what it is supposed to measure. specified characteristics.
FOUR FUNDAMENTAL APPROACHES TO THE
VALIDITY OF TESTS
LEVELS OF LEARNING IN THE AFFECTIVE DOMAIN
1. Content Validity - test items match appropriate content 4 LEVELS OF LEARNING IN THE AFFECTIVE
2. Construct Validity - test measures an skills/abilities it DOMAIN.
is supposed to measure.
1. Receiving involves being aware of and being willing to issues, themselves, and others. It may take any of the
freely attend to a stimulus. following forms:
2. Responding involves active participation. It involves
not only freely attending to a stimulus but also voluntarily a. log book of daily routines or activities
reacting to it in some way. It requires physical, active b. diaries
behavior. c. essays and other written compositions or themes
3. Valuing refers to voluntarily giving worth to an object, d. autobiographies.
phenomenon or stimulus. Behaviors at this level reflect a Rubric - a scoring tool that lists the criteria for a piece of
belief, appreciation, or attitude.
work, or "what counts" (for example, purpose,
4. Commitment involves building an internally consistent
organization, details, voice, and mechanics are often what
value system and freely living by it. A set of criteria is
established and applied in making choices. count in a piece of writing).
It is usually a one—or two-page document that describes
different levels of quality, from excellent to poor, for a
specific assignment. It is usually used with a relatively
complex assignment, such as a long-term project, an
essay, or a research paper. Its purposes are to give students
informative feedback about their works in progress and to
give detailed evaluations of their final products.

TOOLS TO ASSESS LEARNING IN AFFECTIVE DOMAIN SOME APPROACHES TO THE ASSESSMENT OF


1. Attitude Scale - rating scale containing statements AFFECTIVE DOMAIN
designed to gauge Students' feelings on an attitude or 1. FORCED-CHOICE SELECTION METHODS
behavior.
A forced-choice item requires the respondent to select
2. Questionnaire - requires students to examine among choices that differ in content, rather than the
themselves and react to a series of statements about their degree of favorableness or intensity; format is similar to
attitudes, feelings, and opinions. multiple-choice item or it may be a description of a
situation with associated questions, or a pair of choice
a. Checklist type - list of adjectives for describing or
evaluating something and requires them to check those that statements.
apply; subtract the number of statements checked from the You have just taken a multiple-choice test in Social Studies.
number of positive statements—checked to get the score Your teacher asked you to exchange papers so that you can
b. Semantic differential - usually a five-point scale showing check each other's papers as she reads the answers aloud. Your
polar or opposite objectives designed so that attitudes, friend slips you a note which reads. "Please change a few of my
feelings, and opinions can be measured by degrees from very answers when they are wrong. I need to pass this test.
favorable to very unfavorable; composite score is determined
by averaging the values What Do You Think You Ought to Do?
c. Likert scale - oftentimes a five-point scale that links the A. Help your friend for him to pass the test.
options “strongly agree" and "strongly disagree."; scoring is B. Check his paper in the same way you would check the paper
similar to the scoring of an attitude scale. of any of your classmates

3. Simple projective techniques - used to probe deeper 2. THE METHOD OF SUMMATED RATINGS OR LIKERT
into the students' feelings and attitudes. SCALES

students respond to statements by choosing the options


a. word association - asked to mention what comes to his/her that most closely represent their feelings about the
mind upon hearing the given word. statements.
b. unfinished sentence - partial sentences are presented and
asked to complete them with words that best express their usual response categories are: strongly agree; agree;
feeling. uncertain or undecided; disagree; and strongly disagree.
c. unfinished story - a story w/ no ending is presented w/c
have to finish or complete. Numerical weights are assigned for each of these response
categories as follows: 5 for strongly agree; 4 for agree; 3
4. self-expression techniques - students are provided the for undecided or uncertain; 2 for disagree; and 1 for
strongly disagree.
opportunity to express their emotions and views about
The composite score is divided; the higher the numerical conduct, a record or an episode in the life of the student, a
score the more positive the attitude. word picture of the student in action; the teacher's best
effort at taking a snapshot at the moment of the incident;
any narrative of events in which the student takes part as
3. THE SEMANTIC DIFFERENTIAL TECHNIQUE
to reveal something that may be significant about his
This technique is not a test procedure but a general personality.
method of obtaining ratings of concepts on a series of The main thrust of the anecdotal record is to document
bipolar adjective scales. social and emotional aspects of a learner’s growth and
adjustment. Records can also be made of other dimensions
It is designed so that attitudes, feelings, and opinions can of relevant or classroom standing of each student. As it is
be measured by degrees, from very favorable to highly collected over a period of time, the anecdotal record can
unfavorable. Shown below is an example of a semantic provide a longitudinal view of a student's growth and
differential scale. patterns of change.
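The sample semantic differential scale referred to above is not reproduced in this copy. As a substitute illustration only, here is a hypothetical Python sketch of scoring a five-point bipolar/Likert-type scale, averaging item values with negatively oriented pairs reverse-scored; the adjective pairs and responses are made up.

```python
# Hypothetical sketch of scoring a 5-point semantic-differential / Likert-type scale.
# The adjective pairs, reverse-scored flags, and responses are all made up.

def composite_score(responses, reverse_flags, points=5):
    """Average the item values; reverse-scored items are flipped (e.g., 5 -> 1)."""
    adjusted = [
        (points + 1 - r) if flip else r
        for r, flip in zip(responses, reverse_flags)
    ]
    return sum(adjusted) / len(adjusted)

if __name__ == "__main__":
    # e.g., interesting-boring, useful-useless, hard-easy (reverse), pleasant-unpleasant
    responses = [4, 5, 2, 4]
    reverse   = [False, False, True, False]
    print(f"composite attitude score = {composite_score(responses, reverse):.2f}")
```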
4. FREE RESPONSE AND OPINIONNAIRE METHODS  to increase teacher's insights into new students'
strengths and weaknesses; and to provide counselors
The opinionnaire is used frequently in the polling method
of gathering opinion and attitude data. The use of a well- relevant data concerning students' behaviors
constructed opinionnaire can systematize the data-
gathering process and help ensure that relevant questions
are asked and all-important aspects of a problem surveyed.

Opinionnaires are of two types: the close or pre-categorized type and the open or free-response type. The former is very closely related to the forced-choice methods earlier mentioned in this chapter. Even rating scales are frequently associated with the structured opinionnaire. It is recommended that the open-ended form of opinionnaire be adopted for classroom use.

5. SIMPLE PROJECTIVE TECHNIQUES
used when there is a need to explore students' feelings and attitudes in order to get their covert reactions to a phenomenon.
a. word association
b. unfinished sentence
c. unfinished story

Major Types of Simulation Tests
Situation Tests – The examinee role-plays in lifelike settings, which may be social or involve apparatus.
In-Basket Tests – The examinee is presented data, like letters, records, and memoranda, and asked to simulate decision-making or administrative behavior. Simulates job performance.
Work-Sample Tests – A standardized job-relevant task is presented, and performance is observed. The task is usually a duplicate of actual criterion performance, for example, operating a key punch or data encoding.
Problem-Solving Games – Such games are frequently used in business, industry, and the military to assess problem-solving skills. Competition with a standard is usually involved.
6. SELF-EXPRESSION TECHNIQUES Assessing Products


These techniques provide the students opportunities to While it is true that many products have physical
express their own ideas or feelings about themselves and dimensions that may be measured, like size, weight,
others. An analysis of the log book of activities, diaries, number of errors and color, a number of qualitative
autobiographies, and compositions written in class can dimensions also needs to be assessed. Such dimensions
reveal many things about an individual's attitudes and may include flavor of the cake, the composition of
beliefs. Teachers, however, should remember that they painting or the neatness of handwriting. Thus, judging the
have to keep confidential information gathered through aesthetic qualities of a product is more difficult than
self-expression techniques. assessing its physical properties or attributes.
Assessing the Quality of an Artistic Product.
Developing and Using Anecdotal Records Assessment in the artistic and aesthetic areas of human
Anecdotal record is a description of what an individual activity is quite difficult. The problem posed by the wide
does or says. It describes in concrete detail the situation in variety of relevant factors is compounded by the
which the action or comment occurs, and what others subjective nature of aesthetic standards approached
present do or say. It is a record of some significant item of systematically and directly through the use of a rating
chart.
6. Where several examiners are involved, make each one
responsible for questions on a specified part of the full
Assessing Food Products. In assessing food products, examination.
there is a need to consider both the physical and aesthetic 7. Judge students on the basis of their performance
qualities w/c will be rated. Through the use of a rating precisely defined —not in terms of a generalized
scale, a teacher can gather qualitative data w/c can be used impression of their total appearance.
in confirming more traditional information derived from 8. Use both general and specific questions but do so in
test scores. some logical order.
9. Pose questions which students with the training which
Oral Exams are similar to an oral supply or completion has preceded a particular examination can reasonably be
items where the test taker completes or supplies an answer expected to know.
for a question or series of questions posed by a test giver. 10. Do not spend a disproportionate time probing for the
The oral exam is a potentially useful technique. answer to one question. If the first several questions do
not elicit the desired response, move on to some other
The value of oral exams is obvious. While written exams matter.
assume that the examinee understands the questions, the 11. Develop some facility with several basic techniques
oral examiner can see if his question is understood. for successful oral examination creating a friendly
Moreover, the examiner can probe the depth of student's atmosphere, asking questions, and recording responses.
understanding of a topic. Such probing can give some 12. Make a written record of the student's performance at
indication of the thought processes used by the student. the time it is given. Do it without disturbing the student or
Not to overlooked is the advantage of flexibility, which disrupting the flow of the examination.
simply means that a variety of behaviors can be sampled 13. Allow students ample time to think through and make
using an oral exam. The technique of oral examination responses to questions.
allows for the testing of both generalization and specific 14. Avoid arguing with the student. Let the student make
the most of it as it is his show.
fact. More so, the examiner can observe a wide range of
reactions to different stimulus questions. If the student In conclusion, there are two requisites if an examiner is
hesitates in answering and manifests signs of stress, these interested in measuring and evaluating products or
may be considered in assessing his degree of competence. performances/ behaviors. There are samples of a product
or performance and a systematic guide for evaluation and
Despite their potential advantages, several serious
recording the evaluation of the product or performance.
weaknesses of oral exams inhibit their use. One of the
The combination of observational data collection methods
most documented weakness of the oral examination is its and rating scales can provide a useful approach to
unreliability. The difficulty of maintaining comparable gathering data that cannot be gathered practically any
standards of judgment, selective perceptions, and other way.
interpretation on the part of different examiners, and the
limited sampling of the breadth of student's knowledge,
potentially contribute to both unreliability and invalidity. CHAPTER 19: AUTHENTIC ASSESSMENT OF
Another disadvantage of oral exams is the amount of time LEARNING
necessary to conduct a thorough oral exam. The Essence of Authentic Assessment
Some principles of oral examinations Authentic assessment is a form of assessment in which
1. Use oral examinations only for the purposes for which students are asked to perform real-world tasks that
they are best suited. demonstrate meaningful application of essential
2. Prepare in advance a detailed outline of materials to be knowledge and skills. It is a process of engaging on
sampled in the examination event to the extent of writing worthy problems wherein students have to use knowledge
questions which will be asked. to come up with effective and creative performances on
3. Determine in advance how records of student tasks that are either replicas of or analogous to the kinds
performance will be kept and what weights will be of problems people face in the real world. It usually
assigned to various factors. includes a task for the students to perform and a rubric by
4. Keep questioning relevant to the purposes of the course which their performance on the task will be gauged.
or program.
5. Word questions in such a way that the students can see
the point of the question with minimum difficulty. Educators call Authentic Assessment by alternative
names, notable among which are the following:
a. Performance assessment – as learners are required  The school must, therefore, assist students in
to perform meaningful and realistic tasks; real-world becoming proficient at performing tasks they are to
or authentic tasks/ contexts. encounter when they graduate.
b. Authentic assessment – due to the fact that it is an  The school must then require students to perform
alternative way of measuring and evaluating students' meaningful tasks that replicate real world challenges
learning, which is different from the traditional form to see if they are capable of doing so. When they do,
of assessment. then the school is successful.
c. Direct assessment – since it provides a more direct
evidence of meaningful application of knowledge and In contrast with traditional assessment, assessment drives
skills. Getting a perfect score in an objective or essay the curriculum in authentic assessment. Teachers have to
test is not a guarantee that the student can apply what determine first of all what students have to perform to
he had learned in real-world context. His or her direct demonstrate their mastery, and then the curriculum is
demonstration of the application of knowledge and developed that will facilitate their performance of the
skills is the best indicator of that learning. tasks well. Thus, to assess what students had learned in
math, history, science, teachers have to ask students to
Similarities and Differences Between Authentic and perform tasks that parallels the challenges faced by
Traditional Assessments mathematicians, do historical investigations or conduct
scientific inquiries or laboratory experiments.
Traditional assessment refers to the forced-choice measures of multiple-choice tests, fill-in-the-blanks, true-false, matching, and the like that have been and remain so common in schools. Students typically select an answer or recall information to complete the assessment. These tests may either be standardized or teacher-made types of assessment.

DEFINING ATTRIBUTES OF THE TWO TYPES OF ASSESSMENT
Traditional Assessment – Authentic Assessment
Selecting a Response – Performing a Task
Contrived – Real-life
Recall/Recognition – Construction/Application
Teacher-structured – Student-structured
Indirect Evidence – Direct Evidence

Selecting A Response to Performing A Task. In traditional assessment, students are given several options and choose the right response; in authentic assessment, students are asked to demonstrate understanding through meaningful application of what they learned.

Both traditional and authentic assessments share the belief that the primary mission of the school is that of assisting society in developing productive citizens. Nevertheless, traditional assessment is anchored on the educational philosophy that adopts the following reasoning and practice:

 A school's mission is to develop productive citizens. Contrived to Real-life. In traditional assessment, tests
 An individual needs to acquire a certain body of offer contrived means Of demonstrating proficiency in a
short period of time, which runs counter to life outside the
knowledge and skills for him to become a productive
school: In authentic assessment, students are asked to
citizen. demonstrate proficiency by actually doing something.
 Schools are therefore, obliged to teach this body of
knowledge and skills. Recall/Recognition--of--Knowledge--to
 The school must then test students to see if they Construction/Application of Knowledge. In traditional
acquired these knowledge and skills. If they do, then assessment the ability to recall or recognize information is
the ultimate indicator of a student's acquisition of a body
the school is successful in its mission.
of knowledge, which is a poor measure of his/her
It is very clear that the curriculum drives assessment in proficiency. Nonetheless, the ability to construct a product
the traditional type of assessment. Knowledge and skills or performance out of facts, ideas and propositions can
are first identified, which become the curriculum for the reveal what the student really knows and can do when
students to complete. Assessment follows after the asked to do so. This is because authentic assessment
implementation of the curriculum to determine whether requires students to analyze, synthesize and apply what
the school is successful or not in accomplishing its they have learned in a substantial manner. Likewise, they
essential mission. Conversely, authentic assessment is are able to create a new meaning in what they do in the
rooted in the following reasoning and practice: process.
 A school's mission is to develop productive citizens. Teacher-structured to Student-structured. In traditional
 To be a productive citizen, the student must be able to assessment students demonstrate what has been prepared
perform meaningful tasks in the real world. by the teacher and therefore their attention will be focused
on and limited on the contents of the test. On the other
hand, authentic assessment enables more choice and ability to apply knowledge by asking students to use what
construction in determining what is presented as proof of they have learned in a very meaningful way. Thus, if a
proficiency. teacher wants to know if his students can interpret
literature, test hypothesis or converse in English and other
Indirect Evidence to Direct Evidence. Although students foreign languages or use other knowledge skills they have
can answer correctly questions posed by the teacher in the learned, then authentic assessment will definitely provide
quiz or summative examination, there is no clear proof the most direct evidence.
that they can meaningfully apply what they learned in
complex real-world situations. Thus, evidence of what 2. Authentic assessments capture the constructive
students can and will do in traditional assessment indirect. nature of learning.
On the contrary, authentic assessment provides a more A considerable body of research on learning has found
direct proof of application and construction of knowledge. that learners cannot; simply be fed knowledge. Learners
Asking a student to write a critique of the arguments his need to construct their own meaning of the world, using
classmate has presented can provide a very clear evidence information they have gathered and were taught and their
of his skill in critiquing than asking him a number of own experiences with the world. Thus, assessments cannot
multiple-choice questions or analytical questions about a just ask students to repeat back information they have
quoted passage. received. Students must also be asked that they have
Traditional and authentic approaches to assessments also accurately constructed meaning about what been taught.
vary in terms of adhering to the practice of teaching to the Furthermore, students must be given the opportunity in the
test. construction of meaning. Authentic tasks not only serve as
assessment but also as vehicles for such learning.
Teachers are discouraged from teaching the test under the
traditional model of assessment. This is because a test is 3. Assessments Integrate Teaching, Learning and
used in assessing a sample of students’ knowledge and Assessment.
understanding and assumes that students' performance on Authentic assessment, in contrast to the more traditional
the sample is typical of their knowledge of all relevant assessment, the integration of teaching, learning and
materials. If teachers focus on the sample to be tested assessing. In the assessment model, teaching and learning
during actual instruction, then good performance does not are often separated assessment, i.e., a test is administered
necessarily reflect knowledge of all the material. after knowledge or skills have acquired. In the authentic
Teachers, therefore, hide the test so that the sample is assessment model, the same authentic task used to
beforehand. Thus, they are warned from teaching the test. measure the students' ability to apply the knowledge or
skills is used as a vehicle for student learning. For
On the contrary, teachers are encouraged to teach the test
example, when presented with a real-world problem to
under the authentic approach to assessment. Students need
solve, students are learning in the process of developing a
to learn how to perform well on meaningful tasks. To aid
solution, teachers are facilitating the process, and the
students in that process, it is helpful to show them models
students' solutions to the problem become an assessment
of good and not so good performance. Furthermore, the
of how well the students can meaningfully apply the
student benefits from seeing the task rubric ahead of time
concepts.
as well. By knowing what good performance looks like,
and by knowing what specific characteristics make up
4. Authentic Assessments Provide Multiple Paths to
good performance, students can better develop the skills
Demonstration
and understanding necessary to perform well on these
Learners have different strengths and weaknesses on how
tasks.
they can best demonstrate what they have learned.
USING AUTHENTIC ASSESSMENT IN THE Regarding the traditional assessment model, answering
CLASSROOM multiple-choice questions shows strength of tests because
it makes sure everyone is being compared on the same
4 important reasons why teachers have to use authentic domains in the same manner which increases the
assessment methods in their classroom, side by side with consistency and comparability of the measure. But, testing
the traditional paper-and-pencil tests. favors those who are better test-takers and does not give
students any choice in how they believe they can best
1. Authentic Assessments are direct measures. demonstrate what they have learned.
Through authentic assessment, we can be assured that Thus, it is recommended that multiple and varied
students do not just know the content of the subjects in the assessments be used so that 1) a sufficient number of
curriculum when they graduate. They can be provided samples are obtained, and 2) a sufficient variety of
with the opportunity to use acquired knowledge and skills measures are used. Variety of measurement can be
in the real world. So the use of authentic assessment accomplished by assessing the students through different
methods will enable teachers to directly check for the measures that allow the teacher to see them apply what
they have learned in different ways and from different will indicate that they performed well on the task, i.e.,
perspectives. they have met the standards.
Authentic tasks tend to give the students more freedom in
how they will demonstrate what they have learned. By 4. For each criterion, identify two or more levels of
carefully identifying the criteria of good performance on performance along which students can perform which will
the authentic task ahead of time, the teacher can still make sufficiently discriminate among student performance for
comparable judgments of student performance even that criterion. The combination of the criteria levels of
though student performance might be expressed quite performance for each criterion will be your rubric for that
differently from student to student. task (assessment).
For example, the products students create to demonstrate 1. IDENTIFYING STANDARDS
authentic learning on the same task might take different Standards have to focus not only on the content of the
forms (e.g., posters, oral presentations, videos, web sites). discipline but also on critical thinking problem-solving
Or, even though students might be required to produce the abilities, collaborative skills, and personal development.
same authentic product, there can be room within the
There are three types of standards that a teacher has to
product for different modes of expression. For example,
consider in identifying what he/ she wants students to
writing a good persuasive essay requires a common set of
skills from students, but there is still room for variation in learn: content; process; and value.
how that essay is constructed.
1. Content Standards are statements that describe what
CREATING AUTHENTIC ASSESSMENTS students should do within the content of a specific
There are six basic questions a teacher has to answer in learning area or the intersection of two or more learning
developing an authentic assessment: areas. Ex:

Q l: What should students know and be able to do? This  Students will classify objects along dimensions.
list of knowledge and skills becomes your STANDARDS.  Describe effects of physical activity on the body
 Present employment-related information in the target
Q2: What indicates students have met if students have met language.
these standards, you design or select relevant
AUTHENTIC TASKS. 2. Process Standards are statements that describe skills
students should develop to enhance the process of
Q3: What does good performance on this task look like? learning. Process standards are not specific to a particular
To determine if students have performed well on the task, discipline, but are generic skills that are applicable to any
you will identify and look for characteristics of good subject area. Ex:
performance called CRITERIA.
 Students will set realistic goals for their performance.
Q4: How well did the students perform? To discriminate  Seriously consider the ideas of other
among student performance across criteria, you will create  Find and evaluate relevant information.
a RUBRIC
3. Value Standards are statements that describe attitudes
Q5: How well should most students perform? The teachers would like students to develop towards learning.
minimum level at which you would want most students to Ex:
perform is your CUT SCORE or BENCHMARK.
 Students will value diversity of opinions or perspectives.
Q6:What do students need to improve upon? Information  Take responsible risk.
from the rubric will give students feedback and allow you  Persist on challenging tasks.
to ADJUST INSTRUCTION.
Given the definitions listed above, the same standard
Based on the foregoing questions, there are four steps to could be either content or process standard. For example,
follow in creating authentic assessment. the standard students will write a coherent essay would be
a process standard in a history course because it is not
1. Identify standards for your students.
describing content within the discipline of history. Rather,
2. For a particular standard or set of standards, develop a it describes a useful skill that historians should have along
task your students could perform that would indicate that with those working in other disciplines. However, if the
they have met these standards. same standard were part of an English composition
3. Identify the, characteristics of good performance on that course, it can be considered as a content standard because
task, the criteria, that, if present in your students' work, Students would be learning the content of that discipline.
2. SELECTING AN AUTHENTIC TASK

An authentic task is an assignment given to students designed to assess their ability to apply standard-driven knowledge and skills to real-world challenges. Thus, a task we ask students to perform is considered authentic when:
a. students are asked to construct their own responses rather than select from ones presented, and
b. the task replicates challenges faced in the real world.

If I were teaching you how to play volleyball, I would not determine whether you had met my standards by giving you a multiple-choice test. I would put you out on the volleyball court to "construct your responses" in the face of real-world challenges. Thus, the most meaningful assessments ask students to perform authentic tasks.

Authentic assessments include tasks such as performances, products and constructed-response items that typically require more direct application of knowledge and skills. These types of tasks are described below along with common examples of each.

1. Constructed Response. In response to a prompt, students construct an answer out of old and new knowledge. Since there is no one exact answer to these prompts, students are constructing new knowledge that likely differs slightly or significantly from that constructed by other students. Typically, constructed-response prompts are narrowly conceived, delivered at or near the same time a response is expected, and are limited in length. However, the fact that students must construct new knowledge means that at least some of their thinking must be revealed. Examples include short-answer essay questions; show-your-work problems; concept maps; figural representations like the Venn diagram; typing tests; completion of a step of a science lab; construction of a short musical, dance or dramatic response; and exhibition of an athletic skill.

2. Product. In response to a prompt (assignment) or series of prompts, students construct a substantial, tangible product that reveals their understanding of certain concepts and skills and/or their ability to apply, analyze, synthesize or evaluate those concepts and skills. It is similar to a constructed-response item in that students are required to construct new knowledge and not just select a response. However, product assessments typically are more substantial in depth and length, more broadly conceived, and allow more time between the presentation of the prompt and the student response than constructed-response items. Ex: essays, stories or poems; research reports; extended journal responses; art exhibits or portfolios; lab reports; newspapers; and posters.

3. Performance. In response to a prompt (assignment) or series of prompts, students construct a performance that reveals their understanding of certain concepts and skills and/or their ability to apply, analyze, synthesize or evaluate those concepts and skills. It is similar to a constructed-response item in that students are required to construct new knowledge and not just select a response. However, performances typically are more substantial in depth and length, more broadly conceived, and allow more time between the presentation of the prompt and the student response than constructed-response items. Ex: conducting an experiment; musical, dance or dramatic performances; debates; athletic competitions; and oral presentations.

3. IDENTIFYING CRITERIA FOR THE TASK

In this step, you need to answer the question, "What does good performance on this task look like?" or "How will I know if they have done a good job on this task?" In answering those questions you will be identifying the criteria for good performance on that task. You will use those criteria to evaluate how well students completed the task and, thus, how well they have met the standard or standards.

As an example, here are 6 standards addressed to some degree by the authentic task that follows. Students will be able to:
1. Measure quantities using appropriate units, instruments, and methods;
2. Set up and solve proportions;
3. Develop scale models;
4. Estimate amounts and determine levels of accuracy needed;
5. Organize materials; and
6. Explain their thought process.

The authentic task to be used to assess these standards in a geometry class is as follows:

Rearrange the Room
You want to rearrange the furniture in some room in your house, but your parents do not think it would be a good idea. To help persuade your parents to rearrange the furniture, you are going to make a two-dimensional scale model of what the room would ultimately look like. Procedure:
1) You first need to measure the dimensions of the floor space, the location and dimensions of all doors and windows, and the amount of floor space occupied by each item of furniture. All should be explicitly listed.
2) Then find the scale dimensions of the room and all the items.
3) Next you will make a scale blueprint of the room on poster paper, labeling where all windows and doors are.
4) You will also make scale drawings of each piece of furniture on a cardboard sheet of paper, and these models need to be cut out.
5) Then you will arrange the model furniture where you want them on your blueprint, and tape them down.
6) You will finally write a brief explanation of why you believe the furniture should be arranged the way it is in your model.
7) Your models and explanations will be posted in the room and the class will vote on which setup is the best.

Finally, the criteria which the teacher identified as indicators of good performance on the Rearrange the Room task were:
 accuracy of calculations;
 accuracy of measurements on the scale model;
 labels on the scale model;
 organization of calculations;
 neatness of drawings; and
 clear explanations.

Based on the foregoing example, what does a good criterion look like? A good criterion should possess the following characteristics:
 clearly stated;
 brief;
 observable;
 statement of behavior; and
 worded in language students understand.

In addition to the foregoing, make sure each criterion is distinct. Although the criteria for a single task will understandably be related to one another, there should not be too much overlap between them. Are you really looking for different aspects of performance on the task with the different criteria, or does one criterion simply rephrase another one? For example, the following criteria might be describing the same behavior, depending on what you are looking for:
 interpret the data
 draw a conclusion from the data

Another overlap occurs when one criterion is actually a subset of another criterion. For example, the first criterion below probably subsumes the second:
 presenter keeps the audience's attention
 presenter makes eye contact with the audience

Like standards, criteria should be shared with students before they begin a task so they know the teacher's expectations and have a clearer sense of what good performance should look like. Some teachers go further and involve the students in identifying appropriate criteria for a task. The teacher might ask the students "What characteristics does a good paper have?" or "What should I see in a good scale model?" or "How will I (or anyone) know you have done a good job on this task?"

4. CREATING THE RUBRIC

In building a rubric, you need to go back to the criteria you identified in Step 3. Do remember that you have to keep the number of criteria manageable. Moreover, you do not have to look for everything in every assessment.

Creating an Analytic Rubric. In an analytic rubric, performance is judged separately for each criterion. Teachers assess how well students meet a criterion on a task, distinguishing between work that effectively meets the criterion and work that does not meet it. The next step in creating a rubric, then, is deciding how fine such a distinction should be made for each criterion. For example, if you are judging the amount of eye contact a presenter made with the audience, that judgment could be as simple as whether he did or did not make eye contact (two levels of performance); never, sometimes or always made eye contact (three levels); or never, rarely, sometimes, usually, or always made eye contact (five levels).

Generally, it is better to start small with fewer levels because it is usually harder to make finer distinctions. For eye contact, you might begin with three levels such as never, sometimes and usually. Then if, in applying the rubric, you find that some students seem to fall in between never and sometimes, and never or sometimes does not adequately describe their performance, add a fourth (e.g., rarely) and, possibly, a fifth level to the rubric.

In the following portion of an elementary rubric, the criteria are:
 Observations are thorough
 Predictions are reasonable
 Conclusions are based on observations

Criterion: made good observations
  Limited: Observations are absent or vague
  Acceptable: Most observations are clear and detailed
  Proficient: All observations are clear and detailed
Criterion: made good predictions
  Limited: Predictions are absent or irrelevant
  Acceptable: Most predictions are reasonable
  Proficient: All predictions are reasonable
Criterion: appropriate conclusion
  Limited: Conclusion is absent or inconsistent with observations
  Acceptable: Conclusion is consistent with most observations
  Proficient: Conclusion is consistent with observations
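The rubric above is essentially a small lookup table: for each criterion there is one descriptor per level. The sketch below is one possible way to represent it in code; the numeric point values (1 to 3) attached to the levels are an illustrative assumption and are not prescribed by the rubric itself.

```python
# A minimal sketch of the elementary analytic rubric above as a lookup table.
# The 1-3 point values per level are illustrative assumptions, not part of the source rubric.
LEVELS = {"Limited": 1, "Acceptable": 2, "Proficient": 3}

RUBRIC = {
    "made good observations": {
        "Limited": "Observations are absent or vague",
        "Acceptable": "Most observations are clear and detailed",
        "Proficient": "All observations are clear and detailed",
    },
    "made good predictions": {
        "Limited": "Predictions are absent or irrelevant",
        "Acceptable": "Most predictions are reasonable",
        "Proficient": "All predictions are reasonable",
    },
    "appropriate conclusion": {
        "Limited": "Conclusion is absent or inconsistent with observations",
        "Acceptable": "Conclusion is consistent with most observations",
        "Proficient": "Conclusion is consistent with observations",
    },
}

def score_analytic(ratings: dict) -> int:
    """Judge each criterion separately, print its descriptor, and total the level points."""
    total = 0
    for criterion, level in ratings.items():
        descriptor = RUBRIC[criterion][level]   # the feedback the student sees
        total += LEVELS[level]
        print(f"{criterion}: {level} - {descriptor}")
    return total

# Example: one student's work, judged criterion by criterion.
print("Total:", score_analytic({
    "made good observations": "Proficient",
    "made good predictions": "Acceptable",
    "appropriate conclusion": "Acceptable",
}))
```

The point of the sketch is simply that an analytic rubric keeps each criterion's judgment, and its feedback, separate before anything is combined.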
Rubrics are very flexible tools. Just as the number of levels of performance can vary from criterion to criterion in an analytic rubric, points or values can be assigned to the rubric in a myriad of ways. For example, a teacher who creates a rubric might decide that certain criteria are more important to the overall performance on the task than other criteria. So, one or more criteria can be weighted more heavily when scoring the performance.

For example, in a rubric for solo auditions, a teacher might consider five criteria (how well students demonstrate vocal tone, vocal technique, rhythm, diction and musicality). For this teacher, musicality might be the most important quality that she has stressed and is looking for in the audition. She might consider vocal technique to be less important than musicality but more important than the other criteria. So, she might give musicality and vocal technique more weight in her rubric. She can assign weights in different ways. Following is an example of a common format, in which each criterion is rated on a 0-5 scale and then multiplied by its assigned weight:

Criterion (rated 0 1 2 3 4 5) x Weight
Vocal tone
Vocal technique
Rhythm
Diction
Musicality
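To see how such a weighted format turns ratings into a single score, here is a small sketch. The weights and the sample ratings are invented for illustration; the document does not specify which weights the teacher actually assigns.

```python
# Sketch of scoring a weighted analytic rubric in the 0-5 format above.
# The weights and sample ratings below are hypothetical, chosen only to illustrate the arithmetic.
weights = {
    "Vocal tone": 2,
    "Vocal technique": 3,   # weighted more heavily than tone, rhythm, diction
    "Rhythm": 2,
    "Diction": 1,
    "Musicality": 4,        # the most important criterion for this teacher
}

ratings = {  # each criterion rated on the 0-5 scale
    "Vocal tone": 4,
    "Vocal technique": 3,
    "Rhythm": 5,
    "Diction": 4,
    "Musicality": 5,
}

weighted_score = sum(ratings[c] * weights[c] for c in weights)
max_score = sum(5 * w for w in weights.values())
print(f"Weighted score: {weighted_score}/{max_score}")   # 51/60 with these sample numbers
```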
In creating a rubric you are not bound by any format constraints. The rubric should capture what you value in performance on the authentic task. The more accurately your rubric captures what you want your students to know and be able to do, the more valid the scores will be.

Holistic Rubric – a judgment of how well someone has performed on a task that considers all the criteria together, or holistically. Thus, each level of performance in a holistic rubric reflects behavior across all the criteria. It is recommended that the use of holistic rubrics be limited to situations when the teacher wants to:
 Make a quick, holistic judgment that carries little weight in evaluation, or
 Evaluate performance in which the criteria cannot be easily separated.

Oral Presentation Rubric
Mastery
• always makes eye contact
• volume is always appropriate
• enthusiasm present throughout presentation
• summary is completely accurate
Proficiency
• usually makes eye contact
• volume is usually appropriate
• enthusiasm is present in most of presentation
• only one or two errors in summary
Developing
• sometimes makes eye contact
• volume is sometimes appropriate
• occasional enthusiasm in presentation
• some errors in summary
Inadequate
• never or rarely makes eye contact
• volume is inappropriate
• rarely shows enthusiasm in presentation
• many errors in summary

Quick, holistic judgments are often made for homework problems or journal assignments. To allow the judgment to be quick, and to reduce the problem illustrated in the above rubric of fitting the best category to the performance, the number of criteria should be limited. For example, here is a possible holistic rubric for grading homework problems.

Homework Problem Rubric
++ (3 pts.)
• most or all answers correct, AND
• most or all work shown
+ (1 pt.)
• at least some answers correct, AND
• at least some but not most work shown
— (0 pts.)
• few answers correct, OR
• little or no work shown

Although this homework problem rubric only has two criteria and three levels of performance, it is not easy to write such a holistic rubric to accurately capture what an evaluator values and to cover all the possible combinations of student performance. For example, what if a student got all the answers correct on a problem assignment but did not show any work? The rubric covers that: the student would receive a (—) because "little or no work was shown." What if a student showed all the work but only got some of the answers correct? That student would receive a (+) according to the rubric. All such combinations are covered. But does giving a (+) for such work reflect what the teacher values? The above rubric is designed to give equal weight to correct answers and work shown. If that is not the teacher's intent, then the rubric needs to be changed to fit the goals of the teacher.
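As a quick illustration of how the two criteria in this holistic rubric combine into one category, here is a small sketch that assigns the ++ / + / — marks from counts of correct answers and work shown. The numeric cutoffs for "most" and "some" are assumptions made for the sketch; the rubric itself leaves them to the evaluator's judgment.

```python
# Sketch: applying the Homework Problem Rubric holistically.
# The "most" / "some" cutoffs are illustrative assumptions; the rubric leaves them to judgment.
def homework_mark(correct: int, shown: int, total: int) -> tuple[str, int]:
    most = correct >= 0.8 * total and shown >= 0.8 * total   # most or all answers AND work
    some = correct >= 0.3 * total and shown >= 0.3 * total   # at least some answers AND work
    if most:
        return "++", 3
    if some:
        return "+", 1
    return "—", 0   # few answers correct OR little/no work shown

# The edge case discussed above: all answers correct but no work shown falls to (—).
print(homework_mark(correct=10, shown=0, total=10))   # ('—', 0)
print(homework_mark(correct=5, shown=10, total=10))   # ('+', 1)
```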
Checking Your Rubric. This is the last step in creating a rubric. As a final check on your rubric before applying it, consider the following guidelines:
 Let a colleague review it.
 Let your students review it: is it clear to them?
 Check if it aligns or matches up with your standards.
 Check if it is manageable.
 Consider imaginary student performance on the rubric.

CHAPTER 20: ASSESSING STUDENT PORTFOLIOS

A portfolio is a purposeful collection of student work that exhibits the student's efforts, progress and achievements in one or more areas. It is a form of alternative assessment intended to accumulate evidence to measure growth over time in a student's or teacher's performance. Each portfolio might contain a selection of exemplars of the student's work, e.g., written products, journals, post-writes, reflections, graphics, spreadsheets, e-mail messages, web sites, digital video, questionnaires, and interviews.

Portfolio assessment is the systematic, longitudinal collection of student work created in response to specific, known instructional objectives and evaluated in relation to the same criteria. It is a measurement strategy that provides for the gathering of multiple measures from different forms of assessment throughout a student's educational program.
Assessment is done by measuring the individual works as well as the portfolio as a whole against specified criteria which match the objectives toward a specific purpose. Portfolio creation is the responsibility of the learner, with teacher guidance and support, and often with the involvement of peers and parents.

A portfolio assessment system reflects the belief that the richest description of student performance is obtained through multiple measures of student performance. Evidence gathered in the process provides verification of enabling skills as well as verification of outcomes in multiple content areas and real-life settings.

Some of the key characteristics of portfolio assessment:
 A portfolio is a form of assessment that students do together with their teachers.
 A portfolio is not just a collection of student work but a selection: the student must be involved in choosing and justifying the pieces to be included.
 A portfolio provides samples of the student's work which show growth over time. By reflecting on their own learning, students begin to identify strengths and weaknesses in their work. These weaknesses become improvement goals.
 The criteria for selecting and assessing the portfolio contents must be clear to the teacher and the students at the beginning of the process.
 The entries in the portfolio can demonstrate learning and growth in all learning competencies.

Below are some strengths of portfolio assessment, seen in contrast to traditional forms of assessment.

Traditional: Measures student's ability at one time. / Portfolio: Measures student's ability over time.
Traditional: Done by teacher alone; student often unaware of criteria. / Portfolio: Done by teacher and student; student aware of criteria.
Traditional: Conducted outside instruction. / Portfolio: Embedded in instruction.
Traditional: Assigns students a grade. / Portfolio: Involves student in own assessment.
Traditional: Does not capture the range of student's language ability. / Portfolio: Captures many facets of language learning performance.
Traditional: Does not include the teacher's knowledge of student as a learner. / Portfolio: Allows for expression of teacher's knowledge of student as learner.
Traditional: Does not give student responsibility. / Portfolio: Student learns how to take responsibility.

Why Use Portfolio Assessment

Portfolios provide teachers with a tool for showing what, how, and how well students learn both intended and incidental outcomes related to their curricula and student needs and interests. Portfolios are aligned with and go beyond standards set by school systems and standardized testing programs. They provide students and teachers with creative, systematic, and visionary ways to learn, assess, and report skills, processes, and knowledge associated with the 21st century.

Using the Portfolio in the Assessment of Learning

 Portfolio assessment matches assessment to teaching. The products that are assessed are mainly products of class work, and are not detached from class activities like test items.
 Portfolio assessment has clear goals. These goals are decided at the beginning of instruction and are clear to the teacher and his/her students.
 Portfolio assessment gives a profile of the learner's abilities. It enables students to show quality work, demonstrate a wide range of skills, and show efforts at improving, developing and demonstrating progress over time.
 Portfolio assessment is a tool for assessing a variety of skills. It can include written as well as oral and graphic products of student learning.
 Portfolio assessment develops among students awareness of their own learning. Students have to reflect on their own progress and the quality of their work in relation to identified goals.
 Portfolio assessment caters to individuals in the heterogeneous class. Since it is open-ended, students can show work on their own level. Inasmuch as there is a choice, it caters to different learning styles and allows expression of different strengths.
 Portfolio assessment develops social skills. Students are also assessed on work done collaboratively, in pairs or in groups, on projects and assignments.
 Portfolio assessment develops independent and active learners. Students must select and justify portfolio choices, monitor progress and set learning goals.
 Portfolio assessment can improve motivation for learning and achievement. Empowerment of students to prove achievement has been found to be highly motivating and encouraging.
 Portfolio assessment is an efficient tool for demonstrating learning. Different kinds of products and records of progress fit conveniently into one package. Changes over time are clearly shown.
 Portfolio assessment provides opportunity for teacher-student dialogue. It enables the teacher to get to know each and every student. It also promotes joint goal-setting and negotiation of grades.

Other uses of portfolios according to Campbell and his colleagues are the following:
 To promote student self-assessment and control of their own learning;
 To support student-led parent conferences;
 To select students into special programs;
 To certify student competence;
 To grant alternative credit;
 To demonstrate to employers certain skills and abilities;
 To build student self-confidence; and
 To evaluate curriculum and instruction.

Some advantages of using portfolio assessment:
 Serves as a cross-section lens, providing a basis for future analysis and planning. By viewing the total pattern of a student, one can identify areas of strengths and weaknesses, and obstacles to success;
 Serves as a concrete vehicle for communication, providing ongoing communication or exchanges of information among those involved in assessment;
 Promotes a shift in ownership, with students taking an active role in what they want to accomplish in the more complex and important aspects of a learning area or subject matter; and
 Covers a broad scope of knowledge and information from many different people involved in the assessment of students' learning and achievement.

On the other hand, some disadvantages of using portfolio assessment include the following:
 It may be seen as less reliable or fair than more quantitative evaluations such as test scores.
 Having to develop one's individualized criteria can be difficult or unfamiliar at first.
 It can be very time-consuming for teachers to organize and evaluate the content of portfolios, especially if portfolios have to be done in addition to traditional testing and grading.
 If goals and criteria are not clear, the portfolio can be just a miscellaneous collection of artifacts that do not show patterns of growth and achievement.
 Like other forms of qualitative data, data from portfolio assessments can be difficult to analyze or aggregate to show change.

TYPES OF PORTFOLIOS

A. Best Work Portfolio. This type of portfolio highlights and shows evidence of the best work of learners. Frequently, this type of portfolio is called a display or showcase portfolio. For students, best work is often associated with pride and a sense of accomplishment and can result in a desire to share their work with others. Best work can include both product and process. It is often correlated with the amount of effort that learners have invested in their work. A major advantage of this type of portfolio is that learners can select items that reflect their highest level of learning and can explain why these items represent their best effort and achievement. Best work portfolios are used for the following purposes:

1. Student Achievement. Students may select a given number of entries that reflect their best effort or achievement (or both) in a course of study. The portfolio can be presented in a student-led parent conference or at a community open house. As students publicly share their excellent work, work they have chosen and reflected upon, the experience may enhance their self-esteem.

2. Post-Secondary Admissions. The preparation of a post-secondary portfolio targets work samples from high school that can be submitted for consideration in the process of admission to a college or university. This portfolio should show evidence of a range of knowledge, skills, and attitudes, and may highlight particular qualities relevant to specific programs. Many colleges and universities are adding portfolios to the initial admissions process, while others are using them to determine particular placements once students are admitted.

3. Employability. The audience for this portfolio is an employer. This collection of work needs to be focused on the specific knowledge, skills, and attitudes necessary for a particular job or career. The school-to-work movements in North America are influencing an increase in the use of employability portfolios. The Conference Board of Canada (1992), for example, outlines the academic, personal management, and teamwork skills that are the foundation of a high-quality Canadian workforce. An employability portfolio is an excellent vehicle for showcasing these skills.

B. Growth Portfolio. A growth portfolio demonstrates an individual's development and growth over time. Development can be focused on academic or thinking skills, content knowledge, self-knowledge, or any area that is important in your setting. A focus on growth connects directly to identified educational goals and purposes. When growth is emphasized, a portfolio will contain evidence of struggle, failure, success, and change. The growth will likely be an uneven journey of highs and lows, peaks and valleys, rather than a smooth continuum. What is significant is that learners recognize growth whenever it occurs and can discern the reasons behind that growth. The goal of a growth portfolio is for learners to see their own changes over time and, in turn, share their journey with others.
A growth portfolio can be culled to extract a best work sample. It also helps learners see how achievement is often a result of their capacity to self-evaluate, set goals, and work over time. Growth portfolios can be used for the following purposes:

1. Knowledge. This portfolio shows students' growth in knowledge in a particular content area or across several content areas over time. This kind of portfolio can contain samples of both satisfactory and unsatisfactory work, along with reflections to guide further learning.

2. Skills and Attitudes. This portfolio shows students' growth in skills and attitudes in areas such as academic disciplines, social skills, thinking skills, and work habits. In this type of portfolio, challenges, difficult experiences, and other growth events can be included to demonstrate students' developing skills. In a thinking skills portfolio, for example, students might include evidence showing growth in their ability to recall, comprehend, apply, analyze, synthesize, and evaluate information.

3. Teamwork. This portfolio demonstrates growth in social skills in a variety of cooperative experiences. Peer responses and evaluations are vital elements in this portfolio model, along with self-evaluations. Evidence of changing attitudes resulting from team experiences can also be included, especially as expressed in self-reflections and peer evaluations.

4. Career. This portfolio helps students identify personal strengths related to potential career choices. The collection can be developed over several years, perhaps beginning in grade school and continuing throughout high school. The process of selecting pieces over time empowers young people to make appropriate educational choices leading toward meaningful careers. Career portfolios may contain items from outside the school setting that substantiate students' choices and create a holistic view of the students as learners and people. This type of portfolio may be modified for employment purposes.

ESSENTIAL ELEMENTS OF A PORTFOLIO

1. Cover Letter. This element tells about the author of the portfolio and what the portfolio shows about the author's progress as a learner. It summarizes the evidence of a student's learning and progress.

2. Table of Contents. Shown in this element are the detailed contents of the portfolio.

3. Entries. Entries in a student's portfolio can either be core or optional. Core entries are items students have to include, while optional entries are items of the students' choice. The core elements provide a common base from which to make decisions on assessment. The optional items permit each student to represent his or her uniqueness.

4. Dates. Specific dates have to be included for all entries to facilitate evidence of growth over time.

5. Drafts. Drafts of oral, aural, and written products and their revised versions have to be included in the portfolio.

6. Reflections. Reflections can appear at the different stages of the learning process. Through reflection, students can express their feelings regarding their progress and/or themselves as learners. Questions that students have to consider in making reflections for each item in the portfolio follow:
 What did I learn from it?
 What did I do well?
 Why did I choose this item?
 What do I want to improve in the item?
 How do I feel about my performance?
 What were the problem areas or difficulties I encountered?

Portfolios are collections of students' work over time. A portfolio often documents a student's best work and may include other types of process information, such as drafts of the student's work, the student's self-assessment of the work, and the parents' assessment. Portfolios may be used for evaluation of a student's abilities and improvement. In recent years, portfolios of students' performance and products have gained impressive degrees of support from educators, who view them as a way to collect authentic evidence of student learning. For many educators, portfolios are an attractive alternative to more traditional assessment approaches.

3 basic models of what portfolios should contain:
1. Showcase model, consisting of work samples chosen by the student.
2. Descriptive model, consisting of representative work of the student, with no attempt at evaluation.
3. Evaluative model, consisting of representative products that have been evaluated by criteria.

Assumptions about portfolio assessment:
 Portfolios are systematic, purposeful, and meaningful collections of students' works in one or more subject areas.
 Students of any age or grade level can learn not only to select pieces to be placed into their portfolios but can also learn to establish criteria for their selections.
 Portfolio collections may include input by teachers, parents, peers, and school administrators.
 In all cases, portfolios should reflect the actual day-to-day learning activities of students.
 Portfolios should be ongoing so that they show the students' efforts, progress, and achievements over a period of time.
 Portfolios may contain several compartments, or subfolders.
 Selected works in portfolios may be in a variety of media and may be multidimensional.

STAGES IN IMPLEMENTING PORTFOLIO ASSESSMENT

Stage 1: Identifying Teaching Goals to Assess Through the Portfolio.
The very first and most important part of organizing portfolio assessment is to decide on the teaching goals. These goals will guide the selection and assessment of students' work for the portfolio. To do this, you need to ask yourself, "What do I want my students to learn?" and choose several goals to focus on. This stage is very important, as teachers have to know what their goals are in terms of what students will be able to do. Moreover, students have to know what they need to show as evidence in their portfolio. Identifying these goals can also be done by involving students in the process.

Stage 2: Introducing the Idea of Portfolios to Your Class.
As a teacher, you will need to present the idea of a portfolio to your class. Explain to your class patiently what a portfolio is and the reasons for having one. You need to make your students realize that a portfolio is simply a selection of their work that can showcase progress in the different areas or skills in a learning area. To further make the students understand what a portfolio is, direct their attention to the main aspect of portfolios, which is their use as an assessment tool. Explore with the students their feelings about tests, and whether they feel that tests truly represent what they know and can do. Make them feel that you are going to assess their learning in a fairer way, one which will show the many different skills, knowledge, and ideas they have acquired. Moreover, inform the students how much weight the portfolio will have in their final grade and what it is going to replace, like one or more of their quizzes and/or projects.

Stage 3: Specifying Portfolio Content.
Specify what and how much has to be included in the portfolio, both core and optional. Let the students know that there are many forms of portfolio entries they can include, like written, audio- and video-recorded items, artifacts, dialogues, and journals. Specify for each entry how it will be assessed. There is a need for students to be aware of the scoring guides/rating scales that will be used before performing the task.

Stage 4: Giving Clear and Detailed Guidelines for Portfolio Presentation.
Explain the need for clear and attractive presentation, dated drafts, and attached reflections or comment cards. Explain how the portfolio will be graded and when it needs to be ready. As unfamiliar ways of teaching and assessment are potentially threatening and confusing to students, it is important to present the portfolio guidelines clearly and to review these guidelines periodically.

Stage 5: Notifying Other Interested Parties.
Make sure that the school principal is aware of your new assessment procedures. Do inform parents as well about the portfolio assessment and involve them in commenting on the work of their children.

Stage 6: Preparing the Portfolio.
Support and encouragement are necessary for both the teacher and students at this stage. The students can get it from a patient and understanding teacher. Teachers will get it by doing portfolio assessment as teamwork or by joining or initiating a support group to discuss questions with fellow teachers as they arise.
Devote class time to student-teacher conferences, to practicing reflection and self-assessment, and to portfolio preparation, as these skills may be new for most students. Remember that reflection and self-assessment do not come naturally to people who have had little practice in them. Give guiding feedback. The finished portfolio may be due only at the end of the second quarter, but it is a good idea to set regular dates at which time several portfolio-ready items will be handed in, so that students know whether they are on the right track.
To ensure that the portfolio represents the student's own work, some items can be done completely in class. You might also decide to have a test with a corrected version included as a core item, together with a reflection on what the student learned from doing the test and revising it. Moreover, you may ask the students to explain in their reflections who helped them improve their work and what they learned from revising their work.

Stage 7: Assessing the Portfolios and Giving Feedback.
Each portfolio entry needs to be assessed with reference to its specific goal. Since the goals and weighting of the various portfolio components have been clearly specified in advance, assessing the portfolios is not difficult. Self- and peer assessment can be used too as a tool for formative evaluation, with the students having to justify their grade with reference to the goals and to specific pages in the portfolio. This makes the teacher's job much simpler, considering that the student has done the groundwork of proving how far each goal is met in the portfolio.
In the process, students are able to internalize the criteria for quality work.
After all the effort students have invested in their portfolios, the teacher needs to provide feedback that is more than just a grade. One way is by writing a letter about the portfolio, identifying its detailed strengths and weaknesses and, at the same time, generating a profile of the student's ability, which is then added to the portfolio. Still another option is preparing certificates commenting on the strengths and weaknesses of the portfolio and suggesting future goals.

Stage 8: Holding Student-Teacher Conferences.
An important element of the portfolio philosophy of shared and active assessment is that the teacher should have short individual meetings with each student, in which progress is discussed and goals are set for future meetings. Students and teachers have to document these meetings and keep the goals in mind when selecting topics for future meetings. In this way, student-teacher conferences play an important role in the formative evaluation of a student's progress. They can also be used for summative evaluation purposes when the student presents his final portfolio product and, together with the teacher, decides on a final grade. This is the student's chance to negotiate the portfolio grade using evidence of achievement according to agreed goals. Even notes from these conferences can be included in the portfolio, as they contain joint decisions about the student's strengths and weaknesses.

Stage 9: Follow-Up.
After portfolios are completed, it is a good idea to have an exhibition of portfolios and a student-led parent-teacher conference, in which students present their portfolios to their parents.

CHARACTERISTICS ESSENTIAL TO THE DEVELOPMENT OF ANY TYPE OF PORTFOLIO

1. Multi-sourced (allowing for the opportunity to evaluate a variety of specific evidence). Multiple data sources include statements and observations of participants as well as artifacts ranging from test scores to photos, drawings, journals, and audio or video tapes of performances.

2. Authentic (context and evidence are directly linked). The items selected or produced for evidence should be related to the identified goal or benchmark. For example, if a child's musical performance skills were gained through piano lessons, an audio tape would be relevant.

3. Dynamic (capturing growth and change). An important feature of portfolio assessment is that data or evidence is added at many points in time, not just as before-and-after measures. Rather than including only the best work, the portfolio should include examples of different stages of mastery. This allows a richer understanding of the process of change.

4. Explicit (purpose and goals are clearly defined). The students should know in advance what is expected of them, so that they can take responsibility for developing their evidence.

5. Integrated (evidence should establish a correspondence between program activities and life experiences). Students should be asked to demonstrate how they can apply their skills or knowledge to real-life situations.

6. Based on Ownership (the student helps determine evidence to include and goals to be met). The portfolio assessment process should require that the participants engage in some reflection and self-evaluation as they select the evidence to include and set or modify their goals. They are not simply being evaluated or graded by others.

7. Multi-purposed (allowing assessment of the curriculum while assessing the performance of the students). A well-designed portfolio assessment process evaluates the effectiveness of the curriculum at the same time that it evaluates the growth of the students. It also serves as a communication tool when shared with members of the school community. Moreover, it can be passed on to other teachers as the student moves from one grade level to another.

CHAPTER 21: GRADING AND REPORTING PERFORMANCE

Grading students' performance during a particular marking term is an evaluation function of every teacher. Aside from grading students' performance, teachers have to inform the students and their parents of the students' academic progress in the various learning areas in the curriculum. This chapter centers on two vital functions of every teacher: grading and reporting performance.

Purposes for Assigning Grades

Grades are symbols used by teachers in conveying a student's performance in a learning area. Grades are given to student performance for the following purposes:
 To inform students of their educational progress;
 To inform parents of their children's achievement and performance;
 To guide further coursework;
 To provide a basis for grouping students for instructional purposes;
 To provide a basis for graduating students from different levels within the educational system;
 To provide a basis for college admission;
 To determine honor students;
 To serve as a basis for granting of awards and scholarships;
 To provide a criterion for students' participation in extra-curricular activities;
 To motivate students to study hard;
 To identify students' strengths and weaknesses; and
 To provide a basis for employment.

Strategies in Grading

There are no hard-and-fast rules about the best ways to grade. How a teacher grades a student depends on his values, assumptions and philosophy as a teacher. There is, however, a consensus among teachers that grades provide information on how well students are learning.

 Grade on the basis of the student's mastery of knowledge and skills. Restrict your evaluations to academic performance. Eliminate other considerations as bases of grades. Non-academic factors counted in grading obscure the primary meaning of the grade as an indicator of what students have learned.
 Avoid grading systems that put students in competition with their classmates and limit the number of high grades. Normative grading produces undesirable consequences for many students, such as reduced motivation to learn, debilitating evaluation anxiety, decreased ability to use feedback to improve learning, and poor social relationships.
 Try not to overemphasize grades. Explain to your students the meaning of and basis for grades and the procedures you use in grading. Once policies have been explained, avoid stressing grades or excessive talk about grades, which only increases students' anxieties and decreases their motivation to do something for its own sake rather than to obtain an external reward such as a grade.
 Keep students informed of their progress. For each paper, assignment, summative test or project that you grade, give the students a sense of what their score means. Such information can motivate students to improve what they are doing poorly or to maintain their performance if they are doing well.

Determining Grades for a Marking Term

Deciding whether to pass or fail a student involves professional decisions. In college, there are usually three marking terms: prelim, midterm, and finals. In elementary and high school, there are four: first grading, second grading, third grading, and fourth grading. There are, however, guidelines that teachers should consider in arriving at a student's grade for a specific marking term.
 Explain to the students your marking and grading policies. These policies have to be understood by the students' parents or guardians. Be objective in giving marks to students' accomplishments/performance.
 Build your grading policy around the concept of accomplishment rather than failure.
 Consider the policies of the school in giving the highest and lowest grades.
 Use a variety of sources for determining students' grades for a marking term.
 Decide beforehand on the policy you will implement for makeup work in case of absence or sickness.

On the basis of the foregoing guidelines, it is very clear that a teacher needs to come up with criteria for grading to ensure objectivity and consistency in giving a term mark for each student. Regardless of the subject area a teacher is handling, the most commonly used criteria for grading are the following:
 Class Standing (quizzes, exercises, etc.);
 Participation/Involvement;
 Summative Tests;
 Projects; and
 Class Attendance & Attitude.

Having identified the criteria for grading, the teacher needs to assign a weight to each criterion. For instance, a Social Studies teacher can assign a weight to each of the criteria above; using the criteria and the weight for each criterion, the grade for a marking term can then be computed. For example, what grade should Ulysses receive in Social Studies, given the grades he obtained under each criterion for the first grading period? Each grade per criterion is multiplied by its corresponding weight. After summing up the cross products of each grade and its corresponding weight, the grand total is divided by 100.
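A small sketch of that computation follows. The criterion weights and the per-criterion grades shown here are hypothetical stand-ins, since the document's own table of weights and of Ulysses' grades is not reproduced above; only the procedure (multiply each grade by its weight, add the cross products, and divide by 100) comes from the text.

```python
# Sketch: computing a term grade from weighted criteria, as described above.
# The weights and per-criterion grades below are hypothetical; the procedure is the one in the text.
weights = {                 # assumed to total 100
    "Class Standing": 30,
    "Participation/Involvement": 10,
    "Summative Tests": 30,
    "Projects": 20,
    "Class Attendance & Attitude": 10,
}

grades = {                  # hypothetical grades per criterion
    "Class Standing": 88,
    "Participation/Involvement": 90,
    "Summative Tests": 85,
    "Projects": 92,
    "Class Attendance & Attitude": 95,
}

cross_products = {c: grades[c] * weights[c] for c in weights}
term_grade = sum(cross_products.values()) / 100   # weights sum to 100
print(term_grade)   # 88.8 with these sample numbers
```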
COMPUTATION OF THE FINAL GRADE

1. Averaging system. The grades obtained by a student in each of the marking terms are added, and the sum is divided by the number of marking terms. If there are four grading or marking periods, the final grade is arrived at by applying the formula:
Final Grade = (1st Grading + 2nd Grading + 3rd Grading + 4th Grading) / 4

2. Cumulative system. The final grade is obtained by getting 30% of the grade obtained during the previous marking period and 70% of the last grading period. Mathematically, the final grade can be obtained with the formula:
Final Grade = (30% x GP) + (70% x LG), where GP is the grade for the previous grading period and LG is the grade for the last grading period.
This formula is the one used in the public elementary and secondary schools in computing the final grade in all learning areas.

Illustrative example: What final grade should Lito receive in Statistics if his GP = 94 and LG = 97? Applying the formula cited above:
Final Grade = (30% x 94) + (70% x 97) = 28.2 + 67.9 = 96.1

An alternative, algebraically equivalent way to express the cumulative formula is:
Final Grade = LG - 30% x (LG - GP)
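For completeness, here is a tiny sketch of both final-grade systems. The GP = 94 and LG = 97 figures come from the illustrative example in the text; the four term grades fed to the averaging function are invented for illustration.

```python
# Sketch of the two final-grade schemes described above.
def averaging(term_grades):
    """Averaging system: sum of the term grades divided by the number of terms."""
    return sum(term_grades) / len(term_grades)

def cumulative(gp, lg):
    """Cumulative system: 30% of the previous period's grade plus 70% of the last period's."""
    return 0.30 * gp + 0.70 * lg

print(round(averaging([88, 90, 92, 94]), 2))   # 91.0 (sample term grades, invented for illustration)
print(round(cumulative(gp=94, lg=97), 1))      # 96.1, matching Lito's example in the text
```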
REPORTING STUDENT PERFORMANCE

One of the major responsibilities of a teacher is to periodically report student progress to parents or guardians. There are three ways by which teachers can report student progress and academic performance in school. These ways are discussed below.
1. The Report Card is given to the student's parent or
guardian after each grading period. The attendance record
of the student is reflected in the report card, together with
the teacher's comments on the student's social behavior or
classroom conduct. In most elementary and secondary
schools, parents are given the opportunity to discuss the
academic status of their sons and daughters with their
teacher-in-charge/adviser and subject teachers when
report cards are distributed during parent-teacher
conferences.
2. Progress Reports. Parents usually receive two kinds of notices from the school about their children. In general, teachers are expected to send home deficiency notices for students who are failing, misbehaving in class, or frequently not attending classes. There are, however, some teachers who take it upon themselves to write sufficiency notices for students who are excelling or have shown marked improvement. Such positive notices are also sent home and are welcomed by parents.
3. Direct Contact with Parents or Guardians. Teachers can establish direct contact with parents or guardians either through the telephone or by letter. Teachers should make it a point to contact parents or guardians by telephone when the student has shown a sudden turn for either the worse or the better in academic performance or in classroom behavior. That initiative by the teacher is usually welcomed by the parents and can lead to a productive conference with the teacher. Writing a letter gives the teacher time to think through and clarify his/her thoughts and concerns and to invite the parent to respond at his/her convenience by letter, by phone, or by arranging to have a conference with the teacher.