
LESSON 1:
Factors in Constructing Evaluative Instruments

Objective Instruments

Evaluation of learning, most of the time, requires the use of classroom testing such as oral tests, performance tests, or pencil-and-paper tests with objective and essay questions. These tests are considered objective instruments for assessing students' learning.

Classification of Objective Instruments

The following are descriptions of the different tests classified as objective instruments:

1. Achievement test - It measures how well a student has mastered specified instructional objectives.

2. Intelligence test - It measures the student's broad range of abilities.

3. Diagnostic test - It measures a student's strengths and weaknesses in a specific area of study.

4. Formative test - It measures a student's progress over a short period of time.

5. Summative test - It measures the extent to which students have attained the desired outcomes for a given chapter or unit.

6. Aptitude test - It measures the ability, or abilities, in a given area.

7. Survey test - It measures general achievement in a given subject or area and is more concerned with the scope of coverage.

8. Performance test - It measures a student's proficiency level in a skill. It requires manual or other motor responses.

9. Personality test - It measures the ways in which an individual's interest is focused with respect to other individuals, or in terms of the roles other individuals have ascribed to him and how he adapts in society.

10. Prognostic test - It is a test which predicts the student's future achievement in a specific subject area.

STEPS OF MEASUREMENT

The major steps for measurement include:

1. Refer back to your project's original goals and pre-defined outcomes, then identify:
a. the factors that you want to measure, and
b. who will be measured.

2. Determine the most appropriate design and data collection measures, such as qualitative or quantitative, and whether you will use a randomized control trial, a one-group pre-test/post-test measure, or another study design.

3. Search the literature to find studies similar to yours (both in terms of goals and population). Extract the constructs (or measurement scales) described and determine which (if any) are relevant for your population (e.g., "motivation to succeed" or "classroom engagement").

4. Search on these terms and constructs to find evaluation instruments (including interview guides or templates for ethnographic research for qualitative studies) that may already exist.

5. Define your measurement protocol, including the data collection techniques and the type of analysis that will be conducted on the data.

6. Collect and analyze your data according to your protocol.

7. Gather evidence of the reliability and validity of the instrument as used in your study.

8. Compare the data against your goals and pre-defined outcomes to determine how well you met your goals and which areas could be further improved.

SCORING PROCEDURE

Score the Assessment

1. Rubric Development: Develop scoring guides that match the criteria in the assessment frameworks.

2. Training Materials Development: Develop training materials after receiving actual student responses to the items during a pilot assessment.

3. Pilot Scoring: Identify and address any mismatches between what NCES expected from students, how they interpreted the item, and what students actually provided.

4. Operational Scoring: Seek to develop consensus/agreement by having the team score consistently according to the rubric and training sets.

5. Trend Scoring/Monitoring: Maintain consensus by scoring consistently with how items were scored in previous years.

Method of Recording and Reporting Assessment Results

Assessment is an Integral Part of What Teachers Always Do

1. PLANNING - Knowing and sharing what is to be learned
2. TEACHING - Assessment as part of effective learning and teaching
3. RECORDING - Summarizing success and progress
4. REPORTING - Providing useful feedback
5. EVALUATING - Using assessment to evaluate learning and teaching

Planning

In planning, assessment should be incorporated and evident in our Forward Plans. We need to plan for groups and appropriate individuals. We need to plan effective tasks and flexible teaching methods that maintain the correct balance across the curriculum. We should strive to design tasks that aid progress and are challenging, yet attainable.

• What am I teaching?
• How will I teach it?
• Who am I teaching?
• How will I know if the students understand?

Teaching

Teaching is the second stage of the assessment process and is based on the planning already done. Teachers should encourage a problem-solving approach and create an atmosphere which promotes the exploration of new ideas and activities. Teachers should use a variety of teaching and learning approaches to meet the needs of all pupils.

Recording

Our teachers' forward plans/programs of study can serve a dual purpose as plans and records. Recording will enable teachers to share successful learning with pupils and identify development needs and next steps. It will monitor the effectiveness of teaching and pupils' progress in relation to attainment outcomes and targets. It will enable teachers to report to parents and other teachers. It will also inform the Head Teacher about attainment levels in classwork.

a. Teaching records (as part of the forward plans)
• A brief indication of teaching methods used
• An evaluative comment on how the class/groups have coped
• A note of next steps

b. Individual records
• Folios/collections of work
• Indications of levels of achievement

c. Summaries of overall class performance
• Submission of test results to the Head Teacher three times a year

Generally, records of day-to-day progress will be kept by staff on a group/individual basis. At the senior end, the pupils themselves can complete most of these sheets.

Teachers should only note significant strengths and weaknesses in the formal record sheet and indicate the need for longer-term action such as reinforcement. Strengths, weaknesses and areas for development will be reported to parents through school reports.

What to Record?

Teachers should record for each pupil only what is useful and relevant for planning next steps in learning and for reporting progress. This should include brief comments on progress in relation to particular strengths and development needs.

When to Record?

Teachers should update their individual/group pupil records termly. Recording should also take place prior to planned parental consultation or reports to parents.

Kinds of Evidence We Can Use

• Conversation with pupils/parents/teachers
• Written evidence (pupil's work)
• Comments written on pupil's work
• Oral questioning
• Extension work - pupils able to apply knowledge/skills to new situations
• Cognitive Abilities Tests (C.A.T.) results
• Co-operation in a project

Reporting

Reporting will contribute to communication and cooperation amongst teachers and parents. Pupil reports serve a number of purposes: they provide feedback to pupils, they inform parents of their child's progress and provide an agenda for parents' meetings, and they pass information from one teacher to another or from one school to another.

Reporting to Pupils:
This can take the form of discussion or written comments on work, identifying areas of strength, setting targets for areas needing to be worked on, and encouraging any improvement or task well done (a simple sticker is very effective!).

Reporting to Parents:
Mostly, it is school policy to issue two written reports per year.

Reporting to Teachers:
This will take the form of passed-on record sheets, pupil reports, test results and informal discussions among teachers.

Reporting to the School Board:
The Head Teacher will provide information to the principal about overall attainment in the school.

Assessment Schedule

DAILY:
Regular feedback to pupils about success and progress in daily work.

WEEKLY:
Assess the week's work, e.g. spelling test, teacher's worksheets, meeting of short-term targets.

TERMLY:
Assess Forward Plans and the targets focused on for assessment. Consider appropriate assessment strategies. Reflect on previous assessment and evidence to evaluate the effectiveness of teaching.

AT END OF TOPIC:
Oral and written feedback identifying success and progress and areas of difficulty for the pupils.

RANGE OF DIFFICULTY
The item difficulty index is a common and very useful analytical tool for statistical analysis, especially when it comes to determining the validity of test questions in an educational setting. The item difficulty index is often called the p-value because it is a measure of proportion - for example, the proportion of students who answer a particular question correctly on a test. P-values are found by using the difficulty index formula, and they are reported in a range between 0.0 and 1.0.

In the scenario with students answering questions on a test, higher p-values, or p-values closer to 1.0, correspond with a greater proportion of students answering that question correctly. In other words, easier test questions will have greater p-values. That is why some statisticians also call the difficulty index "the easiness index" when they are performing an item analysis on data sets that have to do with education.

Different types of tests aim for different levels of easiness. Norm-referenced tests, for example, will have questions with varying levels of easiness because they are trying to create a wider spread in scores and categorize test takers into norms. Criterion-referenced tests, on the other hand, are trying to measure mastery. It is possible for criterion-referenced tests to have many questions with p-values close to 1.0.

Using the Difficulty Index Formula

The difficulty index formula is fairly easy to remember because it is the same as determining the percentage of students who answered the question correctly. The only difference is that the p-value is left as a decimal and is not converted to a percentage value out of 100.

The formula looks like this: the number of students who answer a question correctly (c) divided by the total number of students in the class who answered the question (s). The answer will equal a value between 0.0 and 1.0, with harder questions resulting in values closer to 0.0 and easier questions resulting in values closer to 1.0.

The formula: c ÷ s = p

Example: Out of the 20 students who answered question five, only four answered correctly.

Formula: 4 ÷ 20 = 0.2

Because the resulting p-value is closer to 0.0, we know that this is a difficult question.
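The calculation above is simple enough to script. Below is a minimal Python sketch of the p = c ÷ s computation; the function name and the guard against a zero denominator are my own additions, not part of the text.

```python
def difficulty_index(correct: int, answered: int) -> float:
    """Item difficulty (p-value): proportion of examinees who answered correctly."""
    if answered == 0:
        raise ValueError("at least one examinee must have answered the item")
    return correct / answered

# Worked example from the text: 4 of the 20 students who answered question five got it right.
p = difficulty_index(correct=4, answered=20)
print(p)  # 0.2 -> closer to 0.0, so this is a difficult item
```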
Item Discrimination Index

The discrimination index is another way that test writers can evaluate the validity of their tests. Item discrimination evaluates how well an individual question sorts students who have mastered the material from students who have not. Test takers with mastery of the material should be more likely to answer a question correctly, whereas students without mastery of the material should get the question wrong.

Questions that do a good job of sorting those students who have mastered the material from students who have not are called "highly discriminating." Such tests can be very beneficial, especially in areas where mastery is key, such as in medical certification.

Item discrimination is measured in a range between -1.0 and 1.0. Negative discrimination indicates that students who are scoring highly on the rest of the test are answering that question wrong. This could mean that there is a problem with the question, such as bias or even a typo in the answer key. Test writers should re-evaluate questions that result in negative discrimination because they do not help to show mastery.

FINDING THE ITEM DISCRIMINATION INDEX

Determining item discrimination is more complicated and involves more steps than finding an item's difficulty. First, create a table of your students along with their test scores. In a third column, indicate whether each student answered the question you are measuring correctly by placing a 1 (for correct answers) or a 0 (for incorrect answers) in the corresponding box.

Now, arrange your students from highest scorers to lowest scorers, with the highest scorers at the top. Divide the table in half between high and low scorers, with an equal number of students on each side of the dividing line. Subtract the number of students in the lower-scoring group who answered the question correctly (lc) from the number of students in the higher-scoring group who answered the question correctly (hc). Then, divide the resulting number by the number of students on each side of your dividing line, which should be half of the class (t).

Item discrimination = (hc - lc) ÷ t

You have a class of 20 students, so after you arrange them by score in a table, you should have 10 on each side of the dividing line. If six students in the higher-scoring group answered the question correctly, and six students in the lower-scoring group also answered the question correctly, you should already know without doing the math that the item is not very discriminatory. However, we can still measure with the formula.

The formula for this problem should look like this:

(6 - 6) ÷ 10 = 0

This item is not a good measure of mastery because its discrimination index is zero.

If, however, six students in the higher-scoring group answer correctly, and only two students in the lower-scoring group answer correctly, the item is a much better measure of mastery. The new formula would look like this:

(6 - 2) ÷ 10 = 0.4

Although the number could be higher, this question would still be a decent indicator of whether or not the student understood the material.
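Here is a small Python sketch of the half-split procedure just described; the data layout (one (total score, item-correct) pair per student) and the sample class are illustrative assumptions.

```python
def discrimination_index(results):
    """results: one (total_score, answered_this_item_correctly) pair per student.

    Half-split procedure described above: rank by total score, split the class
    into equal high/low halves, then compute (hc - lc) / t."""
    ranked = sorted(results, key=lambda r: r[0], reverse=True)
    t = len(ranked) // 2                      # students on each side of the dividing line
    high, low = ranked[:t], ranked[-t:]
    hc = sum(1 for _, correct in high if correct)
    lc = sum(1 for _, correct in low if correct)
    return (hc - lc) / t

# Second example from the text: class of 20, 6 correct answers in the upper half
# and 2 in the lower half (the individual total scores are made up).
demo = [(90, True)] * 6 + [(85, False)] * 4 + [(60, True)] * 2 + [(55, False)] * 8
print(discrimination_index(demo))  # (6 - 2) / 10 = 0.4
```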
RESTRICTED RESPONSE ITEMS & EXTENDED RESPONSE ITEMS

• Restricted response items. On restricted response items, examinees provide brief answers, usually no more than a few words or sentences, to fairly structured questions.

• Extended response items. These items require lengthy responses that count heavily in scoring. They focus on major concepts of the content unit and demand higher-level thinking. Examinees must organize multiple ideas and provide supporting information for major points in crafting responses.

Advantages of restricted response items:
➤ They measure specific learning outcomes.
➤ Restricted response items provide for more ease of assessment.
➤ A restricted response item is more structured.
➤ Any outcome measured by an objective interpretive exercise can be measured by a restricted subjective item.

Limitations of restricted response items:
➤ They restrict the scope of the topic to be discussed and indicate the nature of the desired response, which limits the student's opportunity to demonstrate the behavior.

Advantages of extended response items:
➤ They measure knowledge at the higher cognitive levels of educational objectives, such as analysis, synthesis and evaluation.
➤ They expose individual differences in terms of attitudes, values and creative thinking.

Limitations of extended response items:
➤ They are insufficient for measuring knowledge of factual material because they call for extensive detail in one selected content area at a time.
➤ Scoring is difficult and unreliable.
LESSON 2:
DEVELOPMENT OF ASSESSMENT TOOLS

• KNOWLEDGE AND REASONING
• TYPES OF OBJECTIVE TESTS
• PLANNING A TEST AND CONSTRUCTION OF A TABLE OF SPECIFICATIONS

What is an objective test?

Objective tests are questions whose answers are either correct or incorrect.

Types of objective tests:

1. SUPPLY TYPE

The student constructs his/her own answer to each question. The following types of tests fall under the supply type of objective test:

• Completion drawing type - an incomplete drawing is presented which the student has to complete.

• Completion statement type - an incomplete sentence is presented and the student has to complete it by filling in the blank.

Example: _____ is the capital city of the Philippines.

• Correction type - a sentence with an underlined word or phrase is presented, which the student has to replace to make it right.

Example: Direction: Change the underlined word/phrase to make each of the following statements correct. Write your answer on the space before each number.
______1. The first (1st) President of the Republic of the Philippines is President Rodrigo Roa Duterte.
______2. Pangasinan is the capital city of the Philippines.

• Identification type - a brief description is presented and the student has to identify what it is.

Example: To what does each of the following refer? Write your answer on the blank.
_______1. A study of past events, specifically the people, societies, events and problems of the past.

• Simple recall type - a direct question is presented for the student to answer using a word or a phrase.

Example: Who is the national hero of the Philippines?

• Short explanation type - similar to an essay test but requires a shorter answer.

Example: Explain in a complete sentence why the Philippines was not really discovered by Magellan.

2. SELECTION TYPE

Included in the category of selection type are:

• Arrangement type - terms or objects are to be arranged by the students in a specified order.

Example: Arrange the following objects from largest to smallest by writing A, B, C, D on the space provided.
___ golf ball
___ sepak takraw ball
___ volleyball
___ basketball
• Matching type - a list of numbered items is matched to a list of lettered choices.

Example: Match the country in Column 1 with its capital in Column 2. Write letters only.

Column 1
1. Philippines
2. Japan
3. United States

Column 2
a. Manila
b. Washington, D.C.
c. Tokyo

• Multiple choice - this type contains a question, problem or unfinished sentence followed by several answer options.

Example:
What is the study of past events?
1. History
2. Science
3. Mathematics
4. English

• Alternate response type - a test wherein there are two possible answers to the question, e.g. the TRUE-FALSE form of the alternative response type. Variations on the true-false format include Yes-No, Agree-Disagree, and Right-Wrong.

Example:
TRUE or FALSE
1. President Emilio Aguinaldo is the 1st President of the Republic of the Philippines.
2. Apolinario Mabini is the "Ama ng Katipunan."

Planning a Test and Construction of a Table of Specifications

Important Steps in Planning for a Test

1. Identifying Test Objectives

An objective test, if it is to be comprehensive, must cover the various levels of Bloom's Taxonomy.

Example: We want to construct a test on the topic "Subject-Verb Agreement in English" for a Grade V class. The following are typical objectives. The students must be able to:

1. KNOWLEDGE. Identify the subject and the verb in a given sentence.
2. COMPREHENSION. Determine the appropriate form of a verb to be used given the subject of the sentence.
3. APPLICATION. Write sentences observing rules on subject-verb agreement.
4. ANALYSIS. Break down a given sentence into its subject and verb.
5. SYNTHESIS/EVALUATION. Formulate rules to be followed regarding subject-verb agreement.

2. Deciding on the Type of Test to Be Prepared

The test objectives guide the kind of test that will be designed and constructed by the teacher. For instance, for the first four levels we may want to construct a multiple-choice type of test, while for application and judgment we may opt to give an essay test.

3. Preparing a Table of Specifications (TOS)

A table of specifications (TOS) is a test map that guides the teacher in constructing a test. It ensures that there is a balance between items that test lower order thinking skills (LOTS) and those which test higher order thinking skills (HOTS).

4. Constructing the Draft of Test Items

The actual construction of the test items follows the TOS. As a general rule, it is advised that the number of items constructed in the draft should be double the desired number of items. The subsequent try-out and item analysis will most likely eliminate many of the constructed items in the draft (because they are too difficult, too easy or non-discriminatory). A rough sketch of this doubling rule follows this list.

5. Item Analysis and Try-out

The test draft is tried out with a group of pupils. The purpose of this try-out is to determine (a) item characteristics through item analysis, and (b) characteristics of the test itself: validity, reliability and practicality.
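As a rough illustration of steps 3 and 4, the sketch below doubles an assumed set of desired item counts per objective to get draft counts; the objective labels and numbers are placeholders, not prescribed by the text.

```python
# Desired item counts per objective are assumptions for illustration;
# step 4's rule is to draft double the desired number of items.
desired_items = {
    "Knowledge": 10,
    "Comprehension": 10,
    "Application": 10,
    "Analysis": 5,
    "Synthesis/Evaluation": 5,
}

draft_items = {objective: 2 * n for objective, n in desired_items.items()}

for objective, n in draft_items.items():
    print(f"{objective}: draft {n} items for a target of {desired_items[objective]}")
```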
CONSTRUCTING A TRUE-FALSE TEST

What is a True-False Test?

True-false tests contain statements that the student marks as being either true or false. In order to qualify as true, all parts of the statement must be true. In general, true-false tests check your knowledge of facts.

• It is a binomial-choice test.
• True-false questions can also be called alternative response questions, and the test an alternative-response test.
• It is an objective test presented in the form of a simple declarative statement, to which the student responds by indicating whether the statement is true or false.

SKILLS MASTERED BY A TRUE-FALSE TEST

Good for:
• Knowledge-level content
• Evaluating student understanding of popular misconceptions
• Concepts with two logical responses
(Acquisition of higher-level thinking skills is not given due importance.)

ADVANTAGES OF A TRUE-FALSE TEST

True-false tests can provide:
• the widest sampling of content or objectives per unit of testing time;
• scoring efficiency and accuracy;
• versatility in measuring all levels of cognitive ability;
• highly reliable test scores; and
• an objective measurement of student achievement or ability.

LIMITATIONS OF A TRUE-FALSE TEST

1. They incorporate an extremely high guessing factor. For a simple true-false test, each student has a 50/50 chance of answering an item correctly without any knowledge of the item's content (a small probability sketch follows this list).

2. They can often lead a teacher to write ambiguous statements, due to the difficulty of writing statements which are unequivocally true or false.

3. They do not discriminate between students of varying ability as well as other item types.

4. They can often include more irrelevant clues than do other item types.

5. They can often lead a teacher to favor the testing of trivial knowledge.
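To see how strong the guessing factor in limitation 1 can be, the sketch below uses the binomial distribution to estimate how often a student who guesses every item on a short true-false test still reaches an assumed 60% cut-off; the test length and cut-off are illustrative assumptions, not figures from the text.

```python
from math import comb

def prob_at_least(n_items: int, k_correct: int, p_guess: float = 0.5) -> float:
    """Probability of getting at least k_correct of n_items right by blind guessing."""
    return sum(
        comb(n_items, k) * p_guess**k * (1 - p_guess) ** (n_items - k)
        for k in range(k_correct, n_items + 1)
    )

# Illustrative numbers: a 10-item true-false test with an assumed cut-off of 6 correct answers.
print(round(prob_at_least(10, 6), 3))  # ~0.377: guessing alone reaches the cut-off about 38% of the time
```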
RULES IN CONSTRUCTING A TRUE-FALSE TEST

RULE 1. Do not give a hint in the body of the question.

RULE 2. Avoid using the words "always", "never", "often", and other adverbs that tend to make a statement either always true or always false.

RULE 3. Avoid tricky statements with some minor misleading word or spelling anomaly, misplaced phrases, etc.

RULE 4. Use negative statements sparingly and do not use double negatives.

RULE 5. Avoid long sentences, as these tend to be "true".

RULE 6. Avoid using more than one idea in a true-false question. Make your main point prominent.

RULE 7. Avoid specific determiners or give-away qualifiers.

RULE 8. Avoid quoting verbatim from reference materials.

RULE 9. Avoid a grossly disproportionate pattern in the occurrence of true and false statements.

RULE 10. Avoid the use of unfamiliar vocabulary words.

CONSTRUCTING A MULTIPLE-CHOICE TEST

WHAT IS A MULTIPLE-CHOICE TEST?

A multiple-choice test is a question type where the student is asked to choose one or more items from a limited list of choices. A multiple-choice question consists of a stem and the options.

WHAT IS THE STEM?

The STEM is the beginning part of the item that presents the item as a problem to be solved, a question, or an incomplete statement to be completed.

WHAT ARE THE OPTIONS?
The OPTIONS are the possible answers you can choose from, with the correct answer called the key and the incorrect answers called distractors.

SKILLS MASTERED BY A MULTIPLE-CHOICE TEST

Application, synthesis, analysis and evaluation levels of learning, such as:

> comparison between fact and opinion;
> interpreting cause-and-effect relationships, or even charts and graphs;
> making inferences from given data.
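One possible way to represent the stem/options/key/distractor structure in code is sketched below; the class and field names are illustrative, not a prescribed format.

```python
from dataclasses import dataclass

@dataclass
class MultipleChoiceItem:
    stem: str
    options: dict   # letter -> option text
    key: str        # letter of the correct answer (the "key")

    def distractors(self) -> list:
        # every option that is not the key is a distractor
        return [letter for letter in self.options if letter != self.key]

    def score(self, response: str) -> int:
        # 1 point for the key, 0 for any distractor
        return 1 if response == self.key else 0


item = MultipleChoiceItem(
    stem="What is the study of past events?",
    options={"A": "History", "B": "Science", "C": "Mathematics", "D": "English"},
    key="A",
)
print(item.score("A"), item.distractors())  # 1 ['B', 'C', 'D']
```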

ADVANTAGES AND DISADVANTAGES OF A MULTIPLE-CHOICE TEST

Advantages:
> Can cover a broad range of content.
> There is no room for subjectivity because it is easily scored and graded.
> Measures a variety of levels of learning.
> When well constructed, it has proven to be an effective assessment tool.
> Requires a minimum of writing for students.

Disadvantages:
> It is difficult to produce plausible distractors/alternative responses.
> It tends to focus on low-level learning objectives.
> Construction of good items is time consuming.
> Measuring the ability to organize ideas is not possible.

Construction of a Table of Specifications

TABLE OF SPECIFICATIONS

The table of specifications (TOS) is a tool used to ensure that a test or assessment measures the content and thinking skills that the test intends to measure. Thus, when used appropriately, it can provide response content and construct (i.e., response process) validity evidence.

Tables of specifications typically are designed based on the list of course objectives, the topics covered in class, the amount of time spent on those topics, textbook chapter topics, and the emphasis and space provided in the text. (A small allocation sketch follows this section.)

• Tables of specifications can help students at all ability levels learn better. By providing the table to students during instruction, students can recognize the main ideas, key skills, and the relationships among concepts more easily.

• The table of specifications can act in the same way as a concept map to analyze content areas. Teachers can even collaborate with students on the construction of the table of specifications: what are the main ideas and topics, what emphasis should be placed on each topic, what should be on the test?

• Open discussion and negotiation of these issues encourage higher levels of understanding while also modeling good learning and study skills.
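Here is a minimal sketch of one way such a table might be filled in, assuming items are allocated to topics in proportion to class time spent; the topic names, hours and the 60-item total are illustrative assumptions.

```python
# Topic names, hours and the 60-item total are illustrative assumptions.
hours_per_topic = {"Subject-verb agreement": 6, "Tenses": 4, "Punctuation": 2}
total_items = 60

total_hours = sum(hours_per_topic.values())
tos = {
    topic: round(total_items * hours / total_hours)  # items in proportion to time spent
    for topic, hours in hours_per_topic.items()
}
print(tos)  # {'Subject-verb agreement': 30, 'Tenses': 20, 'Punctuation': 10}
```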

LESSON 3:
VALIDITY AND ITEM ANALYSIS

ITEM ANALYSIS

Item analysis is a statistical technique which is used for selecting and rejecting the items of a test on the basis of their difficulty value and discriminating power.

OBJECTIVES OF ITEM ANALYSIS

• To select appropriate items for the final draft
• To obtain information about the difficulty value (D.V) of all the items
• To obtain the discriminating power (D.I), which differentiates between capable and less capable examinees, for each item
• To identify modifications to be made in some of the items
• To prepare the final draft properly (easy to difficult items)

STEPS OF ITEM ANALYSIS

• Arrange the scores in descending order.
• Separate two subgroups of the test papers.
• Take the 27% of scores at the top and the 27% of scores falling at the bottom.
• Count the number of right answers in the highest group (R.H).
• Count the number of right answers in the lowest group (R.L).
• Count the non-responding (N.R) examinees.

Item analysis is done to obtain:
a) the difficulty value (D.V), and
b) the discriminating power (D.P).

DIFFICULTY VALUE (D.V)

"The difficulty value of an item is defined as the proportion or percentage of the examinees who have answered the item correctly." - J.P. Guilford

In case non-responding examinees are present, the formula for the difficulty value (D.V) is:

D.V = (R.H + R.L) / [(N.H + N.L) - N.R]

• R.H - number answering rightly in the highest group
• R.L - number answering rightly in the lowest group
• N.H - number of examinees in the highest group
• N.L - number of examinees in the lowest group
• N.R - number of non-responding examinees

General guidelines for the difficulty value (D.V):

• A low difficulty value index means that the item is a highly difficult one.
Example: D.V = 0.20 » only 20% answered that item correctly, so the item is too difficult.

• A high difficulty value index means that the item is an easy one.
Example: D.V = 0.80 » 80% answered that item correctly, so the item is too easy.

DISCRIMINATION INDEX (D.I)

"Index of discrimination is that ability of an item on the basis of which the discrimination is made between superiors and inferiors." - Blood and Budd (1972)

TYPES OF DISCRIMINATION INDEX (D.I)

+ Zero discrimination or no discrimination
+ Positive discrimination
+ Negative discrimination

ZERO DISCRIMINATION OR NO DISCRIMINATION

• The item is answered correctly (or the answer is known) by all the examinees; or
• the item is not answered correctly by any of the examinees.

POSITIVE DISCRIMINATION INDEX

An item is correctly answered by superiors and is not answered correctly by inferiors. The discriminative power ranges from +1 to -1.

NEGATIVE DISCRIMINATION INDEX

An item is correctly answered by inferiors and is not answered correctly by superiors.

The formula for the discrimination index (D.I):

D.I = (R.H - R.L) / (N.H or N.L)

• R.H - number answering rightly in the highest group
• R.L - number answering rightly in the lowest group
• N.H - number of examinees in the highest group
• N.L - number of examinees in the lowest group

Another method for the discrimination index (D.I): if you take Likert-scale data collection scores, compute the correlation between each item's scores and the total score.
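The formulas above translate directly into code. The sketch below implements the 27% upper/lower split and the D.V and D.I formulas as given; the sample counts are illustrative assumptions.

```python
def upper_lower_groups(total_scores, fraction=0.27):
    """Steps 1-3 of the procedure: rank examinees by total score and return the
    indices of the top and bottom 27% (illustrative helper)."""
    ranked = sorted(range(len(total_scores)), key=lambda i: total_scores[i], reverse=True)
    n = max(1, round(fraction * len(total_scores)))
    return ranked[:n], ranked[-n:]

def difficulty_value(rh, rl, nh, nl, nr=0):
    """D.V = (R.H + R.L) / [(N.H + N.L) - N.R], as given above."""
    return (rh + rl) / ((nh + nl) - nr)

def discrimination_index(rh, rl, n_per_group):
    """D.I = (R.H - R.L) / N, where N is the size of one extreme group."""
    return (rh - rl) / n_per_group

# Illustrative counts (assumptions): 27 examinees in each extreme group,
# 20 correct in the upper group, 8 correct in the lower group, 2 non-responses.
print(round(difficulty_value(20, 8, 27, 27, nr=2), 2))  # 0.54
print(round(discrimination_index(20, 8, 27), 2))        # 0.44
```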
Relationship between Difficulty Value and Discrimination Power

• Both (D.V & D.I) are complementary, not contradictory, to each other.
• Both should be considered in selecting good items.
• If an item discriminates negatively or has zero discrimination, it is to be rejected whatever its difficulty value.

CRITERIA FOR SELECTING AND REJECTING ITEMS

• Only items with a positive discrimination index are selected.
• Items with a negative or zero discrimination index are rejected.
• Items with very high or very low difficulty values are rejected.
(A short filter sketch follows this list.)
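Below is a small sketch of these selection rules, assuming an illustrative acceptable difficulty band of 0.25-0.75; the band is my assumption, since the text only says to reject very high and very low values.

```python
def keep_item(dv: float, di: float, dv_band=(0.25, 0.75)) -> bool:
    """Selection rules above: keep only positively discriminating items whose
    difficulty value is neither too high nor too low (the 0.25-0.75 band is assumed)."""
    return di > 0 and dv_band[0] <= dv <= dv_band[1]

items = {"Q1": (0.54, 0.44), "Q2": (0.90, 0.10), "Q3": (0.50, -0.20)}  # (D.V, D.I)
selected = [name for name, (dv, di) in items.items() if keep_item(dv, di)]
print(selected)  # ['Q1'] -> Q2 is too easy, Q3 discriminates negatively
```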

VALIDATION

PURPOSE: to determine the characteristics of the whole test itself, namely, the validity and reliability of the test.

VALIDITY

Validity is the extent to which a test measures what it intends to measure; it refers to the appropriateness, correctness, meaningfulness and usefulness of the specific decisions a teacher makes based on the test results.

Three main types of evidence may be collected:
• Content-related evidence of validity
• Criterion-related evidence of validity
• Construct-related evidence of validity

RELIABILITY

Reliability refers to the consistency of the scores obtained.

LESSON 4:
PORTFOLIO ASSESSMENT METHODS

WHAT IS A PORTFOLIO?

• A portfolio is a purposeful collection of student work that exhibits the student's efforts, progress and achievement in one or more areas.
• The collection must include student participation in selecting contents, the criteria for selection, the criteria for judging merit, and evidence of student self-reflection.

Features and Principles of Portfolio Assessment

Portfolio assessment possesses several features and essential characteristics, which are:

1. A portfolio is a form of assessment that students do together with their teachers. The teachers guide the students in the planning, execution and evaluation of the contents of the portfolio. Together, they formulate the overall objectives for constructing the portfolio. As such, students and teachers interact in every step of the process of developing a portfolio.

2. A portfolio represents a selection of what the students believe are best included from among the possible collection of things related to the concept being studied. It is the teacher's responsibility to assist the students in actually choosing from among a possible set of choices to be included in the portfolio. However, the final selection should be done by the students themselves, since the portfolio represents what the students believe are important considerations.

3. A portfolio provides samples of the student's work which show growth over time. By reflecting on their own learning (self-assessment), students begin to identify the strengths and weaknesses in their work. These weaknesses then become improvement goals.
4. The criteria for selecting and assessing the portfolio contents must be clear to the teacher and the students at the outset of the process. If the criteria are not clear at the beginning, then there is a tendency to include unessential components in the portfolio and to include those which happen to be available at the time the portfolio is prepared. At each step of the process, the students need to refer to the agreed set of criteria for the construction and development of the portfolio.

Purposes of Portfolio Assessment

WHY SHOULD WE RESORT TO PORTFOLIO ASSESSMENT METHODS?

Portfolio assessment has several purposes and rationales for its use.

• FIRST, PORTFOLIO ASSESSMENT MATCHES ASSESSMENT TO TEACHING. The final outputs to be assessed are products of classroom discussions and classroom work and are not simple diversions from the tedium of classroom activities.

• SECOND, PORTFOLIO ASSESSMENT HAS CLEAR GOALS. IN FACT, THEY ARE DECIDED AT THE BEGINNING OF INSTRUCTION AND ARE CLEAR TO TEACHER AND STUDENT ALIKE. In cognitive testing, the objectives are set at the beginning but the actual items may or may not reflect achievement of such objectives. In portfolio assessment, however, the students control the items to be included and are therefore assured that the goals are achieved.

• THIRD, PORTFOLIO ASSESSMENT GIVES A PROFILE OF LEARNER ABILITIES IN TERMS OF DEPTH, BREADTH, AND GROWTH. Portfolio assessment enables students to demonstrate quality work done without the pressure and constraints of time present in traditional testing, and with the help of resources.

• FOURTH, PORTFOLIO ASSESSMENT IS A TOOL FOR ASSESSING A VARIETY OF SKILLS NOT NORMALLY TESTABLE IN A SINGLE SETTING IN TRADITIONAL TESTING. The portfolio can show written, oral and graphic outputs of students in a variety of ways which demonstrate skills developed by the students.

• FIFTH, PORTFOLIO ASSESSMENT DEVELOPS STUDENTS' AWARENESS OF THEIR OWN LEARNING. Students have to reflect on their own progress and the quality of their work in relation to known goals. This is achieved at each stage of the process, since the students continually refer to the set of goals and objectives set at the beginning.

• SIXTH, PORTFOLIO ASSESSMENT CATERS TO INDIVIDUALS IN A HETEROGENEOUS CLASS. Such flexibility is attributed to the fact that portfolio assessment is open-ended, so that students can demonstrate their abilities at their own level; it caters to differential learning styles and the expression of varying strengths.

• SEVENTH, PORTFOLIO ASSESSMENT DEVELOPS SOCIAL SKILLS. STUDENTS INTERACT WITH OTHER STUDENTS IN THE DEVELOPMENT OF THEIR OWN PORTFOLIOS. Sometimes they are assessed on work done in groups or in pairs, so that they necessarily have to interact and collaborate to complete the tasks.

• EIGHTH, PORTFOLIO ASSESSMENT DEVELOPS INDEPENDENT AND ACTIVE LEARNERS. Students must select and justify portfolio choices, monitor progress and set learning goals. Traditional testing cannot achieve this educational objective no matter how skillfully the tests are constructed.

• NINTH, PORTFOLIO ASSESSMENT CAN IMPROVE MOTIVATION FOR LEARNING AND THUS ACHIEVEMENT. When students are empowered to prove their own achievement and worth, they become highly motivated to pursue the learning tasks. It is when they lose this feeling of empowerment that they feel inadequate and become less motivated, as in the traditional classroom setting.

• TENTH, PORTFOLIO ASSESSMENT PROVIDES OPPORTUNITY FOR STUDENT-TEACHER DIALOGUE. It enables the teacher to get to know every student. Moreover, portfolio assessment promotes joint goal-setting and negotiation of grades, which can never happen in a traditional setting.
Essential Elements of the Portfolio

EVERY PORTFOLIO MUST CONTAIN THE FOLLOWING ESSENTIAL ELEMENTS:

1. Cover Letter: "About the author" and "What my portfolio shows about my progress as a learner" (written at the end, but placed at the beginning). The cover letter summarizes the evidence of a student's learning and progress.

2. Table of Contents with numbered pages.

3. Entries, both core (items students have to include) and optional (items of the student's choice). The core elements will be required for each student and will provide a common base from which to make decisions on assessment. The optional items will allow the folder to represent the uniqueness of each student. Students can choose to include "best" pieces of work, but also a piece of work which gave them trouble or one that was less successful, and give reasons why.

4. Dates on all entries, to facilitate proof of growth over time.

5. Drafts of aural/oral and written products and revised versions, i.e., first drafts and corrected/revised versions.

6. Reflections, which can appear at different stages in the learning process (for formative and/or summative purposes) and, at the lower levels, can be written in the mother tongue or by students who find it difficult to express themselves in English.

For each item, a brief rationale for choosing the item should be included. This can relate to students' performance, to their feelings regarding their progress, and/or to themselves as learners. Students can choose to reflect upon some or all of the following:

• What did I learn from it?
• What did I do well?
• Why (based on the agreed teacher-student assessment criteria) did I choose this item?
• What do I want to improve in the item?
• How do I feel about my performance?
• What were the problem areas?

STAGES IN IMPLEMENTING PORTFOLIO ASSESSMENT

Stage 1: Identifying the teaching goals to assess through the portfolio

• The first step in organizing portfolio assessment is to establish the teaching goals.
• Be clear about what the teacher hopes to achieve in teaching.
• These goals guide the selection and assessment of the students' work.

Stage 2: Introducing the idea of portfolio assessment to your class

• Introduce the concept to the class.
• Explain the meaning of the word "portfolio".
• Explain that portfolio assessment will assess the learners in a much fairer way than the traditional testing method.

Stage 3: Specification of portfolio content

• Specify how much is to be included in the portfolio, both core and optional (it is important to include options, as these enable self-expression and independence).
• Portfolio entries can take many forms: written work, audio and video recordings, items, and artifacts, e.g. drawings, models, etc.

Stage 4: Giving clear and detailed guidelines for portfolio presentation

• Present as much evidence of learning as the students can on their own.
• Explain the need for:
  • clear and attractive presentation
  • dated drafts
  • attached reflections or comment cards

Stage 5: Informing key school officials, parents and other stakeholders

• Make sure that the school principal is aware of your new assessment procedures.
• It is also a good idea to inform parents about the portfolio assessment and allow them to comment on the work.

Stage 6: Development of the portfolio

• Support and encouragement are required from both teacher and students at this stage.
• Devote class time to student-teacher conferences, to practicing reflection and self-assessment, and to portfolio preparation.
• Give guiding feedback.
• Ownership: ensure that the portfolio represents the student's own work.

Guide for self-reflection and self-assessment:
• What did I learn from that activity?
• Which is my best piece?
• How can I improve this?
• Brainstorming
• Portfolio partners

TYPES OF PORTFOLIO

Documentary Portfolio

As the name implies, this approach involves a collection of work over time showing growth and improvement, reflecting students' learning of identified outcomes. This portfolio is called a growth portfolio in the literature. The collection becomes meaningful when specific items are selected to focus on particular educational experiences or goals.

Process Portfolio

The process portfolio, in contrast, demonstrates all facets or phases of the learning process. As such, these portfolios contain an extensive number of reflective journals, think logs and other related forms of metacognitive processing.

Showcase Portfolio

The showcase portfolio only shows the best of the student's outputs and products.

Assessing and Evaluating the Portfolio

According to Paulson, Paulson and Meyer, portfolios offer a way of assessing student learning that is different from traditional methods. Portfolio assessment provides the teacher and students an opportunity to observe students in a broader context: taking risks, developing creative solutions and learning to make judgments about their own performances.

PORTFOLIOS INCLUDE THE FOLLOWING:

• Thoughtfulness (including evidence of students' monitoring of their own comprehension, metacognitive reflection and productive habits of mind).
• Growth and development in relationship to key curriculum expectancies and indicators.
• Understanding and application of key processes.
• Completeness, correctness and appropriateness of products and processes presented in the portfolio.
• Diversity of entries (e.g., use of multiple formats to demonstrate achievement of designated performance standards).

ASSESSMENT TOOLS:

• Self/peer assessment with rating scales
• Checklist with criteria (such as clear presentation, relevant vocabulary, correct spelling/pronunciation), depending on the task
• Teacher/peer observation
• Learning log
• Answer key
• Guided reflection on the task

One of the more significant aspects of portfolio assessment is its "collaborative approach," in which students and teachers work together to identify especially significant or important artifacts and processes to be captured in the portfolio.

Student-Teacher Conferences

The main philosophy embedded in portfolio assessment is "shared and active assessment". The teacher should have short individual meetings with each student, in which progress is discussed and goals are set for a future meeting.

The teacher and the student keep careful documentation of the meeting, noting significant agreements and findings of each individual session. This facilitates the formative evaluation process of portfolio assessment. Indeed, the use of portfolio assessment takes time, but in the end the gains are worth it.

Finally, student-teacher conferences can also be used for summative evaluation purposes, when the student presents the final portfolio product and final grades are determined together with the teacher. This conference can be prepared in pairs, where students practice presenting their portfolios.
