Designing An Effective Evaluation Strategy


Program Evaluation defined

Evaluation is the systematic application of scientific methods to assess the design, implementation, improvement, or outcomes of a program (Rossi & Freeman, 1993; Short, Hennessy, & Campbell, 1996). The term "program" may include any organized action, such as media campaigns, service provision, educational services, public policies, or research projects (Centers for Disease Control and Prevention [CDC], 1999).

Why do we do Evaluation?

 Establish model programs and best practices by providing feedback about what worked and what failed
 A tool of good management and quality improvement: gain insight into effective strategies for improving performance
 Measure the impact the program is making
 Required by funders

Purpose of Program Evaluation


 Demonstrate program effectiveness to funders
 Improve the implementation and effectiveness of programs
 Better manage limited resources
 Document program accomplishments
 Justify current program funding
 Support the need for increased levels of funding
 Satisfy ethical responsibility to clients to demonstrate positive and negative effects of
program participation (Short, Hennessy, & Campbell, 1996).
 Document program development and activities to help ensure successful replication
 Effective evaluation is not an "event" that occurs at the end of a project, but is an
ongoing process which helps decision makers better understand the project; how it is
impacting participants, partner agencies and the community; and how it is being
influenced/impacted by both internal and external factors.

 W.K. Kellogg Foundation Evaluation Handbook, p. 3

LOGIC MODEL connection

Framework for Evaluation


Underlying Logic of Steps

 No evaluation is good unless… results are used to make a difference


 No results are used unless… a market has been created prior to creating the product
 No market is created unless… the evaluation is well-focused, including the most relevant and useful questions

Program Planning Process

Evaluation is part of a larger program planning process. First, you plan the program. Then you
implement it. Then you evaluate it. You use what you learn from the evaluation to improve your
program. And then you start planning your improved program.

Continuous Quality Improvement (CQI) cycle.

These must be integrated…


 Planning: what actions will best reach our goals and objectives.
 Performance measurement: how are we doing?
 Evaluation: why are we doing well or poorly?
Evaluation Flowchart

Types of Evaluation

 Planning (involves examining the developmental issues prior to setup)


 Process or Formative (involves monitoring the “process,” ensuring activities are
completed on time and on target, while the program is ongoing)
 Tells you if you’re on track
 Points to improvement
 Outcome or Summative (involves assessing the outcome at the conclusion of the program and measuring the change that has occurred as a result of the program)
 Shows what impact you have on the problem
 Helps justify the program
 Cost Benefit

Planning Evaluation
 Why is the program needed?
 Identify target population
 Who needs to be involved in the planning?
 Identify key players/agencies
 What are the goals of the program?
 Identify goals from perspective of various key stakeholders/agencies
 What resources are necessary?
 Identify financial resources
 Identify non-financial resources
 What is the timeline?
 Determine timeline to program implementation

Planning and Process Evaluation Goals

1) To examine developmental issues prior to setup

2) To assess the steps that occur within the program set-up phase. Key questions to be answered through this type of evaluation include:
 How was the program set-up initiated?
 What agencies are involved in its daily operation?
 How were collaborations developed, and how are they sustained?
 Overall, how does the program function to serve the clients?
 What resources (both financial and non-financial) are available/needed?
 What is the timeline for the project?

Process Evaluation Data – Types and Collection


 TYPES: (Which is more useful?)
 Quantitative
 Qualitative
 How to collect data for process evaluation:
 Interviews
 Focus groups
 Observation
 Questionnaires/Surveys
 Analysis of existing documents
Outcome Evaluation

 Outcome evaluations are sometimes referred to as “true” or “real” evaluation


 Outcome evaluation is no more “true” or “real” than process evaluation
 Indeed, a quality process evaluation is necessary if one is going to say what it is about the
program that produced the results

Outcome Evaluation Data – Types and Collection


 The essence of program outcome evaluation is comparison (i.e., multiple data collection points to compare): an assessment of change over time, done through the use of:
 Control group
 Comparison group
 Pre- and post-test
 The data collected is most often quantitative but may be qualitative, given the nature of the program or project.
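
As an illustration of the pre/post comparison idea (a minimal sketch, not part of the source material; the scores and variable names below are hypothetical), average change can be computed from paired pre- and post-test scores:

# Hypothetical pre- and post-test scores for the same participants,
# paired by position in the lists; values are invented for illustration.
from statistics import mean

pre_scores  = [52, 61, 47, 70, 58, 64]
post_scores = [66, 72, 55, 78, 63, 71]

# Change score for each participant, then the average change over time.
changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
print("Mean pre-test score: ", round(mean(pre_scores), 1))
print("Mean post-test score:", round(mean(post_scores), 1))
print("Mean change:         ", round(mean(changes), 1))

A comparison or control group would be handled the same way, with change scores computed separately for each group and then compared.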

The Four Standards of Evaluation


There is no one "right" evaluation. Instead, the best choice at each step is the option that maximizes:
 Utility: Who needs the info from this evaluation and what info do they need?
 Feasibility: How much money, time, and effort can we put into this?
 Propriety: Who needs to be involved in the evaluation to be ethical?
 Accuracy: What design will lead to accurate information?

Step-by-Step Evaluation Design


1. Engage stakeholders: Decide who needs to be part of the design and implementation of the
evaluation for it to make a difference.
2. Describe the program: Draw a “soup to nuts” picture of the program— activities and all
intended outcomes.
3. Focus the evaluation: Decide which evaluation questions are key.
Seeds of Steps 1-3 are harvested later:
4. Gather credible evidence: Write indicators, then choose and implement data collection sources and methods.
5. Justify conclusions: Review and interpret data/evidence to determine success or failure
6. Use lessons learned: Use evaluation results in a meaningful way.

Designing an Evaluation Plan


 STEP 1: Engage Stakeholders
 Who are the major stakeholders for our efforts?
 Where in this model do they want to see success?
 Who needs to be engaged up front to ensure use of results?
Positive Evaluation Model

 Evaluation is the systematic determination of the merit, worth, and significance of something or someone, using criteria against a set of standards
 Evaluation is to draw out; to assess; to compute an expression
 Evaluation is the process of making a judgment based on criteria and evidence
 Evaluation is the process of examining a subject and rating it based on its important features

Why Evaluate?
· Accountability
· Validating our hypothesis
· Comparison
· Knowing Status
· Knowing Needs
· Planning further
· Purpose:
· Measures the effectiveness of the instructor
· Measures effectiveness and impact in meeting objectives
· Provides feedback to students
· Provides students gratification and motivation

Quality Evaluation

Criteria: a defined standard against which a learner's achievements are measured

Good Quality Criteria

· Validity
· Reliability
· Reproducibility
· Sensitivity
· Specificity
Quality of Evaluation

· Quantitative Evaluation
  · Provides a quantifiable, objective measure
  · Expressed in proportions
  · Example: How many students have scored above 60%? (a small illustrative sketch follows this list)
· Qualitative Evaluation
  · Communicates general expectations
  · Expressed in grading
  · Open to interpretation
  · Example: What about his socio-economic status?
· Formative Evaluation
  · Ongoing evaluation during an instructional period
  · To learn the students' perceptions in comparison with the instructor's
· Summative Evaluation
  · Conducted at the end of the course
  · Purpose is to form a judgment about:
    · Performance of the student
    · Effectiveness of the instructor
    · Effectiveness of the course
  · Regularly scheduled at the end of academic terms
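
A minimal sketch of the contrast above, using hypothetical percentage scores (the data and grade boundaries are assumptions, not from the source): the quantitative result is a proportion, while the qualitative result is a grade band that remains open to interpretation.

# Hypothetical student percentage scores, for illustration only.
scores = [72, 45, 88, 61, 53, 90, 67, 39]

# Quantitative evaluation: express the result as a proportion.
above_60 = [s for s in scores if s > 60]
proportion = len(above_60) / len(scores)
print(f"{len(above_60)} of {len(scores)} students scored above 60% "
      f"({proportion:.0%})")

# Qualitative evaluation: express the result in grades (the grade
# boundaries here are assumptions, not a standard).
def grade(score):
    if score >= 80:
        return "A"
    if score >= 60:
        return "B"
    if score >= 40:
        return "C"
    return "D"

print([grade(s) for s in scores])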


Formative v/s Summative Evaluation

Quality Formative Summative

Purpose detect strengths & weakness Overall achievements

Frequency During or end of unit In end – point of certification,


promotion

Area covered One unit/no. of units Course content

Administrative Advisory, not always for Decisive, for permanent


utility permanent record record

Feedback to Done immediately Inform regarding pass or fail


students

Feedback to If significant no. shows error Overall pass or fail


faculty than weakness in instruction

Pre- and Post-Evaluation

 Evaluate at the beginning to assess needs

 Evaluate at the end to assess outcomes

 Assess the degree of achievement of objectives through pre-post evaluation.


Evaluation Questions Should Include Each Level of Evaluation

Levels of Evaluation
 Level I – Reaction
 How did the student react to the class?
 Level II – Learning
 What has the student learned?
 Level III – Skill
 How much did the student retain?
 Level IV – Impact
 What is the final impact or practical application of this learning?
Techniques of Evaluation
1) Teaching dossiers
2) Student ratings
3) Peer observations
4) Interviews
5) Portfolios
6) Classroom Assessment
· Evaluation should use a combination of strategies, to take advantage of their inherent strengths and to offset their individual limitations.
· Teaching Dossiers
  · Usually done at the end of a unit, in the form of a written document
  · Serves as a submitted assessment of students
· Student Ratings
  · Rating of teaching by students
· Peer Observations
  · Colleagues judge teaching quality by observation
  · Peer observation is usually useful for formative evaluation
· Interviews
  · Used in teaching award nominations
· Portfolios
  · Appropriateness of course goals and objectives
· Classroom Assessment
  · Effectiveness of teaching on learning
  · Can be used in a timely way to help instructors identify gaps
Methodologies
 Project Assignments
 Observation
 Simple Observation
 Think-aloud (let the examinee explain)
 Constructive Interaction
 Query via
 Interviews
 Focus Group Discussion (FGD)
 Questionnaires
Combinations of methodologies should be used

Summary of Methodologies
Styles of Questions
 Open-ended questions
 Closed
 Scalar
 Multi-choice
 Ranked
 Combining open-ended and closed questions

Open-ended questions
 Asks for unprompted opinions
 Good for general subjective information
 But difficult to analyse
e.g. Can you suggest any improvements to the present health care delivery system?

Closed questions
 Restricts the respondent’s responses by supplying alternative answers
 Can be easily analyzed
 But watch out for hard to interpret responses!
 Alternative answers should be very specific
Scalar
 Ask user to judge a specific statement on a numeric scale
 Scale usually corresponds with agreement or disagreement with a statement
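
As a hedged illustration (not from the source outline), scalar responses coded 1-5, where 5 means strong agreement with the statement, can be summarized with a mean and a distribution; the response data below are hypothetical.

# Hypothetical responses to a single scaled statement, coded 1-5
# (1 = strongly disagree, 5 = strongly agree).
from statistics import mean
from collections import Counter

responses = [4, 5, 3, 4, 2, 5, 4, 3, 5, 4]

print("Mean rating:", round(mean(responses), 2))
print("Distribution:", dict(sorted(Counter(responses).items())))
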
Multi-choice
 Respondent offered a choice of explicit responses
e.g. Which type of presentation is this?
a) Microsoft Word
b) Microsoft PowerPoint
c) Microsoft Excel
d) Microsoft Outlook
Ranked
 Respondent places an ordering on items in a list
 Useful to indicate a user’s preferences
e.g. Rank the usefulness of 108 services in the health sector.
a) Useful
b) Do not know
c) Of no use
Combining open-ended and closed questions
 Gets a specific response, but allows room for the user's opinion
e.g. Compare and comment on IMR in developing versus developed countries
Tests
 Written
 Questionnaire
 Practical
 Project Assignment, Case Study, Observation,
Interview, FGD
 Oral
 Interview, FGD
Written Tests
 Multiple Choice
 True/False
 Matching
 Completion or fill in the blank
 Essay
 Short Notes
 Long Essay
Multiple Choice Tests
 Common method
 Easy to grade and be objective
 Test construction miscues:
 Using information from previous questions
 Negatively worded stems
 Fill-in-the-blank in the middle of the stem
 Using "all of the above" or "none of the above"
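
Because multiple-choice items have a fixed key, they can be graded objectively; the sketch below is a hypothetical illustration of that scoring (the answer key and responses are invented).

# Hypothetical answer key and one student's responses; grading is a simple
# key comparison, which is what makes multiple-choice tests objective to score.
answer_key = {1: "b", 2: "d", 3: "a", 4: "c", 5: "b"}
student    = {1: "b", 2: "c", 3: "a", 4: "c", 5: "d"}

correct = sum(1 for q, ans in answer_key.items() if student.get(q) == ans)
print(f"Score: {correct}/{len(answer_key)} ({correct / len(answer_key):.0%})")
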
True/False Tests
 Limited to two answers, no gray area
 Difficult to construct in positive voice
 Avoid always or never statements
 Useful tool as a study guide
Matching Tests
 Works best with definitions and terms
 Difficult to design
 Be cautious of multiple matches
 Test directions must be clear
Completion Tests
 Fill in the blank
 Statements must be clear as to intent of question
 Need to be grammatically correct
 Be aware of size of blank
 Avoid having blank at beginning of sentence
Essay Tests
 May require long or short answer
 Time consuming and difficult to grade
 Recommended to grade in a group format
 Handwritten exams must allow sufficient time
Shotgun approach: students write as much information as possible in hopes of hitting the target.
Oral Exams
 Requires verbal answers by students
 Advantages
 Evaluate quick reaction of student
 Assesses the student thought process
 Disadvantages
 Limited number of students examined at one time
 Difficult to standardize
 Time consuming and labor intensive
 Unexpected distractions
 Unfair emphasis on repeated mistakes
Project Assignments
 Gets students working outside the class
 In groups, helps develop people skills
 Negatives
 Hard to standardize
 Potential plagiarism
 May measure only end product and not consider the process
Practical Exams
 Demonstration of a skill in the context of a scenario
 Demonstration of steps of performing a skill
