
G3658-1

Program Development
and Evaluation

Planning a Program Evaluation


Ellen Taylor-Powell
Sara Steele
Mohammad Douglah

February 1996
Acknowledgements
For their timely and thoughtful review of this
publication, the authors wish to thank
Phil Holman, Stephen Kluskens, Greg Lamb,
Joan LeFebvre, Dave Sprehn, Nancy
Stoutenborough, Jack Trzebiatowski and
Kathi Vos.
■ ■ ■ TABLE OF CONTENTS

Introduction
Focusing the evaluation
    What are you going to evaluate?
    What is the purpose of the evaluation?
    Who will use the evaluation? How will they use it?
    What questions will the evaluation seek to answer?
    What information do you need to answer the questions?
        Indicators
        Kinds of information: numerical and narrative
    When is the evaluation needed?
    What resources do you need—time, money, people?
Collecting the information
    What sources of information will you use?
    What data collection method(s) will you use?
    What collection procedures will you use?
Using the information
    How will the data be analyzed?
    How will the information be interpreted—by whom?
    How will the evaluation be communicated and shared?
Managing the evaluation
    Implementing the plan: timeline and responsibilities
    Budget
    Finalizing the plan
References
Appendices

Introduction

There is no blueprint or recipe for conducting a good evaluation. Because the term evaluation is subject to different interpretations, a program can be evaluated in a variety of ways. Many Extension professionals evaluate their programs informally on an ongoing basis through casual feedback and observation. The resulting information is often very useful and may be all that is needed to keep a program relevant and operating efficiently. You can enhance the value of information garnered from an evaluation, however, if you devote sufficient forethought and planning to the evaluation process. As the term is used in this guide, program evaluation refers to the thoughtful process of focusing on questions and topics of concern, collecting appropriate information, and then analyzing and interpreting the information for a specific use and purpose.

This guide is designed to help you plan a program evaluation. It is organized into four major sections:

■ Focusing the evaluation
■ Collecting the information
■ Using the information
■ Managing the evaluation

Each section presents a series of questions and considerations for you to adapt to your own needs and situation.

You'll discover that evaluation is more than just collecting information. It involves serious reflection on questions such as:

■ What is the purpose of the evaluation?
■ What do I want to know?
■ What do I intend to do with the information?

Answers to these questions are crucial if your evaluation is to produce useful information. This guide will help you think about and answer these and other questions as you plan a program evaluation.

Focusing the evaluation

What are you going to evaluate?

Define what you intend to evaluate. This may or may not be easy depending upon how clearly defined your program is. For example, you might wish to evaluate a financial management program. Think about the program's purpose and content. Do you want to examine the whole program or just a particular component of it? Briefly describe what you want to evaluate—purpose, expected outcomes, intended beneficiaries and activities. What is the planned link between Extension's inputs and the desired outcomes?

The evaluation effort should fit the programming effort. In some cases, you may only be interested in finding out how people responded to your teaching style, or how satisfied they were with a particular event. In other cases, you may want to document behavioral changes or impacts, which require a more comprehensive evaluation and level of effort. The point is to tailor your evaluation to fit the program. Don't expect to measure impact from a single workshop or behavioral changes from a limited media effort.

Remember, not all Extension work needs to be formally evaluated. Formal evaluations require time, money and resources. Sometimes programs lack sufficient substance to warrant a formal evaluation, or it may be too costly to collect the evidence needed to demonstrate impact. It may be possible that no one is interested in the findings.

We also need to consider our clientele. People get tired of filling out end-of-session forms or answering surveys. Be selective and considerate, and think about what is needed and what will be used.

What is the purpose of the evaluation?

The fundamental purpose of evaluation is to create greater understanding. Within Extension, program evaluations are conducted largely to improve educational efforts and to address accountability. In action, these purposes translate into more specific reasons for conducting an evaluation.

What is the purpose(s) of the evaluation you propose? For example, will it:

■ Help others (taxpayers, administrators, participants, colleagues, committee members) understand the program and its results?
■ Improve the program?
■ Improve your teaching?
■ Measure whether the Extension program made a difference in people's lives?
■ Determine if the program is worth the cost?
■ Answer questions posed by funders and influential members of the community?
■ Meet rank and tenure requirements?
■ Meet administrative requirements?
■ Other?

It is important to clearly articulate the evaluation's purpose. Otherwise, it will lack direction and the resulting information will not be as valuable as it could be.

Who will use the evaluation? How will they use it?

Have you ever completed an evaluation and asked, "What shall I do with it?" Or tallied the results but never really used the findings? If we want to collect relevant data and make the best use of limited resources, we must think about how we'll use our evaluation right from the start.

Sometimes, we conduct an evaluation only for our own use. Usually, however, there are others who have requested or could use the resulting information. Any of the groups listed below might be interested in the evaluation results of an Extension program.

■ People affected in some way by the program (either directly or indirectly), such as program participants, nonparticipants, critics
■ County board members, elected officials
■ Community leaders
■ Colleagues, volunteers, collaborators, supporters
■ Extension administrators
■ Media
■ Tenure committees
■ Grantors
■ Agencies, firms, interest groups

Identify potential users of the information. Find out what they want to know and how they will use the information (see table 1). If you don't know, ask. This will help you to clarify the purpose(s) of the evaluation, build commitment for it and fine-tune the particular questions the evaluation will address.

Table 1. Who wants to know what? How will the information be used?

Who might use the evaluation* | What do they want to know? | How will they use the results?
You | Is the program meeting clientele needs? Is my teaching effective? | To make decisions about modifying the program; to influence decisions about tenure or merit
County board | Who does the program serve? Is the program cost-effective? | To make decisions about budget allocations
Professional review committee | Are you an effective educator? | To make rank and promotion decisions
Extension administration | Has the program achieved its expected outcomes? How effective are the extension faculty? | To justify extension programs and ensure financial support; to decide about staff training and achievements
Clientele | Is the extension program meeting their needs? | To determine whether to participate in other extension programs

*Examples of broad user categories are listed here. Be as specific as possible when you identify potential users and their interests.

Involving others

As with program planning, involving intended users of the information in an evaluation leads to greater commitment to the evaluation process, helps ensure that relevant questions are asked, and increases the chances that findings are listened to and used.

User input may be included throughout the entire evaluation process or just at specific stages, such as when you set the evaluation's focus, determine the information needs, or collect and interpret data. In recent years, considerable emphasis has been placed on involving stakeholders as partners in the evaluation process to ensure that the information collected is relevant and that there is a commitment to use it.

Often, however, when we include users, it is as "helpers" or "data collectors," while we remain in control of the evaluation. Alternative approaches in the field of evaluation aim to change the center of control. Participatory approaches and empowerment evaluation enable people to conduct and use their own evaluations.

An appropriately constituted advisory group or a co-sponsor can be a strong asset. These parties can serve as advocates for the evaluation, see that tasks are completed and help make resources available. As a result, more people respond, the findings receive more attention, and the results are disseminated more widely.

What questions will the evaluation seek to answer?

Make a list of the questions and topics that you and the individuals or groups you have listed want to address. As you do so, review the program's elements. Sometimes programs change as they are implemented; sometimes not all the intended activities are carried out. Defining appropriate questions to be answered by an evaluation depends upon your knowledge of the program.

The following table lists some typical questions raised in Extension circles.

Table 2. Questions raised about extension programs

About outcomes/impacts
■ What do people do differently as a result of the program?
■ Who benefits and how?
■ Are participants satisfied with what they gain from the program?
■ Are the program's accomplishments worth the resources invested?
■ What do people learn, gain, accomplish?
■ What are the social, economic, environmental impacts (positive and negative) on people, communities, the environment?
■ What are the strengths and weaknesses of the program?
■ Which activities contribute most? Least?
■ What, if any, are unintended secondary or negative effects?
■ How well does the program respond to the initiating need?
■ How efficiently are clientele and agency resources being used?

About program implementation
■ What does the program consist of—activities, events?
■ What delivery methods are used?
■ Who actually carries out the program and how well do they do so?
■ Who participates in which activities? Does everyone have equal access?
■ What is Extension's role; the contributions of others?
■ What resources and inputs are invested?
■ How many volunteers are involved and what roles do they play?
■ Are the financial and staff resources adequate?

About program context
■ How well does the program fit in the local setting? With educational needs and learning styles of target audiences?
■ What in the socio-economic-political environment inhibits or contributes to program success?
■ What in the setting are givens and what can be changed?
■ Who else works on similar concerns? Is there duplication?
■ Who are cooperators and competitors?

About program need
■ What needs are appropriately addressed through Extension education?
■ What are the characteristics of the target population?
■ What assets in the local context and among target groups can be built upon?
■ What are current practices?
■ What changes do people see as possible or important?
■ Is a pilot effort appropriate?

As you think about the questions you want answered, it may help to review the Bennett hierarchy (see table 3). First used in the 1970s, it continues to be updated and used in Extension for planning and evaluation. The three lowest levels concern program implementation while the four upper levels deal with program results. The logic of the hierarchy is that in Extension programs we expend:

1. resources to conduct
2. activities intended to obtain
3. participation among targeted audiences.
4. Participants' reactions to program activities affect their
5. learning—knowledge, opinions, skills and aspirations. Through learning, people take
6. action which helps achieve
7. impact—social, economic, environmental change.

Table 3. Bennett's hierarchy of evidence for extension program evaluation

7. Impact—Social, economic, environmental conditions intended as end results, impacts or benefits of programs; public and private benefits.
6. Actions—Patterns of behavior and procedures, such as decisions taken, recommendations adopted, practices implemented, actions taken, technologies used, policies enacted.
5. Learning—Knowledge (awareness, understanding, mental abilities); opinions (outlooks, perspectives, viewpoints); skills (verbal or physical abilities); aspirations (ambitions, hopes).
4. Reactions—Degree of interest; feelings toward the program; positive or negative interest in topics addressed, acceptance of activity leaders, and attraction to educational methods of program activities.
3. Participation—Number of people reached; characteristics/diversity of people; frequency and intensity of contact/participation.
2. Activities—Events, educational methods used; subject matter taught; media work, promotional activities.
1. Resources—Staff and volunteer time; salaries; resources used: equipment, travel.

Source: Bennett and Rockwell, 1995. Targeting Outcomes of Programs (TOP); slightly modified.

Tips
■ Evidence of outcomes is stronger as you go up the hierarchy.
■ Difficulty and cost of obtaining evidence increase as you go up the hierarchy.
■ Evaluations are strengthened by showing evidence at several levels of the hierarchy.
■ Information from the lower levels helps to explain results at the upper levels, which are longer-term.

Admittedly, the Bennett hierarchy is a simplified representation of programs and does not indicate the role that larger social, economic and political environments play in extension program delivery. But using this hierarchy can help to describe a program's logic and expected links from inputs to end results. It can be useful in deciding what evidence to use and when. For example, a program may show evidence of accomplishments at the first five levels long before practices are changed, actions are taken or long-term community improvements are made.
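If you keep evaluation notes electronically, one convenient arrangement is to file each piece of evidence under its hierarchy level so that gaps at any level stand out. The following Python sketch is illustrative only and is not part of the original publication; the two example evidence entries are hypothetical.

    # A minimal sketch: filing evaluation evidence by Bennett hierarchy level.
    # The example entries below are hypothetical.
    BENNETT_LEVELS = [
        (1, "Resources"), (2, "Activities"), (3, "Participation"),
        (4, "Reactions"), (5, "Learning"), (6, "Actions"), (7, "Impact"),
    ]

    evidence = {name: [] for _, name in BENNETT_LEVELS}
    evidence["Participation"].append("32 producers attended all four sessions")
    evidence["Learning"].append("post-test: 80% correctly ranked nutrient practices")

    # Print top-down, as in table 3, flagging levels with no evidence yet.
    for number, name in reversed(BENNETT_LEVELS):
        notes = "; ".join(evidence[name]) or "(no evidence yet)"
        print(f"{number}. {name}: {notes}")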

Many Extension professionals are concerned with outcomes and with documenting evidence that their programs make a difference in people's lives. But understanding outcomes requires more than just documenting end results. When that is all we do, we are left with what has been called the "black box" approach to evaluation—we record outcomes but we don't know what led to them. For example, using a pre-test and post-test demonstrates that something happened (we hope)—not how or why, or the role that Extension played. What were the activities? What contribution did Extension make? What factors in the socio-economic context or implementation process influenced the outcomes? Identify those parts that need to be explored relative to your program and situation. At the minimum, documenting Extension's role and resource investments is critical to most evaluations.

Extension plans programs to have certain positive outcomes. However, unanticipated events may occur that result in positive, negative or neutral outcomes. For example, a program to develop a recreational center for youth may result in an increase in street litter and noise; or, an economic development program may result in new investors coming to town who displace local businesses. Or, there may be unexpected positive benefits which are as impressive or more impressive than the planned outcomes. Think about what some other effects of your program might be. Create an evaluation that will stay tuned to unexpected results.

Clarifying the evaluation question(s)

As you think about the questions that your evaluation will answer, it may be necessary to break a larger question into its component parts. This will help you fully answer the broader question and begin to identify the information you need to collect. Consider the following examples:

Main question: Who benefits from the program?
Sub-questions: Who actually participates in the program? At what level of involvement? Who else gains from the program? What do they gain? How do program participants compare to the county population in general? Who may be negatively affected? How?

Main question: Is the program duplicating other efforts?
Sub-questions: Of what does the program consist? What other similar programs exist—of what do they consist? How are aspects of these programs alike? Dissimilar? Complementary? What is our particular expertise/niche?

Main question: Did people learn the importance of X?
Sub-questions: Did people know anything about X before attending the program? Was the environment conducive to learning? Do any other programs or agencies promote the importance of X?

It will probably be necessary to prioritize the evaluation questions. Try to distinguish between what you need to know and what might be merely nice to know. Focus on the key questions that are most important. When prioritizing, consider time, resources and the availability of needed assistance compared to the value of the information that might be collected. As needed, bring stakeholders together and negotiate a practical list. Above all, keep the evaluation manageable. It is better to stay focused and answer a few questions well.

Define key terms

You may need to define certain terms. For example, "impact"—what does it mean? In terms of what? On whom? You may define impact from your perspective—what you wanted to have happen (look back at your desired outcomes). But also check in terms of what the participants themselves see as the impact. Our definitions and program participants' definitions are not necessarily the same.

Case example. Program staff defined and evaluated the outcome of a bilingual nutrition education program as knowledge gained (nutrition knowledge gained, proficiency in language). An evaluation showed little, if any, gain in knowledge. Upon further probing, it was found that the participants were very satisfied with the program. For them, it had been very successful because, at its conclusion, they were able to shop with greater confidence and ease, saving time. The staff's definitions of outcomes missed some important benefits as perceived by the participants.

What information do you need to answer the questions?

Once you've identified the key questions, you (and those you are involving in the evaluation) can begin the creative task of figuring out how to answer those questions. Not all pertinent information can be specified in advance, but much can and should be.

Indicators

An indicator expresses that which you wish to know or see. It answers the question, "How will I know it?" It is the indication or observable evidence of accomplishments, changes made, or progress achieved. Indicators are the measurements that answer your evaluation questions. Some examples are provided in table 4.

As we know, leadership is a complex phenomenon that can be displayed in many ways. Ask yourself (and others)—what does leadership mean? How would I recognize it if I saw it? If our evaluation seeks to document the impact our program has on developing leadership skills, we first must identify those actions which indicate that a person is demonstrating improved leadership. If our evaluation purpose is to determine the impact a youth development program has on developing capable youth, we first must define what we mean by capable youth and list the characteristics that identify them. These are the indicators you should use to measure the outcome.

Often, a number of indicators are needed to express an outcome more accurately. You will need to clearly define indicators that are appropriate to your program, or use those developed and tested elsewhere. In ideal practice, indicators are written during program planning. An example showing indicators for different levels of an extension program is found in Appendix A.

Table 4. Examples of indicators

Evaluation question: Has the expected change in leadership capabilities occurred?
Indicators—how I will know it:
• Ability to negotiate when group disagrees
• Improved listening skills
• Ability to maintain balance between process and task activities
• Anything else?

Evaluation question: Is water quality improving?
Indicators—how I will know it:
• Less pollution as a result of improved nutrient management
• Balanced species composition in lakes
• Anything else?

Evaluation question: Are young people learning to communicate effectively?
Indicators—how I will know it:
• Increased confidence in expressing ideas clearly
• Improved verbal and non-verbal communication skills
• Improved listening skills
• Anything else?

Evaluation question: Was the collaboration successful?
Indicators—how I will know it:
• Actions taken as a result of the collaboration
• Membership from each segment of the community affected
• Roles, rights and responsibilities clearly delineated
• Communication is open and frequent
• Anything else?

Remember, a program outcome may mean different things to different people. Therefore, the expression of that outcome may be different. For example, volunteers and youth who experience violence in their daily lives are likely to characterize capable differently than a rural 4-H club member. Hmong or Native American participants may designate different attributes for leadership than participants of Hispanic or European origin. Wherever possible, try to understand the meaning of the program and its outcomes from the participants' perspectives. Include those meanings as the indicators for measuring success.

Also, remember that when you collaborate with other agencies, they may use different indicators. The Department of Natural Resources, for example, might measure water quality in terms of bio-physical characteristics, the local Economic Development Group may measure healthy community in terms of number of jobs per capita, and the Feed Producers' Association may measure agricultural profitability as cost/return ratios.
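Where indicator lists like those in table 4 grow long, a simple electronic checklist can track which indicators have supporting evidence so far. Below is a minimal, hypothetical Python sketch (not from this publication); the question and indicator strings merely echo table 4, and the "observed" entry is made up.

    # Hypothetical sketch: pair an evaluation question with its indicators,
    # then report which indicators have supporting evidence so far.
    indicators = {
        "Are young people learning to communicate effectively?": [
            "increased confidence in expressing ideas clearly",
            "improved verbal and non-verbal communication skills",
            "improved listening skills",
        ],
    }

    # Indicators for which field notes or instruments show evidence (made up).
    observed = {"improved listening skills"}

    for question, needed in indicators.items():
        met = [i for i in needed if i in observed]
        print(question)
        print(f"  evidence on {len(met)} of {len(needed)} indicators: {met}")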

Kinds of information: numerical and narrative

Numerical and narrative data both serve a useful purpose in evaluation. The choice to use one or both types of information depends upon what you want to know about the program and who will receive the information. Ultimately, the value of the information depends upon how it is viewed by your audience. Most evaluations include a mix of numerical and narrative information. The exact mix depends upon the preferences and backgrounds of the people to whom you will communicate.

Think about what type of data is most likely to be understood and viewed as credible by those who will receive the evaluation.

■ Will your evaluation audience be impressed with numbers and statistics?
■ Will your evaluation audience be impressed with human interest stories and examples of real situations?
■ Will a combination of numbers and narrative information be valuable?

When is the evaluation needed?

Deadlines
When the information is needed and what you can manage to do within the timeline will influence the scope of your evaluation. You might decide to save some questions or concerns for another study, or to discard others as unnecessary or inconsequential. Try to develop a realistic timeline for completing the evaluation. Keep the components of your plan manageable so they can be handled well within those time limits.

Usable moments
You may not need evaluation information to meet a specific deadline, but there might be times when having such information would serve a valuable purpose. A few examples of such "usable moments" include important committee meetings, testimony in front of funders, pre-budget hearings, etc.

What resources do you need—time, money, people?

The resources you have available may influence your evaluation plan more than any other single factor. Even if you expect to integrate evaluation into the program, or if the evaluation will be conducted by volunteers or the participants themselves, you will need to allocate time for planning. Balancing your expectations (and those of others) with what is realistic and manageable is a challenge. You'll need to consider:

■ Time. Whose time and how much of it is available to work on evaluation? What priority will evaluation have in your overall workload? Involving volunteers or participants is a way to spread the workload, but it may require time for preparation or training.
■ Money. Some activities require financing. For example, what dollar resources are available to print questionnaires, pay for postage, reimburse participants, analyze the data?
■ Expertise. Sometimes we need outside expertise to help with certain parts of an evaluation. Constructing the instrument or analyzing the data may require such help. Or, there may be others with a lot of experience and knowledge related to the program from whom we could learn. Sometimes the involvement of an "outsider" increases the evaluation's credibility.

Collecting the information

Go back and review the questions you wish to ask and the information needed to answer them. Now is the time to think about how you will collect the information.

What sources of information will you use?

Existing information
Remember, you don't always have to go out and collect new data. Explore possible sources of existing information such as other agency records, previous evaluation reports, local service and business reports, WISPOP, the Census Bureau, etc. Printed material about the program (newspaper articles, annual reports and updates, participant logs and records) may be a valuable source of information.

People
The most common source of information is the program's participants and beneficiaries themselves. However, a range of potential "human" sources exists, including nonparticipants, proponents and critics; key informants (school principals, court judges, parents of participants, volunteer leaders, etc.—individuals who are likely to know something about the programs and their effects); program staff and collaborators; legislators; funders; and policy makers. Be sure that the people you identify can actually provide the information you are seeking.

Observations
An underused, but powerful, source of information is the direct observation of program events, activities and results.

Pictorial records
Another powerful source of information is any pictorial record that shows program activities and effects documented in photos, charts, videotapes, and maps.

What data collection method(s) will you use?

Method
When you think about the type of method to use for collecting the information, consider:

■ Which method is most likely to secure the information?
■ Which method is most appropriate given the values, understandings and capabilities of those who are being asked to provide the information?
■ Which method is least disruptive to your program and to your clientele? Asking questions can be intrusive, time-consuming and/or anxiety-provoking.
■ Which method can you afford and handle well?

The best way to collect data often depends upon an understanding of the social, cultural and political environment.

■ Some participants may not feel comfortable responding over the telephone or in a written format. You will need cultural sensitivity to link an appropriate data collection technique with diverse respondents.
■ If, in the past, only 20% responded to your mail survey, you will need to decide whether that is an adequate representation. Can you get more useful information from another method?

In Extension, we've tended to rely on surveys, tests and end-of-session questionnaires. Currently, focus group interviews are gaining popularity. Altogether there are a variety of techniques from which to choose (see table 5). Select the method which suits your purpose—don't let the method determine your approach. Be creative and experiment with various techniques.

Table 5. Common types of data collection methods used in evaluations of extension programs

Survey
Interview
Test
Observation
Group techniques
Case study
Photograph, videotape, slides
Document review and analysis
Portfolio review
Testimonials
Expert or peer review
Simulated problem or situation
Journal, log, diary
Unobtrusive measures

Multiple data collection methods. Using two or more methods provides a more thorough account and cross-validates your findings. This may not be necessary if the evaluation's purpose is narrow, you have few resources or you expect limited use of your findings.

Instrumentation

In most evaluations, there is some sort of form or device for compiling information, such as a recording sheet, a questionnaire, a video or audio tape. Think about the method you have chosen and decide what is needed to record the information. If a questionnaire or recording sheet is used, check to ensure that it

■ will secure the information you want;
■ will be understood by the respondent and the recorder;
■ will be simple and easy to follow;
■ will be culturally sensitive.

Conduct a pilot test of the questionnaire or recording form with people similar to your proposed respondents or recorders. Avoid the temptation to use only office colleagues to pretest evaluation instruments and forms. They may understand what you intended when respondents will not. Pilot the instrument and then discuss with the pilot respondents any uncertainties they might have had. This provides a chance to eliminate potential problems.

What collection procedures will you use?

When will the data be collected?

■ Before and after the program. Examples: pre- and post-measures (see the sketch after this list).
■ At one time. Examples: a single survey; an end-of-program assessment; a group debate.
■ At various times during the course of the program. Examples: a panel survey; a mix of methods implemented at various times; at three and six months.
■ Continuously throughout the program. Example: logs kept by youth throughout a weekend camping experience.
■ Over time. Example: a longitudinal survey that documents practices over several years.
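As a worked illustration of the pre- and post-measure option above, the short Python sketch below pairs each participant's scores and summarizes the gains. It is hypothetical and not part of the original publication; the scores and participant IDs are made up.

    # A minimal sketch with made-up data: matched pre- and post-test scores,
    # paired by participant ID, summarized as gains.
    pre  = {"P01": 4, "P02": 6, "P03": 5}   # knowledge scores before the program
    post = {"P01": 7, "P02": 8, "P03": 5}   # scores after the program

    gains = [post[pid] - pre[pid] for pid in pre]
    mean_gain = sum(gains) / len(gains)
    improved = sum(g > 0 for g in gains)
    print(f"Mean gain: {mean_gain:.1f} points; "
          f"{improved} of {len(gains)} participants improved")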
Will a sample be used?

Whether to sample or not depends upon the purpose of the evaluation, the size of the population, and the method used. For example, it may be possible to administer a post-test to all 100 participants in a workshop to ascertain their level of learning and satisfaction. But, if you want interview data from these same participants, you may not be able to interview all 100. And, if you want to generalize to the whole group, you will need a probability sample. Sampling also depends upon the number of people in your program. If your program is a focused effort working with 20 at-risk teenagers, you will probably want to include all 20 in your evaluation.

We tend to think of sampling in terms of people. One can also take a sample of documents, program sites or locations. For example, rather than collecting information from all the county 4-H clubs, you may wish to focus your resources and take a random sample of clubs. Or, you may want to stratify your sample by age, club activity, or location. To do so will require particular attention to your sample size and selection. In other cases, you may wish to learn about select groups without needing to generalize to all the groups. Then, a nonprobability sample is appropriate. Consider the kind of sample and size that will be most credible to those whose attention to the findings you want.

Note: Some professional evaluators argue that it is better to sample and use several data collection techniques in an evaluation than to expend all your resources on collecting data from the entire population using a single instrument. Also, political concerns may need to be considered. For example, political officials or legislators may only see the evaluation as credible if it includes their district or a large number of respondents.
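For drawing the kinds of samples discussed above, a few lines of Python using the standard library's random module are often enough. This is a minimal sketch, not from the publication; the participant and club names are hypothetical placeholders and the sample sizes are arbitrary.

    import random

    random.seed(42)  # fixed seed so the draw can be reproduced

    # Simple random sample: 25 interviewees from 100 workshop participants.
    participants = [f"participant_{i:03d}" for i in range(1, 101)]
    interviewees = random.sample(participants, k=25)

    # Stratified sample: up to two clubs from each location stratum.
    clubs = {"north": ["Club A", "Club B", "Club C"],
             "south": ["Club D", "Club E"]}
    stratified = {loc: random.sample(members, k=min(2, len(members)))
                  for loc, members in clubs.items()}

    print(interviewees[:5])
    print(stratified)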
Who will collect the data?

You may be the only one collecting information, but more and more frequently, others are also involved in evaluation—particularly in data collection. Training or support may be needed to help them do their job.

What is the schedule for data collection?

■ When will the information be available?
■ When can the information be conveniently collected? When will it be least disruptive?
■ Where will the information collection take place?
■ When will data collection start and end?

Consider your respondents. Convenient times and places are likely to differ depending upon whether your respondents are farmers, business owners, men, women, single parents, school teachers, or some other group. Likewise, there may be culturally appropriate meeting times and locations.

A sample worksheet in Appendix B covers the "information" aspects of planning an evaluation.

Using the information

Evaluation involves more than just collecting information. The information must be organized and presented in a way that permits people to understand it.

How will the data be analyzed?

Organizing, tabulating and analyzing your data to permit meaningful interpretation takes time and effort—often, more time and effort than you'd expect. Factor this in when you design your evaluation. If resources are limited, you may want to structure your evaluation to be smaller, rather than larger.

The aim of data analysis is to synthesize information to make sense out of it. Different techniques are appropriate depending upon whether you have qualitative (narrative, natural language) or quantitative (numerical) data.

Consider such questions as:

■ How will responses be organized/tabulated? By hand? By computer? (A small sketch follows this list.)
■ Do you need separate tabulations from different locations or groups?
■ What, if any, statistical techniques will be used?
■ How will narrative data be analyzed?
■ Who will organize and analyze the information?
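If responses are tabulated by computer, the Python standard library is often all that simple counts require. Below is a minimal, hypothetical sketch (the survey records and county names are made up) showing an overall tally and separate tabulations by location, as the checklist above suggests.

    from collections import Counter, defaultdict

    # Made-up survey records: one closed-ended item, tagged by county.
    responses = [
        {"county": "Dane", "tested_well": "yes"},
        {"county": "Dane", "tested_well": "no"},
        {"county": "Polk", "tested_well": "yes"},
    ]

    overall = Counter(r["tested_well"] for r in responses)   # whole group
    by_county = defaultdict(Counter)                         # per location
    for r in responses:
        by_county[r["county"]][r["tested_well"]] += 1

    print("Overall:", dict(overall))
    for county, counts in sorted(by_county.items()):
        print(county, dict(counts))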

How will the information be interpreted—by whom?

Interpretation is the process of attaching meaning to the analyzed data. Too often we analyze data but fail to take the next step to put the results in context and draw conclusions. For example, what does it mean that 45% of the respondents reported having their wells tested? Is this greater or less than last year? Is this number high or low for X county? What does it mean in terms of health and safety? What, if anything, should be done next?

Numbers do not speak for themselves. They need to be interpreted based on careful and fair judgements. Similarly, narrative statements need interpretation.

Who should be involved in interpreting the results of the data analysis?
The same information can be interpreted in various ways. As the program director, you may have your own perspective. Others will look at the data through different eyes. Greater understanding usually results when we involve others or take time to hear how different people interpret the same information. One way to do that is through meetings with small groups to discuss the data. Think about including program participants when discussing the meaning of the information.

What is the base for interpreting the data?
Consider how you will make sense of the results. To what will the information be compared: findings from other evaluations? Baseline data? Initiating evidence of need? Pre-defined standards of expected performance—"what should be"? (A small sketch of such comparisons follows the list below.)

Who sets the basis for comparison?
■ Expert/professional judgements
■ Program personnel judgements
■ Participants' judgements
■ Existing research; experience from similar programs
■ Your own personal judgement
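As a worked version of the well-testing example above, the sketch below compares a current result to a baseline and to a pre-defined standard. It is illustrative only; all three figures are hypothetical.

    # Hypothetical figures echoing the well-testing example: compare this
    # year's rate to last year's baseline and to a pre-defined standard.
    current, baseline, standard = 0.45, 0.38, 0.50

    change = current - baseline
    print(f"Wells tested: {current:.0%} ({change:+.0%} vs. last year)")
    print("Meets the standard" if current >= standard
          else f"Below the {standard:.0%} standard")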
What are the conclusions and recommendations?

Summarize the three to five main points that you feel are most important from the evaluation—the points you really want to remember and have other people remember. As appropriate, provide recommendations that you feel follow from these findings.

What did we learn? What will we do differently?

If we agree that the underlying purpose of any evaluation is to promote understanding and learning about extension programs, then the ultimate result is to articulate what we learned—about the program, about our professional competencies, about the process of the evaluation. What will we do as a result of these insights? Often, it is useful to lay out an action plan. When conducting the evaluation in collaboration with others—such as a community group, a farmers' association or a 4-H club—developing an action plan helps ensure the results are used.

How will the evaluation be communicated and shared?

To whom?
Look back at who was identified early on as a key user. Target key decision makers with appropriate and hard-hitting information. Share with your colleagues who may need to conduct a similar evaluation. Is there anyone else who might, or should be, interested in the evaluation results?

Remember to communicate your findings to the respondents who participated in your evaluation. Not only is this courteous, but it helps to ensure their cooperation in future work.

How?
You have expended time and resources in conducting your evaluation. Now, you need to maximize your investment in the project. Think about other ways you might get some mileage from your effort. Remember that citing a finding or two in informal conversations may have more influence than a formal report.

Communication methods you use will depend upon your audience. A variety of possibilities exist, such as:

■ A written report
■ Short summary statements
■ Film or videotape
■ Pictures, photo essays, wall charts, bulletin boards, displays
■ Slide-tape presentations
■ Graphs and visuals
■ Media releases
■ Internet postings

Invite your audiences to suggest ways they'd like to receive the information; for example, dates when it would be most valuable; useful formats; effective displays or graphs; other recommendations that would maximize its use.

Managing the evaluation

Implementing the plan: timeline and responsibilities

There are various ways to lay out responsibilities and a timeline for managing an evaluation. Construct your own or see the examples in Appendices C and D. You may wish to post the chart in Appendix C in an obvious location where all involved can see it.

Budget

If necessary, establish a budget to cover costs such as printing or duplicating (data collection instrument, reports), communications (postage, telephone calls), incentives or rewards to respondents, providing a meal or other reimbursement for group participants, making or editing a videotape, travel and per diem costs, data processing, or consultants' fees.
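A budget like the one described above is simply line items summed to a total; a spreadsheet works, and so does a short script. The sketch below is hypothetical; the categories and dollar amounts are made up for illustration.

    # Hypothetical evaluation budget: line items summed to a total.
    budget = {
        "printing questionnaires":   120.00,
        "postage (two mailings)":    264.00,
        "respondent incentives":     200.00,
        "data entry and processing": 350.00,
    }
    for item, cost in budget.items():
        print(f"{item:<28} ${cost:>8.2f}")
    print(f"{'Total':<28} ${sum(budget.values()):>8.2f}")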
Finalizing the plan

■ Assess the feasibility of carrying out your plan.
■ Do you foresee any barriers or obstacles?
■ Refine and revise as necessary.

References

Bennett, C. and K. Rockwell. 1995. Targeting Outcomes of Programs (TOP). USDA, draft document.

Brinkerhoff, R.O., D. Brethower, T. Hluchyj and J. Nowakowski. 1983. Program Evaluation: A Practitioner's Guide for Trainers and Educators: Sourcebook, Casebook. Boston: Kluwer-Nijhoff.

Clouthier, D., B. Lilley, D. Phillips, B. Weber and D. Sanderson. 1987. A Guide to Program Evaluation and Reporting. University of Maine Cooperative Extension Service.

Froke, Barbara. 1980. The Answers to Program Evaluation: A Workbook. Adapted by Spiegel and Leeds, August 1992. Ohio Cooperative Extension Service, Ohio State University.

Herman, Joan, Lynn Morris and Carol Fitz-Gibbon. 1987. Evaluator's Handbook. Newbury Park, CA: Sage Publications.

Patton, M.Q. 1982. Practical Evaluation. Beverly Hills, CA: Sage Publications.

Rockwell, S. Kay. 1993. Program Evaluation in Adult Education and Training. Student Manual. Satellite program consisting of 15 modules. University of Nebraska–Lincoln and Agriculture Satellite Corporation.

Stecher, B.M. and W.A. Davis. 1987. How to Focus an Evaluation. Newbury Park, CA: Sage Publications.

Worthen, B.R. and James R. Sanders. 1987. Educational Evaluation: Alternative Approaches and Practical Guidelines. New York: Longman.
Appendix A

Example showing indicators for different levels in an extension program.

7. Impact
Expected achievements: Town makes development decision based on good planning techniques.
Indicators: Need for zoning clarified. Town adopts comprehensive plan. Effective moratorium on condos. Citizen satisfaction.

6. Actions
Expected achievements: Individuals use skills or knowledge; board works better with other town departments; new relationships are formed.
Indicators: Planning board creates and proposes comprehensive plan, functions as a cohesive unit, schedules sessions with decision makers; new working relationships evolve.

5. Learning
Expected achievements: Acquire sufficient knowledge about planning and skills with group process. Develop positive attitudes about planning. Choose to act.
Indicators: Community planning techniques learned. Group functioning understood. Process noted informally among members and Extension staff.

4. Reactions
Expected achievements: Group members maintain level of interest and accept leadership responsibilities. Extension staff role is appropriate.
Indicators: Progress is made, deadlines kept. Board takes initiative in planning process. Group is satisfied with progress and Extension's role.

3. Participation
Expected achievements: Appropriate people are involved as members and as technical resources.
Indicators: Broad-based representation; each member accepts part of the work; appropriate resource people (technical and key community leaders) take part.

2. Activities
Expected achievements: Needs assessment; facilitate prioritization process; four meetings.
Indicators: Needs assessment completed. Problems defined and written. Objectives and priorities set.

1. Inputs
Expected achievements: Volunteer/citizen participation; 100 hours Extension time; specialist input.
Indicators: Public or official support and sanctions; agreement (contract) made between group and Extension staff; group membership established; contract formed if necessary.
Appendix B

Sample worksheet for planning the "information" aspects of an evaluation. Developed by Rockwell, 1993, Module 8.3, based on a worksheet for summarizing an evaluation plan from Worthen & Sanders, 1987 (p. 244).

Worksheet columns:
Evaluation questions | Information required | Information source | Method for collecting information | Information collection arrangements (by whom, conditions, when) | Analysis procedures | Interpretation procedures and criteria | Reporting of information (to whom, how, when)
Appendix C

Evaluation schedule and input requirements. Worksheet columns:
Activity/Task | Person responsible | Dates (start, finish) | Inputs needed to accomplish task
Appendix D

The Gantt chart

A Gantt chart is a simple display that includes proportionate, chronologically scaled timeframes for each evaluation task. The chart provides an overview of the entire evaluation process that illustrates when evaluation activities will begin and how long each will continue.

The vertical axis lists the tasks to be completed. The horizontal axis shows a time scale. A horizontal line is drawn for each task to show how long it will take. Milestones (important interim deadlines) are keyed with symbols.

Uses
1. Communicates the evaluation plan to a non-technical audience; therefore it is useful in proposals or reports.
2. Helps in time management, since it forces one to examine the length of time each project task will require, to contemplate the overlap between tasks, and to establish a realistic timeframe for the entire project.

A Gantt chart reflects the evaluator's planning. Although it is easy to prepare, it is useful only when all evaluation steps are accounted for within realistic time frames. One must allow sufficient time for each step in the evaluation process—starting from focusing the evaluation through the final report.

Source: Rockwell, 1993. Module 8.9, based on Worthen and Sanders, 1987, pp. 256–257.

Figure 1. Example of a Gantt chart. The vertical axis lists eight tasks: 1. develop instrument; 2. test instrument & revise; 3. select sample; 4. prepare for mailing; 5. data collection; 6. data analysis; 7. draft report; 8. final report. The horizontal axis shows time in weeks (1–18), and a symbol marks each milestone when a product is available.
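A chart like Figure 1 can be drawn by hand or produced programmatically. The Python sketch below is one hypothetical way to do it, assuming the matplotlib plotting library is available; the task names follow Figure 1, but the start weeks and durations are made up.

    import matplotlib.pyplot as plt

    # (task, start week, duration in weeks) -- the timings are hypothetical.
    tasks = [
        ("Develop instrument", 1, 3), ("Test instrument & revise", 3, 2),
        ("Select sample", 4, 1), ("Prepare for mailing", 5, 2),
        ("Data collection", 6, 5), ("Data analysis", 10, 4),
        ("Draft report", 13, 3), ("Final report", 16, 2),
    ]

    fig, ax = plt.subplots()
    for row, (name, start, weeks) in enumerate(reversed(tasks)):
        ax.barh(row, weeks, left=start, height=0.4)  # one bar per task
    ax.set_yticks(range(len(tasks)))
    ax.set_yticklabels([name for name, _, _ in reversed(tasks)])
    ax.set_xlabel("Time (weeks)")
    plt.tight_layout()
    plt.show()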

Authors: Ellen Taylor-Powell is a program development and evaluation specialist for Cooperative Extension, University of Wisconsin–Extension. Sara Steele is a professor of continuing and vocational education at the University of Wisconsin–Madison.

An EEO/Affirmative Action employer, University of Wisconsin–Extension provides equal opportunities in employment and programming, including Title IX and ADA requirements. Requests for reasonable accommodation for disabilities or limitations should be made prior to the date of the program or activity for which they are needed. Publications are available in alternative formats upon request. Please make such requests as early as possible by contacting your county Extension office so proper arrangements can be made.

© 1996 by the Board of Regents of the University of Wisconsin System doing business as the division of Cooperative Extension of the University of Wisconsin–Extension. Send inquiries about copyright permission to: Director, Cooperative Extension Publications, 201 Hiram Smith Hall, 1545 Observatory Dr., Madison, WI 53706.
This publication is available from:
Cooperative Extension Publications
Room 170, 630 W. Mifflin Street
Madison, WI 53703
Phone: (608)262-3346
G3658-1 Program Development and Evaluation, Planning a Program Evaluation
G3658-1W

Program Development
and Evaluation

Planning a Program Evaluation:


Worksheet
Ellen Taylor-Powell
Sara Steele
Mohammad Douglah

February 1996

Focusing an evaluation
1. What are you going to evaluate?

2. What is the purpose of the evaluation?

3. Who will use the evaluation? How will they use it?

Who/users | How will they use the information?

How may others be involved in the evaluation? __________________________________________

4. What questions will the evaluation seek to answer?

5. What information do you need to answer the questions?

What I wish to know | Indicators—How I will know it?

6. When is the evaluation needed? ____________________________________________________________

7. What resources do you need?


a. Time available to work on evaluation: _____________________________________________
_______________________________________________________________________________
b. Money: _______________________________________________________________________
_______________________________________________________________________________
c. People—professional, paraprofessional, volunteers, participants: _____________________
_______________________________________________________________________________

Collecting the information

8. What sources of information will you use?

Existing information: ____________________________________________________________


People: _________________________________________________________________________
Observations: ___________________________________________________________________
Pictorial records: ________________________________________________________________

9. What data collection method(s) will you use?

■ Survey ■ Document review

■ Interview ■ Testimonials

■ Observation ■ Expert panel

■ Group techniques ■ Simulated problems or situations

■ Case study ■ Journal, log, diary

■ Tests ■ Unobtrusive measures

■ Photos, videos ■ Other (list) ___________________________

Instrumentation: What is needed to record the information?

10. What data collection procedures will be used?


When will you collect data for each method you've chosen?

Method | Before program | During program | Immediately after | Later

Will a sample be used?________________________________________________________________


No ■
Yes ■ If yes, describe the procedure you will use. ___________________________________
_________________________________________________________________________________
_________________________________________________________________________________
Who will collect the data? _____________________________________________________________

Using the information

11. How will the data be analyzed?

Data analysis methods: ________________________________________________________________

Who is responsible: ___________________________________________________________________

12. How will the information be interpreted—by whom?

Who will do the summary?_____________________________________________________________

13. How will the evaluation be communicated and shared?

To whom | When/where/how to present

Managing the evaluation


14. Implementation plan: timeline and responsibilities

Management chart_________________________________________________________________
Budget ___________________________________________________________________________

University of Wisconsin–Extension ■ Cooperative Extension


630 W. Mifflin Street, Room 170, Madison, WI 53703, Phone: 608/262-2169, fax: 608/262-9166
February 1996
