Program Evaluation Plan


PROGRAM ANALYSIS

Edward Williams
Walden University
Tutor: Dr. Michael Burke
Instructional Design (EDUC-6130-1)

May 22, 2016

The name and type of organization that runs the program


The program I chose to evaluate is an Action Plan implemented by the Ministry of Education in 2014. Because of poor student performance in Mathematics and Language, Caribbean leaders and the OECS Education Reform Unit called on member countries to develop Action Plans to remedy the situation nationwide. The plan spans a three-year period and is still ongoing.
A description of the program: its goals/objectives, operations, outcomes, and performance history
The Action Plan was developed based on school assessment data that revealed very unsatisfactory student performance in Mathematics and English. Although training was conducted with principals and teachers, there was very little improvement; this, together with the steady decline in student performance, led to the formulation of the Action Plan. The goal is to ensure effective teaching of Mathematics and Language throughout the education system. The intended outcome of the plan is to raise students' performance in Mathematics and Language while charting the way forward for the education sector.
The Action Plan was developed and implemented in 2014. To date there have been some successes as well as some challenges.
A brief history of the program: how it started and how it is currently perceived
The program was developed and introduced by the Ministry of Education in Grenada. The Minister, together with the Chief Education Officer, presented the plan to principals. The principals were then charged with presenting and promoting it to their staff in the hope that they would actually implement it. Initially, principals and teachers did not really endorse the plan because they felt left out, having had no part in the planning process. Their perception has since changed, as they are now more involved in the implementation process.
The stakeholders involved in the program and their interests. Explain how you arrived at
these conclusions
The stakeholders involved are the Caribbean leaders and the OECS Education Reform Unit, who recognized the problem and called on countries to develop an Action Plan to remedy it. The Ministry of Education, together with its team (Minister, Chief Education Officer (CEO), Permanent Secretary, Curriculum Officers, Examination Unit, and District Education Officers), developed and implemented a plan to directly address the identified needs. Principals, together with the district education officers, are to monitor and supervise teachers as the plan unfolds.
Parents and the community are the support pillars for the students, encouraging them to participate in their learning both in and out of the school building. The media's role is to sensitize the public to the situation and help them understand why it is important for them to participate in the process. I arrived at these conclusions based on the roles and responsibilities highlighted within the program.
The contextual factors that impact the program, including the political environment and
other interpersonal dynamics within the organization
Some of the contextual factors impacting the program are the quality of instruction, the level of community support, and the framework teachers and principals have to develop for students to learn mathematics. It was left to individual schools to develop their own frameworks to address the issue, with little support from the Ministry of Education.

Potential ethical challenges involved in an evaluation of this program


Some of the potential ethical challenges involved in this program are:

- The Minister has already boasted of major successes of the program even before it is completed.
- Key stakeholders such as teachers and principals were left out of the planning process.
- Teachers and principals felt that the program was not theirs because it was imposed on them, despite its good intentions.
- Stakeholders such as principals are fearful of presenting the true findings of the program because it may expose their school as a failure.
- There was no clear formal agreement with the various stakeholders.
- Poor communication among the various stakeholders.

EVALUATIVE CRITERIA AND MODEL

Edward Williams
Walden University
Tutor: Dr. Michael Burke
Instructional Design (EDUC-6130-1)

June 5, 2016

Proposed Evaluation Questions


1. Do you think that you were adequately prepared to implement the program?
2. To what extent are teachers receiving the required training, support and supervision?
3. Did the students and teachers change their (attitude, behavior, or performance) after
program completion?
4. Did students' understanding of and performance in Mathematics and Language Arts increase after program completion?
5. Which teaching methods do you think made the greatest impact on teaching and learning?
Rationale
These questions focus on implementation and impact because the program was introduced to help students improve their performance in Mathematics and Language Arts. How well the program is implemented will directly affect its impact and outcomes. Understanding how the program is implemented, and the factors that may affect the outcome one way or another, gives insight into why the program succeeded or failed. The quality of the teacher training was not assessed because it is expected that the training was conducted by experts and that most teachers would have prior years of experience and knowledge in the subject areas. Media coverage was also not assessed because the Minister of Education would use his influence to publicize and promote the program.
Stakeholders Involvement

There are various stakeholders involved in the Action Plan, such as curriculum officers, education officers, teachers, principals, parents, students, the examination unit, and district education officers. At least one person from each stakeholder group, along with the evaluator, should be involved in determining the evaluation questions because each group has a primary interest in the program. In addition, each person brings a perspective, experience, and areas of expertise that give deeper understanding of, and insight into, the evaluation of the program. "The sponsor of the evaluation, key audience, and individuals or groups who will be affected by the evaluation should all have a voice" (Fitzpatrick, Sanders, & Worthen, 2011, p. 329).
Stakeholders can work alongside the evaluator in proposing, developing, and editing the evaluative criteria. For example, through negotiation the Minister can add or remove selected questions, and education officers can increase or reduce the scope of the study, while the evaluator may stand by and defend his or her perspective. The process can be challenging, but in the end it gives stakeholders a sense of ownership, making it easier to adopt and utilize the findings and outcomes of the evaluation.

Evaluation Model

EXPERTISE- AND CONSUMER-ORIENTED APPROACHES

Advantages:
- The expertise-oriented approach is applicable to a wide cross-section of evaluations. Since this model depends on professional expertise, decisions are made by those who are experienced, fully informed, and working to set standards, and improvement can be encouraged through self-study (Ayers, 2014).
- The consumer-oriented approach focuses on consumers' information needs and is concerned with cost-effectiveness (Fitzpatrick, Sanders, & Worthen, 2011). Ayers (2014) highlights that valuable information, when given to persons who have limited time to study, can improve consumers' knowledge of suitable criteria when selecting programs or products.

Disadvantages:
- The expertise-oriented approach can be biased because the evaluation lies in the hands of the expert; it raises questions of reliability and replicability, often has scarce supporting documentation for its conclusions, and is open to conflicts of interest (Fitzpatrick, Sanders, & Worthen, 2011).
- The consumer-oriented approach can increase product cost, rigorous testing can crimp originality, and there is less local initiative because of reliance on outside consumer services (Ayers, 2014). Fitzpatrick, Sanders, & Worthen (2011) also note that it can lack sponsors or funders and is not open to debate or cross-examination.

PROGRAM-ORIENTED EVALUATION APPROACHES

Advantages:
- The link between the program creators and the research literature is encouraged. It assists in explaining the outcomes of the program and avoids the "black box" (unknowns about outcomes); explanation of program outcomes is emphasized (Fitzpatrick, Sanders, & Worthen, 2011).

Disadvantages:
- There may not be enough concern for the stakeholders and too much focus on the research, which can lead to a possible overemphasis on the results (Fitzpatrick, Sanders, & Worthen, 2011).

DECISION-ORIENTED EVALUATION APPROACHES

Advantages:
- Since key decision makers are involved in the process, evaluation results and recommendations are more likely to be used.
- Key stakeholder groups (decision makers and end users) are given in-depth consideration.
- Because all decision-making approaches are utility oriented to a certain degree, the utility standard aligns well with the Joint Committee standards.
- During the evaluation the evaluator is mainly in control.

Disadvantages:
- Similar to the objective-oriented approach, unintended outcomes might go unnoticed.
- The evaluation process lacks input from a broad cross-section of stakeholder groups; more of them should be considered in the process. A narrower group of stakeholders is involved in the evaluation (typically the key decision makers and/or key end users).
- Mainly the top decision makers and management are given partiality and power. As a result, when the evaluation is being conducted, the stakeholders with less power might not be considered.
- There might be a faster turnover of primary decision makers, and turnover of the intended end users of the evaluation results might occur.
- It assumes decisions can be made or determined in advance; however, at times this is not the case, and the decisions, needs, or problems emerge from a needs assessment or during the evaluation process.

PARTICIPANT-ORIENTED EVALUATION APPROACHES

Advantages:
- The human element is emphasized, new insights and theories are gained, and the approach offers flexibility and attention to contextual variables. Multiple data collection methods are encouraged; rich, persuasive information is encouraged; and it establishes dialogue with, and empowers, quiet or powerless stakeholders.

Disadvantages:
- It is too complex for practitioners (more for theorists), has a political element, and can be subjective, with loose evaluations. It is labor intensive, which limits the number of cases studied, increases cost, and objectivity can be lost by the evaluators.

Explain your choice of model for your program evaluation:


The model I will use for my program evaluation is the decision-oriented approach. A decision-oriented evaluation model is used when there is a problem within an organization and a decision has to be made. The quality of the teaching and learning of Mathematics and Language Arts needs improving, and since the principals (instructional leaders) and teachers are within the institution, they can use the information collected to make more informed decisions.

This model is superior to the other models for my case study because the Context, Input, Process, and Product (CIPP) evaluation model would assist in identifying teaching and learning needs. Zhang et al. (2011) said it is evident that the CIPP evaluation model has unique features that can help effectively address educational programs. They went on to say that these unique features include context evaluation, ongoing process evaluation, and the model's emphasis on engaging participants in the evaluation process. This model can help "provide principals and teachers with a strong voice in planning, implementing, and evaluating teaching-learning experiences; engage participants in an ongoing process to assess the quality of implementation and progress toward meeting specified goals; and use evaluation results for improvement and sustainability" (p. 68).

Teachers are considered experts in their own right, but the expertise-oriented approach would not do the Action Plan justice because the focus is not mainly on quality but rather on delivery and implementation. The consumer-oriented approach focuses on the consumer gaining knowledge to make a final decision, and this is not the case with the students.

How I Intend to Use It

The context evaluation stage would help identify the needs and problems that exist and opportunities for addressing them. I will interview principals, teachers, and other stakeholders closely associated with the program.

The input component suggests strategies that can address the problems and needs identified. At this stage I will review the relevant literature available and consult specialists/experts.

The process component monitors the process, identifies areas needing adjustment, and makes modifications, thus leading to improvement. I will identify the activities that should be monitored, review teachers' records, provide feedback and debriefings to classroom teachers and principals, and interview students.

The product component interprets the information collected and judges its merit, worth, and significance. It also measures and judges project outcomes. I will review pre- and post-test/assessment records and interview stakeholders.
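To illustrate how the product-component review of pre- and post-assessment records might be summarized, the following is a minimal sketch in Python. The file name (assessment_records.csv), column names, and pass mark of 50 are hypothetical assumptions introduced only for illustration; they are not part of the Action Plan or its data systems.

```python
# Illustrative sketch: summarize pre/post assessment records for the product component.
# Assumes a CSV with hypothetical columns: student_id, subject, pre_score, post_score.
import csv
from collections import defaultdict

def summarize_gains(path="assessment_records.csv", pass_mark=50.0):
    """Print average pre/post scores, mean gain, and post-test pass rate per subject."""
    totals = defaultdict(lambda: {"pre": 0.0, "post": 0.0, "n": 0, "passed_post": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            subject = row["subject"]
            pre, post = float(row["pre_score"]), float(row["post_score"])
            t = totals[subject]
            t["pre"] += pre
            t["post"] += post
            t["n"] += 1
            t["passed_post"] += post >= pass_mark  # bool adds as 0 or 1

    for subject, t in sorted(totals.items()):
        n = t["n"]
        print(f"{subject}: n={n}, "
              f"mean pre={t['pre'] / n:.1f}, mean post={t['post'] / n:.1f}, "
              f"mean gain={(t['post'] - t['pre']) / n:.1f}, "
              f"post pass rate={100 * t['passed_post'] / n:.1f}%")

if __name__ == "__main__":
    summarize_gains()
```

Summaries of this kind would feed into the product-component judgment alongside the stakeholder interviews described above.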

References

Ayers, S. (2014). Alternative approaches to evaluation II. Retrieved from http://webcache.googleusercontent.com/search?q=cache:frUeETe8OT0J:homepages.wmich.edu/~sayers/6440%2520ch%252035%2520alternative%2520approaches.ppt+&cd=10&hl=en&ct=clnk
Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2011). Program evaluation: Alternative approaches and practical guidelines (4th ed.). Upper Saddle River, NJ: Pearson Education, Inc.
Stufflebeam, D. L. (2005). CIPP model (context, input, process, product). In S. Mathison (Ed.), Encyclopedia of evaluation. Thousand Oaks, CA: Sage.
Zhang, G., Zeller, N., Griffith, R., Metcalf, D., Williams, J., Shea, C., & Misulis, K. (2011). Using the context, input, process, and product evaluation model (CIPP) as a comprehensive framework to guide the planning, implementation, and assessment of service-learning programs. Journal of Higher Education Outreach and Engagement, 15(4).

REPORTING STRATEGY
Edward Williams
Walden University
Tutor: Dr. Michael Burke
Instructional Design (EDUC-6130-1)
June 19, 2016

Reporting Strategy

Once you have obtained your evaluation results, you should share the findings with your key audiences. You may have several intended audiences, each with different interests and preferences regarding the report (Holm-Hansen, 2007). Below is a detailed outline of the reporting strategy for an evaluation of the Action Plan implemented in all primary schools within District 6 in Grenada.

Stakeholder: Minister of Education and Education Officers (Exam Unit, Curriculum Officers, Permanent Secretary, District Education Officers, Chief Education Officer)
Reporting strategy: Email; posting of the final report on the website; debriefing meetings via teleconference/web conference; interim or progress reports.
Implications: The final evaluation report presents the full view of the evaluation. It serves as the basis for the executive summary, oral presentations, and other reporting formats, and is an important resource for the program archives. While interim or progress reports can be critical to making an evaluation more useful, they can also raise issues if interpreted incorrectly. E-mails help maintain ongoing communication among evaluation stakeholders using brief messages.
Stakeholder involvement: To gain support; to review evaluation progress; to learn and improve; to promote dialogue and understanding among partners; to help develop recommendations; to ensure use of the recommendations.

Stakeholder: Principals
Reporting strategy: Workshop (PowerPoint presentations); executive summary; after action reviews; emails.
Implications: An executive summary summarizes or reviews the main points of a larger document or report for readers who do not have the time to read the entire report. An after action review provides a structured review or debrief process for analyzing what happened, why it happened, and how it can be done better by the participants and those responsible for the project.
Stakeholder involvement: To build awareness; to gain support; to learn and improve; to promote dialogue and understanding among partners; to help develop recommendations; to ensure use of the recommendations.

Stakeholder: Parents and community
Reporting strategy: Radio and television interviews; PTA meetings; executive summary.
Implications: These formats can invite feedback, provide updates, report upcoming evaluation events, or present preliminary or final findings.
Stakeholder involvement: To raise awareness; to gain support; to promote dialogue and understanding among partners.

Stakeholder: Students
Reporting strategy: Video presentations.
Implications: Video can communicate complex ideas with clarity, precision, and efficiency. Often the most effective way to describe, explore, and summarize a set of numbers is to look at pictures of those numbers (Tufte, 1989).
Stakeholder involvement: To raise awareness; to gain support; to learn and improve; to help develop recommendations.

Stakeholder: Teachers
Reporting strategy: Executive summary; workshop; storyboards; PowerPoint presentations; personal meetings; after action reviews.
Implications: Storyboards/learning stories narrate cases of unanticipated project difficulties or negative impacts, how these were identified and overcome, and what was learned that may be helpful in the future or to others (De Ruiter & Aker, 2008). A workshop provides a way to create an intensive educational experience in a short amount of time and is a way for someone to pass on to colleagues ideas and methods that he or she has developed or finds important.
Stakeholder involvement: To build awareness; to gain support; to learn and improve; to promote dialogue and understanding among partners; to help develop recommendations.

Stakeholder: OECS Education Reform Unit and Caribbean Leaders
Reporting strategy: Posting of the final report on the website; teleconference/web conference; email.
Implications: A debriefing meeting will be held through teleconference/web conference to present findings, recommendations, and intended actions. E-mails help maintain ongoing communication among evaluation stakeholders using brief messages.
Stakeholder involvement: To gain support; to review evaluation progress; to assess the likelihood of future support.

Stakeholder: Media
Reporting strategy: Press conference; executive summary.
Implications: These serve to highlight evaluation information, help generate interest in the full evaluation findings, and serve an organization's public relations purposes. Using the news media helps the project reach a large audience, such as the general public or a specific professional group.
Stakeholder involvement: To sensitize the public; to gain support.

Values, Standards, and Criteria: It is the critical and ethical responsibility of the evaluator to present an accurate, balanced, and fair report. As an evaluator, one needs to be mindful of personal bias and should do all that is necessary to mitigate any bias in reporting. The American Evaluation Association's Guiding Principles provide guidance that helps evaluators produce quality reports. The Joint Committee Standards require utility, meaning the results must have the potential to be used, even though such use, or conceptual uses of the findings, might not occur until long after the study is completed (Fitzpatrick, Sanders, & Worthen, 2011, p. 486).

Potential ethical issues: It is important to protect the rights and dignity of the evaluation participants, and this should be incorporated into the way you design and carry out your project. It is also important to consider safeguards that may be needed when your participants are children (Holm-Hansen, 2007). Some potential ethical issues are:

- Consideration of risks and benefits: evaluators should carefully consider any negative consequences that may result from an evaluation and take steps to reduce them, for example by keeping evaluation procedures as brief and convenient as possible to minimize disruptions to participants' daily routines.
- Informed consent of students to participate.
- Confidentiality.

References

De Ruiter, F., & Aker, J. (2008). Human interest stories: Guidelines and tools for effective report writing. American Red Cross/CRS M&E Module Series. Washington, DC, and Baltimore, MD: American Red Cross and Catholic Relief Services (CRS).
Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2011). Program evaluation: Alternative approaches and practical guidelines (4th ed.). Upper Saddle River, NJ: Pearson Education, Inc.
Holm-Hansen, C. (2007). Ethical issues. Retrieved from https://www.wilder.org/WilderResearch/Publications/Studies/Program%20Evaluation%20and%20Research%20Tips/Ethical%20Issues%20%20Tips%20for%20Conducting%20Program%20Evaluation%20Issue%2012,%20Fact%20Sheet.pdf
Tufte, E. R. (1989). The visual display of quantitative information. Cheshire, CT: Graphics Press.
