
Love In Action Ethiopia

M and E Guideline

August 2019
Addis Ababa

Table of Contents
1. Introduction
2. Program Description and Framework
3. Detailed Description of Plan Indicators
4. Indicator Target Table
5. Data Collection Plan
6. Deliverable Schedule
7. Data Quality Management Plan
8. Data Analysis and Reporting Plan
9. Plan for Monitoring
10. Plan for Evaluation
11. Plan for Utilization of Information
12. Mechanism for Updating the Plan
13. M&E Standards
14. Annex of M&E Tools

I. INTRODUCTION
The introduction to the M&E plan should include: information about the purpose of the
program; the specific M&E activities that are needed and why they are important; and a
development history that describes the motivations of the internal and external
stakeholders and the extent of their interest, commitment and participation.

Every project intervention should have a monitoring and evaluation (M&E) plan. This is the
fundamental document that details a program's objectives and the interventions developed to
achieve them, and describes the procedures that will be implemented to determine
whether or not the objectives are met. It shows how the expected results of a program relate
to its goals and objectives, describes the data needed and how these data will be collected and
analyzed, how this information will be used, the resources that will be needed, and how the
program will be accountable to stakeholders. M&E plans should be created during the design
phase of a program and can be organized in a variety of ways. Typically, they include: the
underlying assumptions on which the achievement of program goals depend; the anticipated
relationships between activities, outputs, and outcomes; well-defined conceptual measures and
definitions, along with baseline values; the monitoring schedule; a list of data sources to be
used; cost estimates for the M&E activities; a list of the partnerships and collaborations that will
help achieve the desired results; and a plan for the dissemination and utilization of the
information gained.

The plan also states how a program will measure its achievements and thereby provide
accountability; documents consensus and provides transparency; guides the implementation of
M&E activities in a standardized and coordinated way; and preserves institutional memory.

II. PROGRAM DESCRIPTION AND FRAMEWORK


The program description should include: a problem statement that identifies the specific
problem to be addressed (a concise statement about the situation that needs changing, who it
affects, its causes, its magnitude and its impact on society); the program goal and
objectives (the goal is a broad statement about the desired long-term outcome of the
program, while objectives are statements of desired specific and measurable program
results); descriptions of the specific interventions to be implemented and their duration,
geographic scope and target population; the list of resources needed, including financial,
human and infrastructure resources (office space, equipment and supplies); the conceptual
framework, a graphical depiction of the factors thought to influence the problem of interest
and how these factors relate to each other; and the logical framework or results framework
that links the goal and objectives to the interventions.
Objectives:

Specific: Is the desired outcome clearly specified?


Measurable: Can the achievement of the objective be quantified and measured?
Appropriate: Is the objective appropriately related to the program’s goal?
Realistic: Can the objective realistically be achieved with the available resources?
Timely: In what time period will the objective be achieved?
FRAMEWORK
Frameworks are key elements of M&E plans that depict the components of a project and the
sequence of steps needed to achieve the desired outcomes. They help to increase
understanding of the program’s goals and objectives, define the relationships between factors
key to implementation, and delineate the internal and external elements that could affect its
success. They are crucial for understanding and analyzing how a program is supposed to work.
There is no one perfect framework and no single framework is appropriate for all situations, but
three common types will be discussed here: conceptual framework, results framework and logic
model.
RESULTS FRAMEWORK
LIA-E will use a results framework and a logical framework, since every program the
organization implements is expected to show progress, as outputs or outcomes, while the
intervention is under way. A results framework diagrams the direct causal relationships
between the incremental results of the key activities, all the way up to the overall
objective and goal of the intervention. This clarifies the points in an intervention at
which results can be monitored and evaluated. Results frameworks include an overall goal, a
strategic objective (SO) and intermediate results (IRs). An SO is the most ambitious outcome
that can be achieved and for which the organization is willing to be held responsible. An IR
is a discrete result or outcome that is necessary to achieve an SO. Before the goal and
strategic objective can be achieved, a set of "lower-level" intermediate results must first
be reached. Under each IR are subordinate intermediate results, or sub-IRs, that relate
directly to the intermediate results.
Love in Action Ethiopia (LIA-E) will follow the USAID Country Development Cooperation
Strategy for most USAID-funded programs, mirroring the country Performance Management Plan
(PMP) with respect to the mission Development Objective (DO) that matches the organizational
results/logical framework for each USAID-funded project. Below is a typical outline of the
results framework the organization follows.
Project Goal: to be taken from the country Mission DO.

Intermediate Results (IRs): corresponding to the mission's IRs with respect to the
particular DO; these become the project's outcomes/purposes.

Sub-IRs (S-IRs): drawn from the country S-IRs; these become the project's
sub-outcomes/sub-purposes in the logical framework (LF).

Inputs: the key inputs required to undertake activities; these should mirror the inputs as
they appear in the logical framework.

LOGICAL FRAMEWORK
The logical framework is a methodology for analyzing the context of a proposed project
intervention, formulating objectives and defining criteria that can be used to measure its
impact. The analysis starts from the idea that the actions of the project will produce
changes that can be predicted, observed and measured, at least to a certain extent. The
organization will also use this type of framework where the nature of the project requires it.

Narrative Summary | Indicators | Data Sources | Assumptions
Goal | Measures of goal achievement | Sources of information; methods used | Assumptions affecting the purpose-goal linkage
Outcomes/Purposes | End-of-project status | Sources of information; methods used | Assumptions affecting the sub-outcome-purpose linkage
Sub-outcomes/purposes (if applicable) | Mid-project status | Sources of information; methods used | Assumptions affecting the output-sub-outcome linkage
Outputs | Magnitude of outputs; planned completion date | Sources of information; methods used | Assumptions affecting the input-output linkage
Inputs | Nature and level of resources; cost; planned starting date | Sources of information; methods used | Initial assumptions about the project
III. INDICATORS
Indicators are clues, signs or markers that measure one aspect of a program and show how
close a program is to its desired path and outcomes. They are used to provide benchmarks for
demonstrating the achievements of a program. One of the most critical steps in designing an
M&E system is selecting appropriate indicators. The M&E plan should include descriptions of
the indicators that will be used to monitor program implementation and achievement of the
goals and objectives.

An indicator is a variable that measures one aspect of a program or project that is directly
related to the program’s objectives. Let’s take a moment to go over each piece of this
definition. An indicator is a variable whose value changes from the baseline level at the time
the program began to a new value after the program and its activities have made their impact
felt. At that point, the variable, or indicator, is calculated again. Secondly, an indicator is a
measurement. It measures the value of the change in meaningful units that can be compared to
past and future units. This is usually expressed as a percentage or a number. Finally, an
indicator focuses on a single aspect of a program or project. This aspect may be an input, an
output or an overarching objective, but it should be narrowly defined in a way that captures
this one aspect as precisely as possible. A reasonable guideline is one or two
indicators per result, with at least one indicator for each activity.

Quantitative and Qualitative Indicators

Indicators can be either quantitative or qualitative. Quantitative indicators are numeric and
are presented as numbers or percentages. Qualitative indicators are descriptive observations
and can be used to supplement the numbers and percentages provided by quantitative
indicators. They complement quantitative indicators by adding a richness of information about
the context in which the program has been operating. Examples include “availability of a clear,
strategic organizational mission statement” and “existence of a multi-year procurement plan
for each product offered.”
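To make the idea of a quantitative percentage indicator concrete, the following is a minimal illustrative sketch (not an official LIA-E tool; all names and figures are hypothetical) of how an indicator value and its sex disaggregation can be computed from counts:

```python
# Illustrative sketch (hypothetical data): computing a quantitative
# percentage indicator with sex disaggregation.

def percentage_indicator(numerator: int, denominator: int) -> float:
    """Return the indicator value as a percentage, guarding against
    an empty denominator."""
    if denominator == 0:
        return 0.0
    return round(100 * numerator / denominator, 1)

# Hypothetical example: children enrolled in school out of those eligible
enrolled = {"male": 420, "female": 380}
eligible = {"male": 500, "female": 500}

overall = percentage_indicator(sum(enrolled.values()), sum(eligible.values()))
by_sex = {sex: percentage_indicator(enrolled[sex], eligible[sex])
          for sex in ("male", "female")}

print(overall)   # 80.0
print(by_sex)    # {'male': 84.0, 'female': 76.0}
```

Comparing the value computed at baseline with the same calculation repeated at endline gives the change the indicator is meant to measure.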
IV. INDICATOR TARGET TABLE
For each indicator below, the baseline and target are recorded as a number (or N/A where no
baseline exists); sex-disaggregated baselines and targets record the numbers of males (#)
and females (#). For every indicator, LIA-E will take the baseline data source disaggregated
by sex and by the stated age range.

1. Percentage of eligible children enrolled in school (Baseline: number or N/A; Target: number)
   Purpose: To track the percentage of eligible children enrolled in school.
   Calculation: The number of children enrolled for school services.
   Data source: School registration records and other educational-institution records that trace children's eligibility for school enrollment.
   Disaggregation by sex (M/F): the relative numbers of males (boys) and females (girls) enrolled in school; if unavailable, use the ratio from a related school facility.
   Disaggregation by age (0-18): the number of children in the stated age range; adults are not included.

2. Percentage of eligible girls attending school (Baseline: number or N/A; Target: number)
   Purpose: To identify the percentage of eligible girl students attending school.
   Calculation: Student records or reports of those attending the school, out of all students enrolled at the school.
   Data source: School attendance sheets or other relevant documents used for attendance, whether electronic or manual.
   Disaggregation by sex (M/F): the relative numbers of males and females attending school; if unavailable, use the ratio from a related school facility.
   Disaggregation by age (0-18): the number of children in the stated age range; adults are not included.

3. School dropout rate (Baseline: number or N/A; Target: number)
   Purpose: To track the number of students who drop out of school for various reasons.
   Calculation: The increase or decrease in the number of dropout students during the semester or academic year.
   Data source: Rosters of students who did not complete the academic year or semester, and school attendance records.
   Disaggregation by sex (M/F): the relative numbers of males and females dropping out of school.
   Disaggregation by age (0-18): the number of children in the stated age range; adults are not included.

4. Number of youths who attended skill-based training (Baseline: number or N/A; Target: number)
   Purpose: To track the number of youths attending skill-based training at a specific project implementation site.
   Calculation: The number of youth students expected to take skill-based training during the specific project implementation.
   Data source: Training profiles and other reporting formats delivered based on need.
   Disaggregation by sex (M/F): the relative numbers of males and females attending skill-based training.
   Disaggregation by age (0-18): the number of children in the stated age range; adults are not included.

5. Number of volunteers engaged in peer-session implementation (Baseline: number or N/A; Target: number)
   Purpose: To track the number of volunteers engaged in peer sessions during project implementation.
   Calculation: The number of potential volunteers engaged in peer sessions contributing to the achievement of project activities.
   Data source: Volunteer profiles or quarterly reports on peer-session education.
   Disaggregation by sex (M/F): the relative numbers of males and females engaged in peer-session education.
   Disaggregation by age (>18): the number of adults in the stated age range; children are not included.

6. Number of schools providing services per the standard implementation guidelines (Baseline: number or N/A; Target: number)
   Purpose: To track the number of schools providing services per the standard implementation guidelines at project sites.
   Calculation: The number of schools that provide services per the standard implementation guidelines.
   Data source: Service provider profiles or quarterly reports on service provision.
   Disaggregation by sex (M/F): the relative numbers of males and females who receive services from school service providers.
   Disaggregation by age (0-18): the number of children in the stated age range; adults are not included.

7. Number of schools providing ongoing skill-improvement training (Baseline: number or N/A; Target: number)
   Purpose: To track the number of schools providing ongoing skill improvement to targeted school children.
   Calculation: The number of schools that provide ongoing skill-improvement training to targeted school children.
   Data source: School service provider profiles and reports from the project implementation period.
   Disaggregation by sex (M/F): the relative numbers of males and females who access skill-improvement training.
   Disaggregation by age (>14): children above age 14 are included in this disaggregation.

8. Number of needy students reached with economic support (Baseline: number or N/A; Target: number)
   Purpose: To track the number of students and guardians reached with needed economic support.
   Calculation: The number of students who access needed economic support.
   Data source: Needs assessments and related economic-support profiles and reports.
   Disaggregation by sex (M/F): the relative numbers of males and females supported with economic-support packages.
   Disaggregation by age (>14): the number of children and guardians in the stated age range; children under 14 access the support through their guardians.

9. Number of targeted individuals who received HIV testing services and their test results (Baseline: number or N/A; Target: number)
   Purpose: To track the number of targeted individuals who receive HIV testing services and their test results.
   Calculation: The number of targeted individuals who access HIV testing services and receive their test results during the project implementation period.
   Data source: HIV registration books and referral forms that confirm access to the service, including reporting tools used during the project implementation period.
   Disaggregation by sex (M/F): the relative numbers of males and females who access HIV testing services and receive their test results.
   Disaggregation by age (>0): targeted individuals above zero years of age.

10. Number of targeted individuals who can properly mention modes of HIV transmission and prevention activities (Baseline: number or N/A; Target: number)
   Purpose: To track the number of targeted individuals who can properly mention modes of HIV transmission and prevention activities.
   Calculation: The number of targeted individuals who can properly mention modes of HIV transmission and prevention activities as a result of the program intervention.
   Data source: Behavioral-change and awareness tracing tools reported in each reporting period.
   Disaggregation by sex (M/F): the relative numbers of males and females who properly understand modes of HIV transmission and its prevention.
   Disaggregation by age (>18): targeted individuals above 18 years of age.

11. Number of households with insecticide-treated bed nets (Baseline: number or N/A; Target: number)
   Purpose: To track the number of households with insecticide-treated bed nets.
   Calculation: The number of households that have and use insecticide-treated bed nets.
   Data source: Records of bed nets distributed, with traceable tools, and reports on insecticide-treated bed net use generated during the reporting period.
   Disaggregation by sex (M/F): the relative numbers of males and females with access to and use of insecticide-treated bed nets.
   Disaggregation by age (>0): individuals who access the service, above zero years of age.

12. Number of individuals reached with HIV/AIDS and malaria education training (Baseline: number or N/A; Target: number)
   Purpose: To track the number of individuals reached with HIV/AIDS and malaria education training.
   Calculation: The number of individuals reached with HIV/AIDS and malaria education training during the program intervention.
   Data source: Training profiles on HIV/AIDS and malaria education, reports generated during the implementation period, and other reporting formats.
   Disaggregation by sex (M/F): the relative numbers of males and females attending HIV/AIDS and malaria education training.
   Disaggregation by age (>18): individuals trained on HIV/AIDS and malaria education, above 18 years of age.

13. Number of IEC/BCC materials distributed (Baseline: number or N/A; Target: number)
   Purpose: To track the number of IEC/BCC materials distributed.
   Calculation: The number of IEC/BCC materials distributed over the reporting period.
   Data source: BCC reports and other source documents related to the BCC format.
   Disaggregation by sex (M/F): the relative numbers of males and females who receive IEC/BCC materials.
   Disaggregation by age (>16): individuals educated through IEC/BCC materials.

14. Percentage of the priority population targeted by CHCT who receive HIV testing services and their results (Baseline: number or N/A; Target: number)
   Purpose: To track the percentage of the priority population targeted by CHCT who receive HIV testing services and their results.
   Calculation: The percentage of the priority population targeted by CHCT who receive HIV testing services and their results.
   Data source: HIV registration forms and books, other related HIV reporting forms, and referral slips for HIV/AIDS testing services.
   Disaggregation by sex (M/F): the relative numbers of males and females who received HIV testing services and their results.
   Disaggregation by age (>0): the priority population accessing HIV testing services.

15. Number of condoms distributed (Baseline: number or N/A; Target: number)
   Purpose: To track the number of condoms distributed over the project period.
   Calculation: The number of condoms distributed over the implementation period.
   Data source: Condom distribution forms, condom registration books, and other reporting formats that show the distribution of condoms in each regular reporting period.
   Disaggregation by sex (M/F): the relative numbers of males and females who access condoms.
   Disaggregation by age (>14): the target population accessing condoms, above 14 years of age.

16. Percentage of newly diagnosed HIV-positive individuals linked to care and treatment services (Baseline: number or N/A; Target: number)
   Purpose: To track the percentage of newly diagnosed HIV-positive individuals linked to care and treatment services.
   Calculation: The percentage of newly identified HIV-positive individuals linked to care and treatment services, out of the total tested population.
   Data source: HIV-positive registration books or positive-tracking sheets, and other related reporting tools and formats.
   Disaggregation by sex (M/F): the relative percentages of newly diagnosed males and females linked to care and treatment services.
   Disaggregation by age (>0): targeted individuals newly diagnosed HIV-positive and linked to care and treatment services, above zero years of age.

17. Number of care providers trained on care and support service delivery (Baseline: number or N/A; Target: number)
   Purpose: To track the number of care providers trained on care and support service delivery during the program intervention.
   Calculation: The number of care providers trained on care and support service delivery during program implementation.
   Data source: Training profiles and reporting documents that show the delivery of training on care and support services.
   Disaggregation by sex (M/F): the relative numbers of males and females trained on care and support service delivery.
   Disaggregation by age (>0): care providers trained on care and support service delivery, above zero years of age.

18. Number of health service promotion centers established (Baseline: number or N/A; Target: number)
   Purpose: To track the number of health service promotion centers established.
   Calculation: The number of health service promotion centers established during project implementation.
   Data source: Minutes and reports of capacity-building activities conducted.
   Disaggregation by sex (M/F): the relative numbers of males and females who are members of the established health service promotion centers.
   Disaggregation by age (>18): individuals involved in health service promotion centers, above 18 years of age.

19. Number of individuals trained on community-based HIV care and treatment (Baseline: number or N/A; Target: number)
   Purpose: To track the number of individuals trained on community-based HIV care and treatment.
   Calculation: The number of individuals trained on community-based HIV care and treatment in the specific project implementation period.
   Data source: Capacity-building training profiles and other related source documents organized during the reporting period.
   Disaggregation by sex (M/F): the relative numbers of males and females trained on community-based HIV care and treatment.
   Disaggregation by age (>18): individuals trained on community-based HIV care and treatment, above 18 years of age.

20. Number of woreda SACs covering at least 50% of all kebeles in at least one sector (Baseline: number or N/A; Target: number)
   Purpose: To track the number of woreda SACs that cover at least 50% of all kebeles in at least one sector.
   Calculation: The number of woreda SACs, to be formed, that cover at least 50% of all kebeles in at least one sector.
   Data source: Lists of SACs at the woreda level, including their profiles across all kebeles.
   Disaggregation by sex (M/F): the relative numbers of males and females who participate in SACs and run various micro-enterprise activities.
   Disaggregation by age (>16): participants in SAC activities, above 16 years of age.

21. Number of structured experience-sharing events organized among contract woredas, with key lessons documented and implemented (Baseline: number or N/A; Target: number)
   Purpose: To track the number of structured experience-sharing events organized among contract woredas, with key lessons documented and implemented over the project implementation period.
   Calculation: The number of structured experience-sharing events organized among contract woredas, with key lessons documented and implemented in the specific project implementation period.
   Data source: Experience-sharing documents, reports of events conducted, and minutes.
   Disaggregation by sex (M/F): the relative numbers of males and females who participated in experience-sharing events.
   Disaggregation by age (>18): individuals engaged in experience-sharing workshops, above 18 years of age.
V. DATA SOURCE AND COLLECTION PLAN

Data sources are sources of information used to collect data needed to calculate the indicators.
The data collection plan should include diagrams depicting the systems used for data collection,
processing, and analysis and reporting. The strength of these systems determines the validity of
the information obtained. Potential errors in data collection, or in the data themselves, must be
carefully considered when determining the usefulness of data sources.

Data source

Data sources are the resources used to obtain data for M&E activities. There are several levels
from which data can come, including client, program, service environment, population, and
geographic levels. Regardless of level, data are commonly divided into two general categories:
routine and non-routine. Routine data sources provide data that are collected on a continuous
basis, such as information that clinics collect on the patients utilizing their services. Although
these data are collected continuously, processing them and reporting on them usually occur
only periodically, for instance, aggregated monthly and reported quarterly. Data collection from
routine sources is useful because it can provide information on a timely basis. For instance, it
can be used effectively to detect and correct problems in service delivery. However, it can be
difficult to obtain accurate estimates of catchment areas or target populations through this
method, and the quality of the data may be poor because of inaccurate record keeping or
incomplete reporting. Non-routine data sources provide data that are collected on a periodic
basis, usually annually or less frequently. Depending on the source, non-routine data can avoid
the problem of incorrectly estimating the target population when calculating coverage
indicators. This is particularly the case with representative population-based surveys, such as a
Demographic and Health Survey (DHS). Non-routine data have two main limitations: collecting them
is often expensive, and this collection is done on an irregular basis. In order to make informed
program decisions, program managers usually need to receive data at more frequent intervals
than non-routine data can accommodate.
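The routine-data pattern described above, collected monthly but aggregated and reported quarterly, can be sketched as follows (a hypothetical illustration with made-up figures, not an LIA-E reporting tool):

```python
# Illustrative sketch (hypothetical data): routine service data collected
# monthly but aggregated and reported quarterly.
from collections import defaultdict

# Hypothetical monthly counts of clients served, keyed by (year, month)
monthly_counts = {
    (2019, 1): 110, (2019, 2): 95, (2019, 3): 120,   # Q1
    (2019, 4): 130, (2019, 5): 125, (2019, 6): 140,  # Q2
}

def quarterly_totals(monthly: dict) -> dict:
    """Aggregate monthly counts into quarterly totals for reporting."""
    quarters = defaultdict(int)
    for (year, month), count in monthly.items():
        quarter = (month - 1) // 3 + 1
        quarters[(year, quarter)] += count
    return dict(quarters)

print(quarterly_totals(monthly_counts))
# {(2019, 1): 325, (2019, 2): 395}
```

The monthly series still supports timely problem detection, while the quarterly totals feed the periodic reports.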

Data Collection

The M&E plan should include a data collection plan that summarizes information about the
data sources needed to monitor and evaluate the program. The plan should include information
for each data source, such as: the timing and frequency of collection; the person or agency
responsible for the collection; the information needed for the indicators; and any additional
information that will be obtained from the source.

Data Quality
Throughout the data collection process it is essential that data quality be monitored and
maintained. Data quality is important to consider when determining the usefulness of various
data sources; the data collected are most useful when they are of the highest quality.

It is important to use the highest quality data that are obtainable, but this often requires a
trade-off with what it is feasible to obtain. The highest quality data are usually obtained
through the triangulation of data from several sources. It is also important to remember that
behavioral and motivational factors on the part of the people collecting and analyzing the data
can also affect data quality.

VI. DELIVERABLES SCHEDULE

The deliverables schedule of the project provides important due dates for major requirements
such as work plan submission, reporting, major evaluations and assessments. The following are
the major deliverables of the project that will be produced or achieved in the course of
implementation.
Deliverable | Audience | Due date (planned) | Actual date delivered | Means of delivery
Quarterly Progress Reports (Narrative) | LIA-E Head Quarter Office | Quarter 1: Dec. 20; Quarter 2: Mar. 20; Quarter 3: June 20; Quarter 4: Sept. 20 | 25th, after the due date | Electronic
Monthly Progress Reports (Quantitative) | LIA-E Head Quarter Office | Jan.–Dec., on the 5th day of the next month | | Electronic
Quarterly Progress Reports (Narrative) | LIA-E Regional Offices | Quarter 1: Dec. 20; Quarter 2: Mar. 20; Quarter 3: June 20; Quarter 4: Sept. 20 | 18th, before the due date | Electronic
Monthly Progress Reports (Quantitative) | LIA-E Regional Offices | Jan.–Dec., on the 3rd day of the next month | | Electronic
Monthly Progress Reports (Quantitative) | LIA-E Field Offices | Jan.–Dec., on the 1st day of the next month | | Electronic
Table. Deliverable Schedule

VII. DATA QUALITY MANAGEMENT PLAN

LIA-E and the project team are strongly committed to data quality and do not tolerate any
falsification of data describing our partners or their data management processes. To that end,
we work very closely with partners at all levels to enhance skills in data quality management
and ensure that the information generated is of high quality. LIA-E will also provide
implementing partners' M&E staff with a five-day training on data quality management, as well as
ongoing one-to-one mentoring/orientation through regular site visits. During such visits, the
M&E team will work with partners to review data collection and management procedures and
conduct routine data quality assessments at various sites. To enable identification of the sites
with the highest data quality risks, we will maintain a data management dashboard for all
partners that allows quick access to key parameters for tracking key results from the program
on a quarterly basis, as well as key parameters of data quality.

Operating procedures for monitoring data quality, data verification and validation
Ongoing monitoring of data quality and skills building for risk assessment

The LIA-E M&E team will work with partners to help them build skills in data risk assessment, a
particularly essential skill for this program given that data will originate primarily from less
skilled community volunteers working in the project intervention areas. Risk assessment helps
M&E professionals determine the potential level of error that may exist in the data and the
overall effect that error would have on their ability to use the data, so that they can prioritize
their time and resources in areas with higher risk. Through use of the data management tools,
quarterly data will be tracked by the LIA-E M&E team in the regions and at the head office. The
tool will enable access to key performance data in relation to program outputs for key results
at different sites as reported by the partners. Data quality will be rated with regard to the
timeliness, completeness and accuracy of the data (particularly in relation to validity and
reliability).
The dashboard reflects ratings based on a 3-point scale, where 1 = high risk, 2 = medium risk and
3 = low risk. The figure below is an example of what the dashboard would look like for 3 partners
rated on the 3 parameters. Implementing partners are also encouraged and supported to
maintain similar tools to track the quality of data received from their various sites.

Reporting Date ____________________

CSO Partner | Region | Accuracy | Completeness | Timeliness | Average | Comment
A | | 1.7 | 1.5 | 1.0 | 1.4 |
B | | 2.3 | 2.0 | 2.7 | 2.3 |
C | | 2.5 | 2.0 | 2.7 | 2.4 |
Figure. Program Data Quality Management Dashboard
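The "Average" column in the dashboard is simply the mean of the three parameter ratings, and partners whose average falls toward 1 carry the highest risk. A minimal sketch of that calculation (the partner letters and scores come from the example figure above; the flagging threshold of 1.5 is an assumption for illustration, not a program rule):

```python
# Illustrative sketch: computing the average risk rating for the data
# quality dashboard. Ratings use the 3-point scale from the text:
# 1 = high risk, 2 = medium risk, 3 = low risk.

def dashboard_row(partner, accuracy, completeness, timeliness):
    """Return a dashboard row with the average of the three ratings."""
    average = round((accuracy + completeness + timeliness) / 3, 1)
    return {
        "partner": partner,
        "accuracy": accuracy,
        "completeness": completeness,
        "timeliness": timeliness,
        "average": average,
    }

rows = [
    dashboard_row("A", 1.7, 1.5, 1.0),
    dashboard_row("B", 2.3, 2.0, 2.7),
    dashboard_row("C", 2.5, 2.0, 2.7),
]

for r in rows:
    # Assumed threshold: flag partners whose average is in the high-risk band.
    flag = "HIGH RISK" if r["average"] < 1.5 else ""
    print(r["partner"], r["average"], flag)
```

A spreadsheet or the dashboard tool itself would normally do this arithmetic; the sketch only makes the averaging rule explicit.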

Site visits for mentoring partners and undertaking data verification and validation
The LIA-E head office routinely undertakes site visits to its partners to provide mentoring and
examine data quality issues. These site visits are routinely undertaken jointly by a program
staff member, regional-level staff and stakeholders, and provide an opportunity to review
program operations on site. The site visits also provide an opportunity to individualize
technical assistance based on needs expressed by the partner or identified during the visit.
The M&E team will undertake site visits to each program/partner at least once every quarter.
However, project staff/partners experiencing data quality and management problems are
visited more often in order to provide the technical assistance required to resolve those issues.
A site visit checklist for monitoring implementation of data quality management systems at
the various levels is presented.

Plan for Routine Data Quality Assessments


Routine Data Quality Assessment (RDQA) is an essential procedure that provides an organization
with the means to determine the status of data quality at any given time, and the opportunity to
develop and implement strategies to address existing gaps. LIA-E uses the RDQA process as
another opportunity to provide supportive supervision and mentoring to partner organizations
and their field workers.
The RDQA process focuses exclusively on
(1) Verifying the quality of reported data, and
(2) Assessing the underlying data management and reporting systems for standard program-
level output indicators.
As part of routine data quality management, the LIA-E M&E team will undertake RDQAs for
selected sites and CSO partners. These will enable the team to identify specific areas for further
technical support to address key data quality challenges. The RDQA process will entail
undertaking assessments of data management and reporting systems at different levels,
including the regional level, CSO partners and Kebele community volunteers. The program will
make use of the comprehensive RDQA tool initially developed by MEASURE Evaluation1 together
with other partners and adapted by the LIA-E M&E team for use in the Ethiopian context.

1
Routine Data Quality Assessment Tool: Guidelines for Implementation, 2008. The Global Fund to Fight AIDS,
Tuberculosis and Malaria; Office of the Global AIDS Coordinator, PEPFAR; USAID; WHO; UNAIDS; MEASURE Evaluation.
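In RDQA practice, the data-verification step commonly involves recounting results from source documents and comparing them with the figures that were reported upward; a ratio near 100% indicates accurate reporting, above 100% suggests under-reporting was avoided by recount, and below 100% suggests over-reporting. A minimal sketch of that comparison (the site names and counts are hypothetical, and the recounted/reported convention is an assumption based on common RDQA usage, not a figure from this plan):

```python
# Illustrative sketch of the RDQA data-verification step: recounted
# values from source documents are compared with reported values.
# Site names and counts are hypothetical examples.

def verification_factor(recounted, reported):
    """Recounted / reported, expressed as a percentage (None if nothing reported)."""
    if reported == 0:
        return None  # cannot compute a ratio against a zero report
    return round(100 * recounted / reported, 1)

sites = {
    "Site 1": {"reported": 120, "recounted": 110},
    "Site 2": {"reported": 80, "recounted": 80},
}

for name, counts in sites.items():
    vf = verification_factor(counts["recounted"], counts["reported"])
    print(name, vf)  # 100.0 means the recount matched the report exactly
```

The full RDQA tool also scores the underlying data management system; this sketch covers only the numeric verification ratio.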
Actor | Major duties and responsibilities

Volunteers
 Complete beneficiary enrollment based on eligibility requirements
 Document and submit data on home visits and services rendered
 Collect needs-based assessment data from clients
 Complete referral forms and submit them to the task force

Task Force/QA
 Document the selection process of volunteers
 Maintain volunteer profiles and records
 Review and verify data received on beneficiary enrollment and services delivered
 Review, verify and approve referrals by volunteers

Facilitators (field staff)
 Review data from volunteers and collect all the forms reflecting services rendered
 Undertake data verification and validation processes
 Submit the original hard copies of the report to CSOs (where the database is found)
 Collect volunteer profiles and collect records on support supervision
 Collect data on community mobilization events, resources mobilized and cost-share information
 Submit reports on community mobilization events, resources mobilized and cost share to CSOs
 Communicate meeting minutes/proceedings to participants and to the CSO
 Discuss referral issues and follow up on feedback about referral status and related documentation

Field supervisors (CSO staff)
 Train and orient facilitators and volunteers on use of the customized data collection formats
 Collect data on training activities undertaken by the CSO
 Avail all the necessary materials essential for data collection
 Maintain files of copies of service delivery forms submitted by facilitators
 Verify the completeness and accuracy of the reports
 Undertake data entry into the electronic database
 Submit reports to LIA-E regional offices, government bodies and Woreda facilitators
 Conduct regular field monitoring visits and give technical assistance to facilitators and the task force
 Prepare monitoring reports and feedback to facilitators, volunteers and the task force

Regional Offices
 Collect reports from CSOs; review data/reports before forwarding to the next level
 Routinely implement data verification processes
 Capture data related to capacity development activities
 Facilitate data analysis at the regional level
 Follow up and make sure report submission deadlines are observed

LIA-E Head Office
 Consolidate and analyze data and produce regular reports to the donor and government
 Facilitate/undertake project evaluations
 Conduct routine data quality assessments
 Conduct joint monitoring visits and provide technical assistance
 Document relevant project documents, best practices, success stories and other learnings

Table. Major data management duties and responsibilities
VIII. DATA ANALYSIS AND REPORTING PLAN

Data Analysis Plan


The LIA-E project team has developed a data analysis and reporting plan that aims to ensure
that the key stakeholders at the different levels of program implementation are engaged in and
participate in these processes. We often find that although organizations may succeed in
improving systems for collecting good quality data, they place much less emphasis on
undertaking routine analysis. This is one of the most common weaknesses, and it results in
limited use of data to generate useful information for decision making.

To ensure effective and regular data analysis, every project site will work closely with CSO
partners and government officials, particularly at the Woreda and regional levels, to undertake
analysis on a quarterly basis. This will involve facilitation of data analysis
meetings/workshops for stakeholders, during which a comprehensive review of key variables and
interrogation of the data is undertaken. The analysis will include review of disaggregated data
generated from the database for key indicators. Another key variable for analysis is the
geographic difference in reach for the different types of services. Such analysis will be
essential in informing planning processes as program implementation proceeds, as well as
providing a means to identify areas that may require strengthening. Through working closely
with and mentoring our program partners, LIA-E will strengthen data analysis skills among both
government and CSOs, thereby improving overall capacity to transform raw data into key
strategic information for decision making.

A key consideration when planning for data analysis is to determine and understand the purpose
of the data collection effort: why do we collect the data and what are we trying to learn? This
then informs the following questions:
 What type of analysis should we do for each type of key variable?
 What groups of variables or measurements do we want to compare?
 How precise do our findings need to be?
 What do our stakeholders want and need to know?
 What is the final product our stakeholders will see?

In addition to the analysis of routine data, the project team will undertake periodic analysis of
non-routine data, as well as other data generated from evaluations of the program (baseline,
midterm and end).
Level: CSO (reporting every month)
Type of analysis/purpose: The CSOs will analyze service data to evaluate their performance
(program efficiency and effectiveness in meeting targets) within a given period of time, to
document lessons and extract best practices.
Types of data comparisons for analysis:
 Program reach by gender, age group and geographic differences
 Performance of facilitators: volunteer-to-facilitator ratios; types of support supervision
activities by different facilitators
 Performance of community volunteers: beneficiary-to-volunteer ratios; range of services
reported by community volunteers

Level: LIA-E Regional Offices (reporting every month and every quarter)
Type of analysis/purpose: Regional offices will analyze the extent to which CSOs in their
regions are meeting performance targets. They will be interested in comparing the performance
of CSOs in order to identify areas where additional technical support may be required.
Types of data comparisons for analysis:
 Program reach by gender, age group and geographic differences, with data compared across
CSOs and Woredas
 Performance of CSOs with regard to meeting targets and supporting community volunteers:
number of community structures supported by the CSO; number of people trained by the CSO

Level: LIA-E Head Office (reporting every quarter and annually)
Type of analysis/purpose: LIA-E will analyze data from CSOs (both routine and non-routine) to
compare program performance against the plan, to track changes over the course of program
delivery, and to examine implementation of standard guidelines and policies for program
intervention areas.
Types of data comparisons for analysis:
 Program reach by gender, age group and geographic differences, with data compared across
CSOs, Woredas and Regions; document lessons learnt about what influences performance
 Analysis of differences in implementation of national quality standards across geographic
regions or by level of CSO capacity
 Analysis of service delivery results against established client needs

Table. Illustrative Data Analysis at Different Levels
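In practice, the "program reach" comparisons described above reduce to grouping client records by attributes such as gender, age group and Woreda. A minimal sketch using Python's standard library (the records and field names are hypothetical illustrations, not the actual LIA-E database schema):

```python
# Illustrative sketch of disaggregating program reach by gender and by
# Woreda, as described in the analysis table. Records and field names
# are hypothetical; real data would come from the client database.
from collections import Counter

records = [
    {"client_id": 1, "gender": "F", "age_group": "10-14", "woreda": "W1"},
    {"client_id": 2, "gender": "M", "age_group": "15-17", "woreda": "W1"},
    {"client_id": 3, "gender": "F", "age_group": "15-17", "woreda": "W2"},
]

# Program reach by gender: number of clients per gender.
reach_by_gender = Counter(r["gender"] for r in records)

# Geographic differences: number of clients per Woreda.
reach_by_woreda = Counter(r["woreda"] for r in records)

print(dict(reach_by_gender))
print(dict(reach_by_woreda))
```

The same grouping pattern extends to any of the other disaggregations (age group, CSO, region) by changing the key field.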

IX. MONITORING PLAN

The monitoring plan describes: specific program components that will be monitored, such as
provider performance or the utilization of resources; how this monitoring will be conducted;
and the indicators that will be used to measure results. Because monitoring is concerned with
the status of ongoing activities, output indicators, also known as process indicators, are used.
The LIA-E M&E team will set a monitoring plan to follow up on activity performance progress and
service quality, based on the project's behavior monitoring system. Indicators to be monitored
may include: service delivery against the target plan in the different program interventions;
percentage of activity coverage during the reporting period; service quality while providing
support through the various project activities; capacity building for project actors and
stakeholders to sustain project/program activities; training provision for the improvement of
service quality and for efficient project implementation; and resource utilization and program
management to run project activities in a timely manner. Monitoring will be done on a monthly
and quarterly basis, following service provision and reporting.

X. EVALUATION PLAN

The evaluation plan provides the specific research design and methodological approaches to be
used to identify whether changes in outcomes can be attributed to the program. For instance, if
a program wants to test whether quality of patient care can be improved by training providers,
the evaluation plan would identify a research design that could be used to measure the impact
of such an intervention. One way this could be investigated would be through a quasi-
experimental design in which providers in one facility are given a pretest, followed by the
training and a post-test. For comparison purposes, a similar group of providers from another
facility would be given the same pretest and post-test, without the intervening training. Then
the test results would be compared to determine the impact of the training.
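One common way to compare the test results from such a design is a difference-in-differences calculation: the change in the trained group minus the change in the comparison group estimates the effect of the training, net of whatever change both groups experienced anyway. A minimal sketch (the test scores are hypothetical, and this is only one of several analysis options for a quasi-experimental design):

```python
# Illustrative sketch of analyzing the pretest/post-test design in the
# text: providers at the intervention facility are tested before and
# after training, while a comparison facility takes the same tests with
# no training in between. Scores are hypothetical (out of 100).

def mean(scores):
    return sum(scores) / len(scores)

def diff_in_diff(pre_treat, post_treat, pre_comp, post_comp):
    """Change in the trained group minus change in the comparison group."""
    return (mean(post_treat) - mean(pre_treat)) - (mean(post_comp) - mean(pre_comp))

trained_pre = [55, 60, 58]
trained_post = [70, 75, 71]
comparison_pre = [56, 59, 61]
comparison_post = [58, 60, 62]

effect = diff_in_diff(trained_pre, trained_post, comparison_pre, comparison_post)
print(round(effect, 1))  # estimated training effect in score points
```

A full evaluation would also test whether the estimated effect is statistically significant; the sketch shows only the point estimate.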

XI. INFORMATION DISSEMINATION AND USE

Collecting data is only meaningful and worthwhile if it is subsequently used for evidence-based
decision-making. To be useful, information must be based on quality data, and it also must be
communicated effectively to policy makers and other interested stakeholders. M&E data need
to be manageable and timely, reliable, and specific to the activities in question. Additionally,
the results need to be well understood. The key to effective data use involves linking the data
to the decisions that need to be made and to those making these decisions. The decision-maker
needs to be aware of relevant information in order to make informed decisions. For example, if
sales data from a program to provide insecticide-treated bed-nets show that the program is
successfully increasing bed-net distribution, the decision maker may decide to maintain the
program as it is. Alternatively, the data may prompt the implementation of a new distribution
system and could spur additional research to test the effectiveness of this new strategy
compared to the existing one.

The LIA-E M&E team disseminates project reports and activity information to relevant government
stakeholders and donors at regular intervals. How the information gathered will be stored,
disseminated and used should be defined at the planning stage of the project and described in
the M&E plan. This will help ensure that findings from M&E efforts are not wasted because they
go unshared. The various users of this information should be clearly defined, and reports should
be written with specific audiences in mind. Dissemination channels can include written reports,
press releases and stories in the mass media, and speaking events. When decision-makers
understand the kinds of information that can be used to inform decisions and improve results,
they are more likely to seek out and use this information.

Narrative reports reflecting key activities extracted from the database and other reporting
sources, together with challenges, success stories and lessons learnt, will be developed by
LIA-E regional offices as well as by the head office implementation team. These reports will be
essential in informing key stakeholders and USAID/other donor agencies about program
implementation and achievements. Consolidated reports from the regional offices will be
submitted to the Program and M&E Unit monthly and quarterly, according to priority.

Stakeholders | Data Recipients | Data Requirement | Frequency of Reporting

Community Volunteers
Recipients: Facilitator (CSO); Task Force
Data: clients enrollment format; needs assessment data; client service records; client referral slips
Frequency: Every month

Field Facilitator
Recipients: CSO; government stakeholders
Data: compilation of volunteers' service delivery records; community training/awareness reports; support supervision reports
Frequency: Every month

LIA-E Regional Offices
Recipients: LIA-E; concerned Woreda bodies
Data: clients database report; narrative performance report; capacity building training reports; best practices and success stories
Frequency: Every month; every quarter

LIA-E Head Office
Recipients: USAID; Government; other donor agencies
Data: report on key PEPFAR/USAID and other donor agency indicators; quarter-based progress report; program review reports providing analysis of non-routine data
Frequency: Every quarter; annual

Table. LIA-E Reporting Matrix on each program intervention
XII. IMPLEMENTATION AND MECHANISM FOR UPDATE

The capacities needed to implement the efforts described in the M&E plan should be included
in the document. A mechanism for reviewing and updating the M&E plan should also be
included. This is because changes in the program can and will affect the original plans for both
monitoring and evaluation.

For various reasons, planned activities and set targets may not be accomplished during a
reporting period of program/project implementation. Through monitoring of performance progress
reports and site supervision, and in relation to environmental factors, the program management
and QA team will discuss activity progress monthly and quarterly and set preconditions to avoid
any compliance issues. Each project staff member will then follow up on, guide and mentor
around the gaps observed, in order to maintain the expected delivery within the required time.

XIII. M&E STANDARDS


The organization considers and gives attention to the following monitoring and evaluation
standards for effective program implementation and facilitation.

M&E standard | Definition

M&E is integrated into programs. | Routine M&E is part of each program and implementation
approach, starting with a sufficient budget allocated for M&E. M&E, program, and technical
staff work closely to ensure high-quality monitoring and continuous data use through program
and technical refinement.

The program has adequate M&E human capacity. | Adequate skilled human capacity exists at all
levels of the M&E system to complete all tasks defined in the program M&E work plan and is
supported by human resources management practices.

Each program has a logic model, Performance Monitoring (or Management) Plan (PMP) and M&E work
plan. | For each program, a logic model, PMP, and M&E work plan are developed, implemented,
and updated annually.

Routine program monitoring systems are in place and maintained. | Routine data formats,
collection, flow, quality control, and entry that avoid double-counting are in place and
followed by each program component or sub-grantee.

The program uses databases to facilitate routine M&E data capture and analysis. | The
organization's centralized reporting system, training management system, and other databases
are regularly updated, and the database reports are used for donor reporting and to guide
program decision-making.

The program assesses and maintains high data quality. | Methods for updating and maintaining
high-quality data are in place and used.

The program plans and implements rigorous evaluations. | Activity related to evaluation
research is included in annual work plans and conducted accordingly.

The program uses and shares program data for decision-making. | The program has a data analysis
and dissemination plan in place and routinely shares and uses data for decision-making.

The office has an M&E plan outlining standard operating procedures. | An M&E plan is developed
and regularly updated to reflect the M&E needs across all programs. This includes human and
material resources, data collection procedures, timelines, and roles and responsibilities for
implementation of M&E systems to support all programs.

ANNEX A: ACTIVITY PERFORMANCE INDICATOR REFERENCE SHEET


Performance Indicator Reference Sheets (PIRS) should be completed for all indicators in the
AMEP.

Data Sources
Guidance: In terms of data sources, systems, procedures, tools, and collection methodology the
M&E plan should describe how data quality will be assured as to:

VALIDITY: The data should clearly and adequately represent the intended result
INTEGRITY: The data should have safeguards to minimize the risk of transcription error or data
manipulation
PRECISION: The data should have a sufficient level of detail to permit management decision-
making
RELIABILITY: The data should reflect consistent collection processes and analysis methods over
time
TIMELINESS: Data should be available at a useful frequency, be current, and timely enough to
influence management decision-making.)
Data Collection Methodology
Guidance: This section describes in detail who is responsible for data collection and
management (M&E manager, technical specialists, others), in what format (database,
spreadsheets, GIS) data will be managed, and who is responsible for producing which reports.
Aspects of quality control at all stages should be described. Relevant details about data
collection issues such as sampling, tool design, and the use of sub-contractors and project
staff for data collection will go here. The specific methods used to collect data for the
specified indicators are described in detail in each PIRS. Each indicator described in the
indicator target table should be covered by a PIRS. As different projects/programs are awarded,
the indicator target table and PIRS will be updated consistently with the context of the
project/program.

Performance Indicator Reference Sheet


Name of Activity Development Objective (or Goal or Purpose):
Name of Activity Intermediate Result:

Name of Activity Sub-Intermediate Result:

Name of Indicator:
Geographic focus:
DESCRIPTION
USAID/other Definition (if applicable):
Precise Definition(s):
Unit of Measure:
Method of calculation:
Disaggregated by:
Justification & Management Utility:
PLAN FOR DATA ACQUISITION
Data Collection Method:
Data Source(s):
Method of transfer to partners:
Frequency & Timing of Data Acquisition:
Estimated Cost of Data Acquisition:
Individual Responsible (title):
Individual Responsible for providing data to partners:
Location of data storage:
DATA QUALITY ISSUES
Date of Initial Data Quality Assessment:
Known Data Limitations and Significance (if any):
Actions Taken or Planned to Address Data Limitations:
Date of Future Data Quality Assessments:
Procedures for Future Data Quality Assessments:
PLAN FOR DATA ANALYSIS, REVIEW, & REPORTING
Data Analysis:
Presentation of Data:
Review of Data:
Reporting of Data:
OTHER NOTES
Notes on Baselines/Targets:
Other Notes:
PERFORMANCE INDICATOR VALUES
Year Baseline Target Actual Note/Comment
20--
20--
THIS SHEET LAST UPDATED ON: / /
Figure: Performance Indicator Reference Sheet

Annex B: DATA QUALITY ASSESSMENT FORM


Note: LIA-E and the M&E team are responsible for ensuring that data quality assessments for all
projects are conducted as required, and for conducting a formal DQA every quarter for all
indicators that are reported to USAID or other donor agencies. Moreover, quality procedures are
in place to ensure that performance reporting data meet DQA standard criteria. The data quality
checklist form is provided on the following pages.
Data Quality Checklist
Project/Activity Name:
Title of Performance Indicator:
[Indicator should be copied directly from the Performance Indicator Reference Sheet]
Linkage to Foreign Assistance Standardized Program Structure, if applicable (i.e. Program Area, Element, etc.):
Result This Indicator Measures (i.e., Specify the Development Objective, Intermediate Result, or Project Purpose,
etc.):
Data Source(s):
[Information can be copied directly from the Performance Indicator Reference Sheet]
Period for Which the Data Are Being Reported:
Is This Indicator a Standard or Custom Indicator? ____ Standard Foreign Assistance Indicator
____ Custom (created by the OU; not standard)
Is this indicator a required USAID indicator? ____ Y
____ N
Data Quality Assessment methodology:
[Describe here or attach to this checklist the methods and procedures for assessing the quality of the indicator
data. E.g. Reviewing data collection procedures and documentation, interviewing those responsible for data
analysis, checking a sample of the data for errors, etc.]
Date(s) of Assessment:
Assessment conducted by:

Category | Y | N | Not Applicable / Insufficient Information | Comments
Validity
Does the indicator reflect the intended results of the
activity – i.e. is it a useful indicator for activity
management?
Do the data being collected and reported match the
intent or language of the indicator?
Are the data collection methods (interviews,
observation, etc.) appropriate to produce good data?

Are the data collection procedures and/or sources relatively free of bias?
Are the people collecting the data qualified and/or
adequately experienced?
Are the people collecting the data properly
supervised?
Reliability
Are the definitions and procedures for data
collection, calculation and reporting clear and well
understood by all relevant staff?
Do the definitions and procedures for collecting and
calculating the data match the Mission PIRS if
applicable?
If not, please describe the differences.
Are data collection and analysis methods
documented in writing in a PIRS or another form?
Is a consistent data collection process used (describe any changes/differences observed if N):
From year to year?
In all activity locations/sites?
By all activity partners/sub-contractors?
Are there procedures in place for periodic review of
data collection, maintenance, and processing that can
detect data quality issues?
Has the partner identified significant data quality
limitations in the past?
Were these communicated to donor? If yes, describe
how.
Have these data quality limitations been addressed
by the partner? If yes, explain how.
Has the partner identified significant data quality
limitations in current data? If yes, please describe.
Are these limitations described in the indicator PIRS
or written data collection and analysis procedures? If
yes, please describe.
Are these limitations described in reporting to USAID/
other donor agency? If yes, please describe.
Timeliness
Are the data for this indicator reported to USAID/
other donor agency by the method (ex. Quarterly
Performance Data Table) and frequency required?
Is this format and schedule appropriate for
project/activity management? If no, describe how it
should be changed.
Precision
Is there a method for detecting duplicate data? If yes,
please describe.
If there is duplication of data, is the level of
duplication acceptable for this indicator? Describe
why or why not.
If there is unacceptable duplication of data, is it
identified in the PIRS under data limitations or
another section?
If there is unacceptable duplication of data, has
information on duplication been shared with
USAID/other donor agency? Describe how.
Is there a method for detecting missing data? If yes,
please describe.
If there are missing data, is the level acceptable for
this indicator? Describe why or why not.
If there are unacceptable amounts of missing data, is
this identified in the PIRS under data limitations or
another section?
If there are unacceptable amounts of missing data,
has information on missing data been shared with
USAID? Describe how.
Are the reported data disaggregated according to
USAID guidance?
Integrity
Are there procedures in place to check for
transcription errors at all levels of the data collection
and reporting system?
Are there proper safeguards in place to prevent
unauthorized changes to the data?
Are there procedures in place to ensure unbiased
analysis of data and subsequent reporting?
Are there safeguards in place to ensure that all
relevant tools, tracking sheets and data are backed up
and protected from data loss?
IF NO DATA ARE AVAILABLE FOR THE INDICATOR COMMENTS
If no recent relevant data are available for this
indicator, why not?
What concrete actions are now being taken to collect
and report these data as soon as possible or on
schedule?
When will data be reported?
SUMMARY (where multiple items are listed by the assessor in each row, they should be numbered so that it is
clear what recommendations apply to which limitations)
Based on the assessment above, what is the overall conclusion regarding the quality of the data?

What limitations, if any, were observed and what actions should be taken to address these limitations?

Final agreed upon actions and timeframe needed to address limitations prior to the next DQA:

ANNEX C: DATA COLLECTION TOOLS


Add as an annex any forms used to collect data, data vetting procedures, survey questionnaires
used, and sample design information.
