LIA-E M&E Guideline
August 2019
Addis Ababa
Table of Contents
1. Introduction
2. Program Description and Framework
3. Detailed Description of Plan Indicators
4. Indicator Target Table
6. Deliverable Schedule
I. INTRODUCTION
The introduction to the M&E plan should include information about the purpose of the
program; the specific M&E activities that are needed and why they are important; and a
development history that provides information about the motivations of the internal and
external stakeholders and the extent of their interest, commitment and participation.
Every project intervention should have a monitoring and evaluation (M&E) plan. This is the
fundamental document that details a program's objectives and the interventions developed to
achieve them, and describes the procedures that will be implemented to determine
whether or not the objectives are met. It shows how the expected results of a program relate
to its goals and objectives; describes the data needed, how these data will be collected and
analyzed, and how this information will be used; identifies the resources that will be needed; and
explains how the program will be accountable to stakeholders. M&E plans should be created during
the design phase of a program and can be organized in a variety of ways. Typically, they include: the
underlying assumptions on which the achievement of program goals depends; the anticipated
relationships between activities, outputs and outcomes; well-defined conceptual measures and
definitions, along with baseline values; the monitoring schedule; a list of data sources to be
used; cost estimates for the M&E activities; a list of the partnerships and collaborations that will
help achieve the desired results; and a plan for the dissemination and utilization of the
information gained.
An M&E plan states how a program will measure its achievements and therefore provides
accountability; documents consensus and provides transparency; guides the implementation of
M&E activities in a standardized and coordinated way; and preserves institutional memory.
II. PROGRAM DESCRIPTION AND FRAMEWORK
Sub-Intermediate Result (S-IR): the Sub-IR from the country strategy that will be your
sub-outcome/sub-purpose in the logical framework (LF).
Inputs: the key inputs required to undertake activities; these should mirror the inputs as they
appear in the logical framework.
LOGICAL FRAMEWORK
The logical framework is a methodology for analyzing the context of a proposed project
intervention, formulating objectives and defining criteria that can be used to measure its
impact. The analysis starts from the idea that the actions of the project will produce
changes that can be predicted, observed and measured, at least to a certain extent. The
organization will also follow this type of framework for need-based projects, depending on
project nature.
Inputs: the nature and level of resources necessary, their cost and the planned starting date;
the sources of information and methods to be used; and the initial assumptions about the
project.
III. INDICATORS
Indicators are clues, signs or markers that measure one aspect of a program and show how
close a program is to its desired path and outcomes. They are used to provide benchmarks for
demonstrating the achievements of a program. One of the most critical steps in designing an
M&E system is selecting appropriate indicators. The M&E plan should include descriptions of
the indicators that will be used to monitor program implementation and achievement of the
goals and objectives.
An indicator is a variable that measures one aspect of a program or project that is directly
related to the program's objectives. Let's take a moment to go over each piece of this
definition. First, an indicator is a variable whose value changes from the baseline level at the
time the program began to a new value after the program and its activities have made their
impact felt; at that point, the variable, or indicator, is calculated again. Secondly, an indicator
is a measurement: it measures the value of the change in meaningful units that can be compared
to past and future units, usually expressed as a percentage or a number. Finally, an indicator
focuses on a single aspect of a program or project. This aspect may be an input, an output or an
overarching objective, but it should be narrowly defined in a way that captures this one aspect
as precisely as possible. A reasonable guideline recommends one or two indicators per result,
and at least one indicator for each activity.
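To make the arithmetic concrete, the following is a minimal Python sketch of how a percentage indicator is calculated at baseline and again after program activities. The function name and all figures are invented for illustration; nothing here is prescribed by this guideline.

```python
# Hypothetical sketch: calculating a percentage indicator and its change
# from baseline. All figures are invented for illustration.

def enrollment_rate(enrolled: int, eligible: int) -> float:
    """Percentage of eligible children enrolled in school."""
    if eligible == 0:
        raise ValueError("eligible population must be greater than zero")
    return 100.0 * enrolled / eligible

baseline = enrollment_rate(enrolled=420, eligible=1000)  # value when the program began
endline = enrollment_rate(enrolled=630, eligible=1000)   # value after program activities

change = endline - baseline  # change measured in percentage points
print(f"baseline={baseline:.1f}%  endline={endline:.1f}%  change={change:+.1f} points")
```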
Indicators can be either quantitative or qualitative. Quantitative indicators are numeric and
are presented as numbers or percentages. Qualitative indicators are descriptive observations
and can be used to supplement the numbers and percentages provided by quantitative
indicators; they complement quantitative indicators by adding a richness of information about
the context in which the program has been operating. Examples include "availability of a clear,
strategic organizational mission statement" and "existence of a multi-year procurement plan
for each product offered."
IV. INDICATOR TARGET TABLE
| S/N | Indicator | Baseline | Target | Purpose, Calculation and Data Source |
|---|---|---|---|---|
| 1 | Percentage of eligible children enrolled in school | Number or N/A | Number | Purpose: to track the percentage of eligible children who are enrolled in school. Calculation: the number of children enrolled for school services. Data source: school registration records and other educational-institution records that trace children's eligibility for school enrollment. |
| | Disaggregation by sex (M/F) | Males: #, Females: # | # Males, # Females | LIA-E will take the baseline data source disaggregated for this specific indicator, i.e. the relative number of males (boys) and females (girls) enrolled in school. If this is not available, use the ratio from a related school facility. |
| | Disaggregation by age (0-18) | Number or N/A | Number | The number of children in the stated age range in the baseline and target columns. Adults are not to be included in this disaggregation. |
| 2 | Percentage of eligible girls attending school | Number or N/A | Number | Purpose: to identify the percentage of girl students who are attending school. Calculation: students recorded or reported as attending the school, out of all students at the school. Data source: school attendance sheets or other documents used for attendance, kept electronically or manually. |
| | Disaggregation by sex (M/F) | Males: #, Females: # | # Males, # Females | LIA-E will take the baseline data source disaggregated by male and female, i.e. the relative number of males (boys) and females (girls) attending school. If this is not available, use the ratio from a related school facility. |
| | Disaggregation by age (0-18) | Number or N/A | Number | The number of children in the stated age range in the baseline and target columns. Adults are not to be included in this disaggregation. |
| 3 | Number of school dropouts | Number or N/A | Number | Purpose: to track the number of students who drop out of school for various reasons. Calculation: the increment or decrement in the number of dropout students during the semester or academic year. Data source: rosters of students who did not complete the academic year or semester, as well as school attendance records. |
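The calculations and disaggregations in the table above can also be expressed programmatically. The following Python sketch uses a made-up roster and assumed field names to show how the enrollment indicator would be computed with its sex and age disaggregations.

```python
# Hypothetical sketch: enrollment indicator with sex and age disaggregation.
# The roster records and field names are assumptions for illustration.
from collections import Counter

children = [
    {"sex": "M", "age": 9, "enrolled": True},
    {"sex": "F", "age": 12, "enrolled": True},
    {"sex": "F", "age": 7, "enrolled": False},
    {"sex": "M", "age": 15, "enrolled": True},
]

eligible = [c for c in children if 0 <= c["age"] <= 18]  # age disaggregation (0-18)
enrolled = [c for c in eligible if c["enrolled"]]

rate = 100.0 * len(enrolled) / len(eligible)
by_sex = Counter(c["sex"] for c in enrolled)  # disaggregation by sex (M/F)

print(f"Percentage of eligible children enrolled: {rate:.1f}%")
print(f"Males: {by_sex['M']}, Females: {by_sex['F']}")
```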
Data sources are the sources of information used to collect the data needed to calculate the
indicators. The data collection plan should include diagrams depicting the systems used for data
collection, processing, analysis and reporting. The strength of these systems determines the
validity of the information obtained. Potential errors in data collection, or in the data
themselves, must be carefully considered when determining the usefulness of data sources.
Data source
Data sources are the resources used to obtain data for M&E activities. There are several levels
from which data can come, including client, program, service environment, population, and
geographic levels. Regardless of level, data are commonly divided into two general categories:
routine and non-routine. Routine data sources provide data that are collected on a continuous
basis, such as information that clinics collect on the patients utilizing their services. Although
these data are collected continuously, processing them and reporting on them usually occur
only periodically, for instance, aggregated monthly and reported quarterly. Data collection from
routine sources is useful because it can provide information on a timely basis. For instance, it
can be used effectively to detect and correct problems in service delivery. However, it can be
difficult to obtain accurate estimates of catchment areas or target populations through this
method, and the quality of the data may be poor because of inaccurate record keeping or
incomplete reporting. Non-routine data sources provide data that are collected on a periodic
basis, usually annually or less frequently. Depending on the source, non-routine data can avoid
the problem of incorrectly estimating the target population when calculating coverage
indicators. This is particularly the case with representative population-based surveys, such as a
Demographic Health Survey (DHS). Non-routine data have two main limitations: collecting them
is often expensive, and this collection is done on an irregular basis. In order to make informed
program decisions, program managers usually need to receive data at more frequent intervals
than non-routine data can accommodate.
Data Collection
The M&E plan should include a data collection plan that summarizes information about the
data sources needed to monitor and evaluate the program. The plan should include information
for each data source, such as: the timing and frequency of collection; the person or agency
responsible for the collection; the information needed for the indicators; and any additional
information that will be obtained from the source.
Data Quality
Throughout the data collection process it is essential that data quality be monitored and
maintained. Data quality is important to consider when determining the usefulness of various
data sources; the data collected are most useful when they are of the highest quality.
It is important to use the highest quality data obtainable, but this often requires a
trade-off with what is feasible to obtain. The highest quality data are usually obtained
through the triangulation of data from several sources. It is also important to remember that
behavioral and motivational factors on the part of the people collecting and analyzing the data
can also affect data quality.
VI. DELIVERABLE SCHEDULE
The deliverables schedule of the project provides important due dates for major undertakings
and requirements such as work plan submission, reporting, major evaluations, assessments and
other undertakings. The following are the major deliverables of the project that will be produced
or achieved in the course of implementation.
| Deliverable | Audience | Date deliverable is due (planned) | Actual date it was delivered | Means of delivery |
|---|---|---|---|---|
| Quarterly Progress Reports (Narrative) | LIA-E Head Quarter Office | Quarter 1: Dec. 20; Quarter 2: Mar. 20; Quarter 3: June 20; Quarter 4: Sept. 20 | 25th, after the due date | Electronic |
| Monthly Progress Reports (Quantitative report) | LIA-E Head Quarter Office | Jan.-Dec., on the 5th day of the next month | | Electronic |
| Quarterly Progress Reports (Narrative) | LIA-E Regional Offices | Quarter 1: Dec. 20; Quarter 2: Mar. 20; Quarter 3: June 20; Quarter 4: Sept. 20 | 18th, before the due date | Electronic |
| Monthly Progress Reports (Quantitative report) | LIA-E Regional Offices | Jan.-Dec., on the 3rd day of the next month | | Electronic |
| Monthly Progress Reports (Quantitative report) | LIA-E Field Offices | Jan.-Dec., on the 1st day of the next month | | Electronic |
Table. Deliverable Schedule
LIA-E and the project team are deeply committed to data quality and do not tolerate any
falsification of data describing any of our partners or their data management processes. To
that end, we work closely with partners at all levels to enhance skills in data quality
management and ensure that the information generated is of high quality. LIA-E will also
provide implementing partners' M&E staff with a five-day training on data quality management,
as well as ongoing one-to-one mentoring/orientation through regular site visits. During such
visits, the M&E team will work with partners to review data collection and management
procedures and conduct routine data quality assessments in various sites. To enable
identification of the sites likely to carry the highest data quality risks, we will maintain a
data management dashboard for all partners that allows quick access to key parameters for
tracking key results from the program on a quarterly basis, as well as key parameters of data
quality.
Operating procedures for monitoring data quality, data verification and validation
Ongoing monitoring of data quality and skills building for risk assessment
The LIA-E M&E team will work with partners to help them build skills in data risk assessment,
a particularly essential skill for this program given that data will originate primarily from
less skilled community volunteers working in project intervention areas. Risk assessment helps
M&E professionals determine the potential level of error that may exist in the data and the
overall effect that error would have on their ability to use the data, so that they can
prioritize their time and resources in areas with higher risk. Using the data management tools,
quarterly data will be tracked by the LIA-E M&E team in the regions and at the head office. The
tool will enable access to key performance data on program outputs for key results at different
sites, as reported by the partners. Data quality will be rated with regard to timeliness,
completeness and accuracy (particularly in relation to validity and reliability).
The dashboard reflects ratings based on a 3-point scale where 1 = high risk, 2 = medium risk and
3 = low risk. The figure below is an example of what the dashboard would look like for three
partners rated on the three parameters. Implementing partners are also encouraged and supported
to maintain similar tools to track the quality of data received from their various sites.
| CSO Partner | Region | Completeness | Accuracy | Timeliness | Average | Comment |
|---|---|---|---|---|---|---|
| A | | 1.7 | 1.5 | 1.0 | 1.4 | |
| B | | 2.3 | 2.0 | 2.7 | 2.3 | |
| C | | 2.5 | 2.0 | 2.7 | 2.4 | |
Figure .Program Data Quality Management Dashboard
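As an illustration only, the dashboard's average column and a simple triage rule could be computed as in the Python sketch below; the risk threshold of 2.0 is an assumption, not a rule set by this guideline.

```python
# Hypothetical sketch: averaging dashboard ratings on the 3-point scale
# (1 = high risk, 2 = medium risk, 3 = low risk) and flagging partners.
from statistics import mean

ratings = {
    "A": {"completeness": 1.7, "accuracy": 1.5, "timeliness": 1.0},
    "B": {"completeness": 2.3, "accuracy": 2.0, "timeliness": 2.7},
    "C": {"completeness": 2.5, "accuracy": 2.0, "timeliness": 2.7},
}

for partner, scores in ratings.items():
    avg = mean(scores.values())
    # Partners averaging below 2.0 lean toward the high-risk end of the scale;
    # the threshold is an illustrative assumption.
    flag = "prioritize for support visit" if avg < 2.0 else "routine follow-up"
    print(f"Partner {partner}: average={avg:.1f} -> {flag}")
```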
Site Visits for mentoring partners and undertaking data verification and validation
The LIA-E head office routinely undertakes site visits to its partners to provide mentoring and
examine data quality issues. These site visits are routinely undertaken jointly by a program
staff member and regional-level staff, together with stakeholders, and provide an opportunity
to review program operations at the site. The site visits also provide an opportunity to
individualize technical assistance based on needs expressed by the partner or identified during
the visit. The M&E team will undertake site visits to each program/partner at least once every
quarter. However, project staff/partners experiencing data quality and management problems are
visited more often in order to provide the technical assistance required to resolve those
issues. A site visit checklist for monitoring implementation of data quality management systems
at various levels is presented below.
1. Routine Data Quality Tool: Guidelines for Implementation, 2008. The Global Fund to Fight
AIDS, Tuberculosis and Malaria; Office of the Global AIDS Coordinator; PEPFAR; USAID; WHO;
UNAIDS; MEASURE Evaluation.
| Actor | Major duties and responsibilities |
|---|---|
| Volunteers | Complete beneficiary enrollment based on eligibility requirements |
| Facilitators | Review and verify data received on beneficiary enrollment and services delivered; review, verify and approve referrals by volunteers |
| Field staff | Review data from volunteers and collect all the forms reflecting services rendered; undertake data verification and validation processes; submit the original hard copies of the report to CSOs (where the database is kept) |
Data Analysis
To ensure effective and regular data analysis, every project site will work closely with CSO
partners and government officials, particularly at the Woreda and regional levels, to undertake
analysis on a quarterly basis. This will involve facilitating data analysis meetings/workshops
for stakeholders, during which a comprehensive review of key variables and interrogation of the
data is undertaken. The analysis will include review of disaggregated data generated from the
database for key indicators. Another key variable for analysis is the geographic difference in
reach for the different types of services. Such analysis will be essential in informing planning
processes as program implementation proceeds, as well as providing a means to identify areas
that may require strengthening. Through working closely with and mentoring our program
partners, LIA-E will strengthen data analysis skills among both government and CSOs, thereby
improving overall capacity to transform raw data into key strategic information for decision
making.
A key consideration for LIA-E when planning for data analysis is to determine and understand
the purpose of the data collection effort: why do we collect the data, and what are we trying
to learn? This then informs the following questions:
- What type of analysis should we do for each type of key variable?
- What groups of variables or measurements do we want to compare?
- How precise do our findings need to be?
- What do our stakeholders want and need to know?
- What is the final product our stakeholders will see?
In addition to the analysis of routine data, the project team will undertake periodic analysis
of non-routine data, as well as other data generated from evaluations of the program (baseline,
midterm and endline).
| Level | Type of analysis/purpose | Types of data comparisons for analysis | Frequency of reporting |
|---|---|---|---|
| CSO | The CSOs will analyze service data to evaluate their own performance (program efficiency and effectiveness in meeting targets) within a given period of time, to document lessons and extract best practices. | Program reach by gender, age group and geographic differences. Performance of facilitators; volunteer-to-facilitator ratios; types of support supervision activities by different facilitators. Performance of community volunteers; beneficiary-to-volunteer ratios; range of services reported by community volunteers. | Every month |
| LIA-E Regional offices | Regional offices will analyze the extent to which CSOs in their regions are meeting performance targets. They will be interested in comparing the performance of CSOs in order to identify areas where additional technical support may be required. | Program reach by gender, age group and geographic differences, with data compared across CSOs and Woredas. Performance of CSOs with regard to meeting targets and supporting community volunteers; number of community structures supported by the CSO; number of people trained by the CSO. | Every month; every quarter |
| LIA-E Head office | LIA-E will analyze data from CSOs (both routine and non-routine) to compare program performance against the plan, to track changes over program delivery, and to examine implementation of standard guidelines and policies for program intervention areas. | Program reach by gender, age group and geographic differences, with data compared across CSOs, Woredas and Regions. Lessons learnt about what influences performance. Differences in implementation of national quality standards across geographic regions or by CSO capacity level. Service delivery results against established client needs. | Every quarter; annually |
Table. Illustrative Data Analysis at Different Levels
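As a concrete illustration of the disaggregated comparisons listed in the table, the following Python sketch (using pandas, invented service records and assumed column names) computes program reach by geography, sex and age group.

```python
# Hypothetical sketch: disaggregated analysis of program reach.
# Records and column names are invented for illustration.
import pandas as pd

records = pd.DataFrame([
    {"gender": "F", "age": 8, "woreda": "W1", "service": "enrollment"},
    {"gender": "M", "age": 14, "woreda": "W1", "service": "attendance"},
    {"gender": "F", "age": 16, "woreda": "W2", "service": "enrollment"},
    {"gender": "M", "age": 5, "woreda": "W2", "service": "enrollment"},
])

# Illustrative banding of the 0-18 age range into two groups.
records["age_group"] = pd.cut(records["age"], bins=[0, 9, 18], labels=["0-9", "10-18"])

# Program reach disaggregated by geography, sex and age group.
reach = records.groupby(["woreda", "gender", "age_group"], observed=True).size()
print(reach)
```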
IX. MONITORING PLAN
The monitoring plan describes: the specific program components that will be monitored, such as
provider performance or the utilization of resources; how this monitoring will be conducted;
and the indicators that will be used to measure results. Because monitoring is concerned with
the status of ongoing activities, output indicators, also known as process indicators, are used.
The LIA-E M&E team will set a monitoring plan to follow up on the progress of activity
performance and on service quality, based on the project's monitoring system. Indicators to be
monitored might include the following:
- Service delivery against the target plan in the different program interventions;
- Percentage of activity coverage during the reporting period (a sketch of this calculation follows the list);
- Service quality while providing support through the various project activities;
- Capacity building for project actors and stakeholders to sustain project/program activities;
- Training provision to improve service quality and enable efficient project implementation;
- Resource utilization and program management to run project activities in a timely manner.
Monitoring will be done on a monthly and quarterly basis, after service provision and at
reporting time.
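A minimal sketch of the activity-coverage check referenced above; the activity names, figures and the 90% threshold are all illustrative assumptions.

```python
# Hypothetical sketch: percentage of activity coverage against the target plan.
# Activity names, figures and the 90% threshold are invented for illustration.
targets = {"enrollment support": 500, "teacher training": 40, "material supply": 120}
actuals = {"enrollment support": 430, "teacher training": 40, "material supply": 95}

for activity, target in targets.items():
    coverage = 100.0 * actuals[activity] / target
    status = "on track" if coverage >= 90.0 else "needs follow-up"
    print(f"{activity}: {coverage:.0f}% of target ({status})")
```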
X. EVALUATION PLAN
The evaluation plan provides the specific research design and methodological approaches to be
used to identify whether changes in outcomes can be attributed to the program. For instance, if
a program wants to test whether quality of patient care can be improved by training providers,
the evaluation plan would identify a research design that could be used to measure the impact
of such an intervention. One way this could be investigated would be through a quasi-
experimental design in which providers in one facility are given a pretest, followed by the
training and a post-test. For comparison purposes, a similar group of providers from another
facility would be given the same pretest and post-test, without the intervening training. Then
the test results would be compared to determine the impact of the training.
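The comparison described above can be illustrated with a short Python sketch. The scores are invented, and the simple difference-in-gains calculation shown is one plausible way to analyze such a design, not the only one.

```python
# Hypothetical sketch: pre/post test comparison across an intervention
# facility and a comparison facility. All scores are invented.
from statistics import mean

trained_pre = [62, 58, 70, 65]   # pretest, providers who receive training
trained_post = [81, 74, 88, 79]  # post-test, same providers
control_pre = [60, 63, 66, 59]   # pretest, comparison facility (no training)
control_post = [64, 65, 70, 61]  # post-test, comparison facility

gain_trained = mean(trained_post) - mean(trained_pre)
gain_control = mean(control_post) - mean(control_pre)

# The difference between the two gains estimates the training's impact,
# net of changes that would have occurred without it.
print(f"impact estimate: {gain_trained - gain_control:+.1f} points")
```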
Collecting data is only meaningful and worthwhile if it is subsequently used for evidence-based
decision-making. To be useful, information must be based on quality data, and it also must be
communicated effectively to policy makers and other interested stakeholders. M&E data need
to be manageable and timely, reliable, and specific to the activities in question. Additionally,
the results need to be well understood. The key to effective data use involves linking the data
to the decisions that need to be made and to those making these decisions. The decision-maker
needs to be aware of relevant information in order to make informed decisions. For example, if
sales data from a program to provide insecticide-treated bed-nets show that the program is
successfully increasing bed-net distribution, the decision maker may decide to maintain the
program as it is. Alternatively, the data may prompt the implementation of a new distribution
system and could spur additional research to test the effectiveness of this new strategy
compared to the existing one.
The LIA-E M&E team disseminates project reports and activity updates to relevant government
stakeholders and donors at regular intervals. How the information gathered will be stored,
disseminated and used should be defined at the planning stage of the project and described in
the M&E plan. This helps ensure that findings from M&E efforts are not wasted because they go
unshared. The various users of this information should be clearly defined, and the reports should
be written with specific audiences in mind. Dissemination channels can include written reports,
press releases and stories in the mass media, and speaking events. When decision-makers
understand the kinds of information that can be used to inform decisions and improve results,
they are more likely to seek out and use this information.
Narrative reports reflecting key activities extracted from the database and other reporting
sources, together with challenges, success stories and lessons learnt, will be developed by
LIA-E regional offices as well as by the head office implementation team. These reports will be
essential in informing key stakeholders and USAID/other donor agencies about program
implementation and achievements. Consolidated reports from the regional offices will be
submitted to the Program and M&E Unit monthly and quarterly, according to priority.
The capacities needed to implement the efforts described in the M&E plan should be included
in the document. A mechanism for reviewing and updating the M&E plan should also be
included. This is because changes in the program can and will affect the original plans for both
monitoring and evaluation.
For various reasons, planned activities and set targets may not be accomplished during a
reporting period of program/project implementation. Through monitoring of performance progress
reports and site supervision in relation to environmental factors, the program management and
QA team will discuss activity progress monthly and quarterly and set preconditions to avoid any
compliance issues. Each project staff member will then follow up, guide and mentor on the gaps
observed, to maintain the expected delivery within the required time.
| M&E component | Standard |
|---|---|
| Each program has a logic model, Performance Monitoring (or Management) Plan (PMP) and M&E work plan | For each program, a logic model, PMP and M&E work plan are developed, implemented and updated annually. |
| Routine program monitoring systems are in place and maintained | Routine data formats, collection, flow, quality control and entry that avoid double-counting are in place and followed by each program component or sub-grantee. |
| The program uses databases to facilitate routine M&E data capture and analysis | The organization's centralized reporting system, training management system and other databases are regularly updated, and the database reports are used for donor reporting and to guide program decision-making. |
| The program assesses and maintains high data quality | Methods for updating and maintaining high-quality data are in place and used. |
| The program plans and implements rigorous evaluations | Activity related to evaluation research is included in annual work plans and conducted accordingly. |
| The program uses and shares program data for decision-making | The program has a data analysis and dissemination plan in place and routinely shares and uses data for decision-making. |
| The office has an M&E plan outlining standard operating procedures | An M&E plan is developed and regularly updated to reflect the M&E needs across all programs. This includes human and material resources, data collection procedures, timelines, and roles and responsibilities for implementation of M&E systems to support all programs. |
Data Sources
Guidance: In terms of data sources, systems, procedures, tools and collection methodology, the
M&E plan should describe how data quality will be assured with respect to:
VALIDITY: The data should clearly and adequately represent the intended result.
INTEGRITY: The data should have safeguards to minimize the risk of transcription error or data
manipulation.
PRECISION: The data should have a sufficient level of detail to permit management
decision-making.
RELIABILITY: The data should reflect consistent collection processes and analysis methods over
time.
TIMELINESS: The data should be available at a useful frequency, be current, and be timely
enough to influence management decision-making.
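By way of illustration, two of these criteria, completeness (a facet of integrity) and timeliness, lend themselves to simple automated checks. The Python sketch below uses an assumed record layout and due date; nothing in it is prescribed by this guideline.

```python
# Hypothetical sketch: automated completeness and timeliness checks on a
# report record. Field names, layout and dates are illustrative assumptions.
from datetime import date

REQUIRED_FIELDS = ("indicator", "value", "period", "reported_on")

def check_record(record: dict, due: date) -> list:
    """Return a list of data-quality issues found in one report record."""
    issues = []
    for field in REQUIRED_FIELDS:  # completeness check
        if record.get(field) in (None, ""):
            issues.append(f"missing field: {field}")
    reported = record.get("reported_on")
    if isinstance(reported, date) and reported > due:  # timeliness check
        issues.append(f"late report: {reported} (due {due})")
    return issues

record = {"indicator": "enrollment", "value": 430, "period": "Q1",
          "reported_on": date(2019, 12, 23)}
print(check_record(record, due=date(2019, 12, 20)))
```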
Data Collection Methodology
Guidance: This section describes in detail who is responsible for data collection and
management (M&E manager, technical specialists, others), and in what format (database,
spreadsheets, GIS) data will be managed, and who is responsible for producing which reports.
Aspects of quality control at all stages should be described. Relevant details about types of
data collection issues such as sampling, tool design, use of sub-contractors and project staff for
data collection, etc. will go here. The specific methods used to collect data for the specified
indicators are described in detail in each Performance Indicator Reference Sheet (PIRS). Each
indicator described in the indicator target table should be elaborated in its own PIRS. As
different projects/programs are awarded, the indicator target table and PIRS will be updated to
remain consistent with the context of each project/program.
Name of Indicator:
Geographic focus:
DESCRIPTION
USAID/other Definition (if applicable):
Precise Definition(s):
Unit of Measure:
Method of calculation:
Disaggregated by:
Justification & Management Utility:
PLAN FOR DATA ACQUISITION
Data Collection Method:
Data Source(s):
Method of transfer to partners:
Frequency & Timing of Data Acquisition:
Estimated Cost of Data Acquisition:
Individual Responsible (title):
Individual Responsible for providing data to partners:
Location of data storage:
DATA QUALITY ISSUES
Date of Initial Data Quality Assessment:
Known Data Limitations and Significance (if any):
Actions Taken or Planned to Address Data Limitations:
Date of Future Data Quality Assessments:
Procedures for Future Data Quality Assessments:
PLAN FOR DATA ANALYSIS, REVIEW, & REPORTING
Data Analysis:
Presentation of Data:
Review of Data:
Reporting of Data:
OTHER NOTES
Notes on Baselines/Targets:
Other Notes:
PERFORMANCE INDICATOR VALUES
| Year | Baseline | Target | Actual | Note/Comment |
|---|---|---|---|---|
| 20-- | | | | |
| 20-- | | | | |
THIS SHEET LAST UPDATED ON: / /
Figure: Performance Indicator Reference Sheet
| Category | Y | N | Not Applicable / Insufficient information | Comments |
|---|---|---|---|---|
| Validity | | | | |
| Does the indicator reflect the intended results of the activity, i.e. is it a useful indicator for activity management? | | | | |
| Do the data being collected and reported match the intent or language of the indicator? | | | | |
| Are the data collection methods (interviews, observation, etc.) appropriate to produce good data? | | | | |
What limitations, if any, were observed and what actions should be taken to address these limitations?
Final agreed upon actions and timeframe needed to address limitations prior to the next DQA: