Approaches To RBM TEN Steps 20th December 2023
January 2023
Assosa
Designing and Building a Results-Based Monitoring and Evaluation System: A Tool for Public Sector Management
The Power of Measuring Results
Reasons to Do Results-Based M&E
Reasons to Do Results-Based M&E (cont.)
• Focuses attention on achieving outcomes important to the organization and its stakeholders
• Provides timely, frequent information to staff
• Helps establish key goals and objectives
• Permits managers to identify and take action to correct weaknesses
• Supports a development agenda that is shifting towards greater accountability for aid lending
• Creates ownership
Important…
• It takes leadership commitment to achieve a better-performing organization
• Plus redeployment of resources to building monitoring and evaluation systems
• Plus individuals committed to improving public sector performance
Definition
Results-Based Monitoring (what we will call “monitoring”) is a continuous process of collecting and analyzing information to compare how well a project, program, or policy is being implemented against expected results.
Results-Based Monitoring
• Outcomes: intermediate effects of outputs on clients
• Activities: tasks personnel undertake to transform inputs into outputs
Evaluation Addresses “Why” Questions
What caused the changes we are monitoring?
Project evaluations, for example:
• Assessing the improvement in water fee collection rates in 2 provinces
• Assessing the farming practices of resettled farmers in one province
Step One: Conducting a Readiness Assessment
The Ten Steps:
1. Conducting a Readiness Assessment
2. Agreeing on Outcomes to Monitor and Evaluate
3. Selecting Key Indicators to Monitor Outcomes
4. Baseline Data on Indicators – Where Are We Today?
5. Planning for Improvement – Selecting Results Targets
6. Monitoring for Results
7. The Role of Evaluations
8. Reporting Your Findings
9. Using Your Findings
10. Sustaining the M&E System Within Your Organization
Why Do a Readiness Assessment?
It will help identify the barriers and obstacles (structural, cultural, political, or individual) in a given organization:
1. To understand what incentives (or lack thereof) exist to effectively monitor and evaluate development goals
2. To understand the roles and responsibilities of those organizations and individuals involved in monitoring and evaluating government policies, programs, and projects, e.g.:
– Supreme Audit Office
– Ministry of Finance / Bureau of Finance
– Parliament / regional council
– Ministry of Planning / Bureau of Planning Commission
3. To identify issues related to the capacity (or lack thereof) to monitor and evaluate government programs
Capacity
Barriers
Key Elements of Success
Step Two: Agreeing on Outcomes to Monitor and Evaluate
Why an Emphasis on Outcomes?
Issues to Consider in Choosing Outcomes to Monitor and Evaluate
• Are there stated national/sectoral goals?
• Have political promises been made that specify improved performance of the government?
• Do citizen polling data indicate specific concerns?
• Is authorizing legislation present?
• Other? (Millennium Development Goals)
• Is aid lending linked with specific goals?
Developing Outcomes for One Policy Area (Example: Education)
1. Nation’s children have improved access to pre-school programs.
2. Primary school learning outcomes for children are improved.
In Summary: Why an Emphasis on Outcomes?
• Makes explicit the intended objectives of government action (“Know where you are going before you get moving”)
• Outcomes are the results governments hope to achieve
• Clearly setting outcomes is key to a results-based M&E system
• Note: Budget to outputs, manage to outcomes!
Outcomes Summary Continued
Step Three: Selecting Key Performance Indicators to Monitor Outcomes
Selecting Key Performance Indicators to Monitor Outcomes
• What? An indicator is a key statistical measure selected to help describe (indicate) a situation concisely, track progress and performance, and act as a guide to decision making (AIHW, 2008a).
• Why? Indicators are standardized measures that allow for comparisons over time, over different geographic areas, and/or across programmes
• Outcome indicators are not the same as outcomes
• Each outcome needs to be translated into one or more indicators
– An outcome indicator identifies a specific numerical measurement that tracks progress (or not) toward achieving an outcome
How Many Indicators Are Enough?
Outcome: Reduction in Childhood Morbidity (Exercise)
Are the following indicators outcome indicators or not?
• % reduction in missed school days due to illness
• % reduction in hospital admissions due to illness
• More medical doctors hired
• % change in prevalence of communicable diseases
• Number of children immunized
• % of working days missed by parents
• % change in childhood gastrointestinal diseases
Developing a Set of Outcome Indicators for a Policy Area (Example: Education)
The matrix has four columns: Outcomes/Goals, Indicators, Baselines, Targets (the Baselines and Targets columns are left blank at this stage).
Outcome 1: Nation’s children have improved access to pre-school programs
• Indicator 1: % of eligible urban children enrolled in pre-school education
• Indicator 2: % of eligible rural children enrolled in pre-school education
Outcome 2: Primary school learning outcomes for children are improved
• Indicator: % of Grade 6 students scoring higher on standardized math and science tests in comparison to baseline data
In Summary: Developing Indicators
• You will need to develop your own indicators to meet your own needs.
• Developing good indicators often takes more than one try!
• Arriving at the final indicators you will use will take time!
• Pilot, Pilot, Pilot!
Step Four: Baseline Data on Indicators – Where Are We Today?
Establishing Baseline Data on Indicators
Building Baseline Information
For each indicator, the baseline plan specifies: data source; data collection method; who will collect the data; frequency of data collection; cost to collect; difficulty to collect; and who will analyze and report the data.
Data Collection Methods
• Key informant interviews
• Conversations with concerned individuals
• Focus group interviews
• Community interviews
• Participant observation
• Direct observation
• Panel surveys
• One-time surveys
• Census
• Reviews of official records (MIS and administrative data)
• Field visits
• Field experiments
• Questionnaires
Developing Baseline Data for One Policy Area (Example: Education)
Step Five: Planning for Improvement – Selecting Results Targets
Definition
Targets are the quantifiable levels of the indicators that a country or organization wants to achieve at a given point in time.
For example: Agricultural exports will increase by 20% in the next three years over the baseline.
Identifying the expected or desired level of project, program, or policy results requires selecting performance targets:
Baseline indicator level + Desired level of improvement = Target performance
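The target-setting arithmetic above can be made concrete. A minimal sketch in Python, using the agricultural-exports example from the earlier definition (the function name and the baseline value of 100 are illustrative assumptions, not from the slides):

```python
def set_target(baseline: float, improvement_pct: float) -> float:
    """Target performance = baseline indicator level + desired improvement."""
    return baseline * (1 + improvement_pct / 100)

# Agricultural exports are to increase by 20% over the baseline
# within three years (the baseline value itself is illustrative).
baseline_exports = 100.0
target = set_target(baseline_exports, 20)
print(target)  # 120.0
```

For a new indicator, the same sketch could return a (low, high) range instead of a single number, in line with the caution about firm targets for previously unused indicators.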
Additional Considerations in Setting Indicator Targets
• Only one target is desirable for each indicator
• If the indicator is new (not previously used), be careful about setting firm targets (use a range)
• Most targets are set yearly, but some could be set quarterly; others are set for longer periods (not more than 5 years)
• It takes time to observe the effects of improvements; therefore, be realistic when setting targets
Now We Have a Results Framework
Note: This completed matrix becomes your results framework!
– It defines your goals and gives you a plan for how you will know if you have been successful (or not) in achieving these goals
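A results framework of this kind is easy to hold as structured data. A minimal sketch, using the education example from the earlier slides; the field names and the baseline/target numbers are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    baseline: float  # where are we today?
    target: float    # desired level at a given point in time

@dataclass
class Outcome:
    statement: str
    indicators: list

# Education example; baseline and target values are made up for illustration
framework = [
    Outcome(
        statement="Nation's children have improved access to pre-school programs",
        indicators=[
            Indicator("% of eligible urban children enrolled in pre-school", 75.0, 85.0),
            Indicator("% of eligible rural children enrolled in pre-school", 40.0, 60.0),
        ],
    ),
]

# The framework tells you how far each indicator must move to hit its target
for outcome in framework:
    for ind in outcome.indicators:
        print(f"{ind.name}: {ind.target - ind.baseline:+g} percentage points to go")
```

Keeping the framework in one structure like this makes it the single reference point for the later monitoring and reporting steps.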
Step Six: Building a Monitoring System
Key Types of Monitoring
• Results monitoring: impacts and outcomes
• Implementation monitoring: outputs and inputs
Implementation Monitoring Links to Results Monitoring
(Diagram: implementation-level monitoring feeding up to the outcome level.)
E.g., Linking Implementation Monitoring to Results Monitoring
(Diagram: a goal cascading down to targets, e.g., Target 1 and Target 2.)
Every Monitoring System Needs:
• Ownership
• Management
• Maintenance
• Credibility
Step Seven: The Role of Evaluations
Uses of Evaluation
Evaluation Means Information on:
• Strategy: whether we are doing the right things
– Rationale/justification
– Clear theory of change
– Best practices
– Lessons learned
Characteristics of Quality Evaluations
• Impartiality
• Usefulness
• Technical adequacy
• Stakeholder involvement
• Feedback/dissemination
• Value for money
When Is It Time to Make Use of Evaluation?
(Chart: planned versus actual performance over time.)
When Is It Time to Make Use of Evaluation? (cont.)
When you want to determine the roles of both design and implementation in project, program, or policy outcomes:
(2 × 2 matrix: strength of design (high/low) against strength of implementation (high/low), giving four cases, 1 through 4.)
Step Eight: Reporting Your Findings
“If You Do Not Measure Results, You Cannot Tell Success From Failure”
Analyzing and Reporting Data:
• Gives information on the status of projects, programs, and policies
• Provides clues to problems
• Creates opportunities to consider improvements in (project, program, or policy) implementation strategies
• Provides important information over time on trends and directions
• Helps confirm or challenge the theory of change
Analyzing Your Results Data
(Charts: “Improving access to rural markets”, with access plotted over time.)
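Comparisons like the ones in those charts can be scripted: given the monitored values and the planned trajectory for an indicator, list the periods where performance fell short. A minimal sketch (the function name and the sample data are illustrative):

```python
def flag_shortfalls(actual, planned):
    """Return the 0-indexed periods where actual performance fell below plan."""
    return [t for t, (a, p) in enumerate(zip(actual, planned)) if a < p]

# Illustrative series: % of rural population with market access, by year
planned = [50, 55, 60, 65]
actual = [50, 53, 61, 62]
print(flag_shortfalls(actual, planned))  # [1, 3]
```

The flagged periods are exactly where a performance report should carry explanations and corrective steps.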
What Happens If the Results News Is Bad?
• A good results measurement system is intended to surface problems (an early warning system)
• Reports on performance should include explanations about poor outcomes and identify steps taken or planned to correct problems
• Protect the messenger
Step Nine: Using Your Findings
Using Your Findings
10 Uses of Results Findings
1. Responds to elected officials’ and the public’s demands for accountability
2. Helps formulate and justify budget requests
3. Helps in making operational resource-allocation decisions
4. Triggers in-depth examinations of what performance problems exist and what corrections are needed
Using Your Findings (cont.)
5. Helps motivate personnel to continue making program improvements
6. Monitors the performance of contractors and grantees
7. Provides data for special, in-depth program evaluations
8. Helps provide services more efficiently
9. Supports strategic and other long-term planning efforts (by providing baseline information and later tracking progress)
10. Communicates better with the public to build public trust
Nine Strategies for Sharing Information
Step Ten: Sustaining the M&E System Within Your Organization
Six Critical Components of Sustaining an M&E System
1. Demand
2. Clear Roles and Responsibilities
3. Trustworthy and Credible Information
4. Accountability
5. Capacity
6. Incentives
Critical Component One: Demand
• If demand is episodic or haphazard, results-based M&E systems are not going to be used and sustained
• Demand can be structured through requirements for reporting results, including legislation, regulations, and international development requirements
Critical Component Two: Clear Roles and Responsibilities
Critical Component Three: Trustworthy and Credible Information
Critical Component Four: Accountability
• No part of the government should be exempt from accountability to stakeholders
• Civil society organizations and NGOs can play a key role in encouraging transparency and accountability
• The media, private sector, and parliament also have roles in ensuring that the information produced is timely, accurate, available, and addresses government performance
• It is also important not to reward failure
• Accountability means that problems should be acknowledged and addressed
Critical Component Five: Capacity
• Sound technical skills in data collection and analysis are necessary for sustainability
• Managerial skills in strategic goal setting and organizational development are also needed
• Data collection and retrieval systems must be up and running, and modernized
• Governments will need to commit continuing financial resources to the upkeep and management of results-based M&E systems
• Institutional experience and memory are also helpful for the long-term sustainability of the system
Critical Component Six: Incentives
Incentives need to be introduced to encourage the use of performance information:
• Success is acknowledged and rewarded
• Problems are addressed
• Messengers are not punished
• Organizational learning is valued
• Budget savings are shared
• Others?
Last Reminders!
Thank You!!