
Approaches to Results-Based Management
January 2023, Assosa

Designing and Building a Results-Based Monitoring and Evaluation System:
A Tool for Public Sector Management
February 18, 2004


Ten Steps to Designing, Building and Sustaining
a Results-Based Monitoring and Evaluation System

1. Conducting a Readiness Assessment
2. Agreeing on Outcomes to Monitor and Evaluate
3. Selecting Key Indicators to Monitor Outcomes
4. Baseline Data on Indicators—Where Are We Today?
5. Planning for Improvement—Selecting Results Targets
6. Monitoring for Results
7. The Role of Evaluations
8. Reporting Your Findings
9. Using Your Findings
10. Sustaining the M&E System Within Your Organization
The Power of Measuring Results

• If you do not measure results, you cannot tell success from failure
• If you cannot see success, you cannot reward it
• If you cannot reward success, you are probably rewarding failure
• If you cannot see success, you cannot learn from it
• If you cannot recognize failure, you cannot correct it
• If you can demonstrate results, you can win public support

Adapted from Osborne & Gaebler, 1992


Introduction to Results-Based Monitoring and Evaluation
What Are We Talking About?

• Results-based monitoring and evaluation measures how well governments are performing
• Results-based monitoring and evaluation is a management tool
• Results-based monitoring and evaluation emphasizes assessing how outcomes are achieved over time
Reasons to Do Results-Based M&E

• Provides crucial information about public sector performance
• Provides a view over time on the status of a project, program, or policy
• Promotes credibility and public confidence by reporting on the results of programs
• Helps formulate and justify budget requests
• Identifies potentially promising programs or practices
Reasons to Do Results-Based M&E (cont.)

• Focuses attention on achieving outcomes important to the organization and its stakeholders
• Provides timely, frequent information to staff
• Helps establish key goals and objectives
• Permits managers to identify and take action to correct weaknesses
• Supports a development agenda that is shifting toward greater accountability for aid lending
• Creates ownership
Important…

• It takes leadership commitment to achieve a better-performing organization
• Plus redeployment of resources to building monitoring and evaluation systems
• Plus individuals committed to improving public sector performance

So… it comes down to a combination of institutional capacity and political will.
Definition

Results-Based Monitoring (what we will call “monitoring”) is a continuous process of collecting and analyzing information to compare how well a project, program, or policy is being implemented against expected results.
Results-Based Monitoring: The Results Chain

Results
• Goal (Impacts): long-term, widespread improvement in society
• Outcomes: intermediate effects of outputs on clients

Implementation
• Outputs: products and services produced
• Activities: tasks personnel undertake to transform inputs to outputs
• Inputs: financial, human, and material resources

Binnendijk, 2000
E.g. Results-Based Monitoring: Oral Rehydration Therapy

• Goal (Impacts): child mortality and morbidity reduced
• Outcomes: improved use of ORT in management of childhood diarrhea
• Outputs: increased maternal knowledge of and access to ORT services
• Activities: media campaigns to educate mothers, health personnel trained in ORT, etc.
• Inputs: funds, ORT supplies, etc.

Binnendijk, 2000
Exercise: Identify the Sequence of Inputs, Activities, Outputs and Outcomes

• Goal: Create economically viable women-owned micro-enterprises
  – Government makes available funds for micro-enterprise loans
  – Government approves 61 applications from program graduates
  – 90% of successful applicants begin operating new businesses after government approves application
  – 15 qualified course trainers available
  – 72 women complete training
  – Income of graduates increases 25% in first year after course completion
  – 100 women attend training in micro-enterprise business management
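One plausible way to work through the exercise above is to map each item onto a level of the results chain. The classification below is an illustrative sketch, not the deck’s official answer key:

```python
# One plausible mapping of the exercise items onto the results chain
# (the classifications are illustrative assumptions, not from the deck).
results_chain = {
    "Government makes available funds for micro-enterprise loans": "Input",
    "15 qualified course trainers available": "Input",
    "100 women attend training in micro-enterprise business management": "Activity",
    "72 women complete training": "Output",
    "Government approves 61 applications from program graduates": "Output",
    "90% of successful applicants begin operating new businesses": "Outcome",
    "Income of graduates increases 25% in first year after course completion": "Outcome",
}

# Print the items ordered along the chain, from inputs to outcomes:
for level in ["Input", "Activity", "Output", "Outcome"]:
    for item, assigned_level in results_chain.items():
        if assigned_level == level:
            print(f"{level}: {item}")
```

Note how the same event can sit at different levels depending on framing: attending training is an activity, completing it is an output, and the resulting income increase is an outcome.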
Definition

Results-Based Evaluation is an assessment of a planned, ongoing, or completed intervention to determine its
 relevance,
 efficiency,
 effectiveness,
 impact, and
 sustainability.
The intent is to incorporate lessons learned into the decision-making process.
Evaluation Addresses

• “Why” questions: what caused the changes we are monitoring?
• “How” questions: what was the sequence or process that led to successful (or unsuccessful) outcomes?
• Compliance/accountability questions: did the promised activities actually take place, and as they were planned?
• Process/implementation questions: was the implementation process followed as anticipated, and with what consequences?
Some Examples of Evaluation

Policy evaluations
• Privatizing water systems: comparing model approaches to privatizing public water supplies
• Resettlement: comparing strategies used for resettlement of rural villages to new areas

Program evaluations
• Privatizing water systems: assessing fiscal management of government systems
• Resettlement: assessing the degree to which resettled village farmers maintain previous livelihood

Project evaluations
• Privatizing water systems: assessing the improvement in water fee collection rates in 2 provinces
• Resettlement: assessing the farming practices of resettled farmers in one province
Step One: Conducting a Readiness Assessment
Why Do a Readiness Assessment?

It will help identify the barriers and obstacles (structural, cultural, political, or individual) in a given organization:
1. To understand what incentives (or lack thereof) exist to effectively monitor and evaluate development goals
2. To understand the roles and responsibilities of those organizations and individuals involved in monitoring and evaluating government policies, programs, and projects, e.g.:
   – Supreme Audit Office
   – Ministry of Finance / Bureau of Finance
   – Parliament / regional council
   – Ministry of Planning / planning commission bureau
3. To identify issues related to the capacity (or lack thereof) to monitor and evaluate government programs
Capacity

• Assess current capacity to monitor and evaluate:
  – Technical skills
  – Managerial skills
  – Existing data systems and their quality
  – Technology available
  – Fiscal resources available
  – Institutional experience
Barriers

• Do any of these immediate barriers to getting started in building an M&E system now exist?
  – Lack of fiscal resources
  – Lack of political will
  – Lack of a champion
  – Lack of expertise and knowledge
  – Lack of strategy
  – Lack of prior experience
Key Elements of Success

• Assess the country’s capacity against the following:
  – Does a clear mandate exist for M&E? (Law? Civil society? Other?)
  – Is there strong leadership at the most senior level of government?
  – Are resource and policy decisions linked to the budget?
  – How reliable is the information that may be used for policy and management decision making?
  – How involved is civil society as a partner with, or voice to, government?
  – Are there pockets of innovation that can serve as beginning practices or pilot programs?
Step Two: Agreeing on Outcomes to Monitor and Evaluate
Why an Emphasis on Outcomes?

• Makes explicit the intended objectives of government action (“Know where you are going before you get moving”)
• Outcomes are what produce benefits
• They tell you when you have been successful or not
Issues to Consider in Choosing Outcomes to Monitor and Evaluate

• Are there stated national/sectoral goals?
• Have political promises been made that specify improved performance of the government?
• Do citizen polling data indicate specific concerns?
• Is authorizing legislation present?
• Other? (e.g., Millennium Development Goals)
• Is aid lending linked with specific goals?
Developing Outcomes for One Policy Area:
Example: Education

Columns of the matrix: Outcomes | Indicators | Baselines | Targets

1. Nation’s children have improved access to pre-school programs
2. Primary school learning outcomes for children are improved

(At this step, only the Outcomes column is filled in; indicators, baselines, and targets come in later steps.)
In Summary: Why an Emphasis on Outcomes?

• Makes explicit the intended objectives of government action (“Know where you are going before you get moving”)
• Outcomes are the results governments hope to achieve
• Clearly setting outcomes is key to a results-based M&E system
• Note: Budget to outputs, manage to outcomes!
Outcomes Summary (cont.)

• Outcomes are usually not directly measured—only reported on
• Outcomes must be translated into a set of key indicators
Step Three: Selecting Key Performance Indicators to Monitor Outcomes
Selecting Key Performance Indicators to Monitor Outcomes

• What? An indicator is a key statistical measure selected to help describe (indicate) a situation concisely, track progress and performance, and act as a guide to decision making (AIHW, 2008a).
• Why? Indicators are standardized measures that allow for comparisons over time, across different geographic areas, and/or across programmes.
• Outcome indicators are not the same as outcomes.
• Each outcome needs to be translated into one or more indicators:
  – An outcome indicator identifies a specific numerical measurement that tracks progress (or not) toward achieving an outcome.

Urban Institute, 1999


An Outcome Indicator

Answers the question: “How will we know achievement when we see it?”
How Many Indicators Are Enough?

The minimum number that answers the question: “Has the outcome been achieved?”
Exercise — Outcome: Reduction in Childhood Morbidity

Are the following indicators outcome indicators or not?

• % change in missed school days due to illness
• % reduction in hospital admissions due to illness
• More medical doctors hired
• % change in prevalence of communicable diseases
• Number of children immunized
• % of working days missed by parents
• % change in childhood gastrointestinal diseases
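One plausible set of answers to the exercise can be recorded as a simple lookup. The test to apply: an outcome indicator must measure change in childhood morbidity itself (or a direct consequence of it), not the resources or services deployed. The classifications below are illustrative assumptions, not the deck’s answer key:

```python
# Assumed classification, for illustration only:
# True = outcome indicator (tracks change in childhood morbidity),
# False = an input or output measure, not an outcome indicator.
indicators = {
    "% change in missed school days due to illness": True,
    "% reduction in hospital admissions due to illness": True,
    "More medical doctors hired": False,           # an input (resources deployed)
    "% change in prevalence of communicable diseases": True,
    "Number of children immunized": False,         # an output (service delivered)
    "% of working days missed by parents": True,   # proxy consequence of child illness
    "% change in childhood gastrointestinal diseases": True,
}

outcome_indicators = [name for name, is_outcome in indicators.items() if is_outcome]
print(len(outcome_indicators))  # → 5
```

Under this reading, five of the seven candidates qualify; the two that fail do so because they count what the program puts in or delivers rather than what changes for children.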
Developing a Set of Outcome Indicators for a Policy Area:
Example: Education

1. Nation’s children have improved access to pre-school programs
   • Indicator 1: % of eligible urban children enrolled in pre-school education
   • Indicator 2: % of eligible rural children enrolled in pre-school education

2. Primary school learning outcomes for children are improved
   • Indicator 1: % of Grade 6 students scoring 70% or better on standardized math and science tests
   • Indicator 2: % of Grade 6 students scoring higher on standardized math and science tests in comparison to baseline data

(Baselines and targets for these indicators are filled in at later steps.)
In Summary: Developing Indicators

• You will need to develop your own indicators to meet your own needs.
• Developing good indicators often takes more than one try!
• Arriving at the final indicators you will use will take time!
• Pilot, pilot, pilot!
Step Four: Baseline Data on Indicators – Where Are We Today?
Establishing Baseline Data on Indicators

A performance baseline is information (quantitative or qualitative) that provides data at the beginning of, or just prior to, the monitoring period. The baseline is used to:
• Learn about recent levels and patterns of performance on the indicator; and
• Gauge subsequent policy, program, or project performance.
Building Baseline Information

For each indicator, record: the data source; the data collection method; who will collect the data; the frequency of collection; the cost and difficulty to collect; and who will analyze and report the data.
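The baseline-information matrix above is essentially one record per indicator. A minimal sketch of such a record as a data structure; the field names and sample values are illustrative assumptions, not taken from the deck:

```python
from dataclasses import dataclass

@dataclass
class BaselinePlan:
    """One row of the baseline-information matrix for a single indicator."""
    indicator: str
    data_source: str
    collection_method: str
    collector: str    # who will collect the data
    frequency: str    # how often data are collected
    cost: str         # cost to collect
    difficulty: str   # difficulty to collect
    analyst: str      # who will analyze & report the data

# Hypothetical row, reusing the rural pre-school enrolment indicator:
plan = BaselinePlan(
    indicator="% of eligible rural children enrolled in pre-school education",
    data_source="Ministry of Education enrolment records",
    collection_method="Review of official records (admin data)",
    collector="Regional education offices",
    frequency="Annual",
    cost="Low",
    difficulty="Moderate",
    analyst="Ministry of Education planning unit",
)
print(plan.frequency)
```

Forcing every cell to be filled in before collection starts is the point of the matrix: an indicator with no plausible source, collector, or budget is a sign the indicator should be reconsidered.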
Data Collection Methods

From informal/less structured methods to more structured/formal methods:
• Conversation with concerned individuals
• Community interviews
• Field visits
• Reviews of official records (MIS and admin data)
• Key informant interviews
• Focus group interviews
• Participant observation
• Direct observation
• Questionnaires
• One-time surveys
• Panel surveys
• Censuses
• Field experiments
Developing Baseline Data for One Policy Area:
Example: Education

1. Nation’s children have improved access to pre-school programs
   • Indicator 1: % of eligible urban children enrolled in pre-school education. Baseline: 75% urban in 1999.
   • Indicator 2: % of eligible rural children enrolled in pre-school education. Baseline: 40% rural in 2000.

2. Primary school learning outcomes for children are improved
   • Indicator 1: % of Grade 6 students scoring 70% or better on standardized math and science tests. Baseline: in 2002, 75% scored 70% or better in math and 61% scored 70% or better in science.
   • Indicator 2: % of Grade 6 students scoring 70% or better on standardized math and science tests in comparison to baseline data. Baseline: the mean 2002 score for Grade 6 students was 68% in math and 53% in science.
Step Five: Planning for Improvement – Selecting Results Targets
Definition

Targets are the quantifiable levels of the indicators that a country or organization wants to achieve at a given point in time. For example: agricultural exports will increase by 20% over the baseline in the next three years.
Identifying the Expected or Desired Level of Project, Program, or Policy Results Requires Selecting Performance Targets

Baseline indicator level + Desired level of improvement = Target performance

• The desired improvement assumes a finite and expected level of inputs, activities, and outputs
• The target is the desired level of performance to be reached within a specific time
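The target arithmetic above can be sketched in a few lines. The figures reuse the deck’s rural pre-school enrolment example (baseline 40% in 2000, target 60% by 2006, i.e. a desired improvement of 20 percentage points):

```python
def performance_target(baseline: float, desired_improvement: float) -> float:
    """Target performance = baseline indicator level + desired level of improvement.

    Both values are in the indicator's own units (here, percentage points)."""
    return baseline + desired_improvement

# Rural pre-school enrolment: 40% baseline plus a 20-point desired improvement.
print(performance_target(40.0, 20.0))  # → 60.0
```

The addition is trivial by design; the hard work is in justifying the desired improvement, which must be plausible given the expected level of inputs, activities, and outputs.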
Additional Considerations in Setting Indicator Targets

• Only one target is desirable for each indicator
• If the indicator is new (not previously used), be careful about setting firm targets (use a range)
• Most targets are set yearly, but some could be set quarterly; others are set for longer periods (not more than 5 years)
• It takes time to observe the effects of improvements; therefore, be realistic when setting targets

Adapted from the Urban Institute, 1999

Developing Targets for One Policy Area (A PRSP Example): Education

1. Nation’s children have improved access to pre-school programs
   • Indicator 1: % of eligible urban children enrolled in pre-school education. Baseline: 75% urban in 1999. Target: 85% urban by 2006.
   • Indicator 2: % of eligible rural children enrolled in pre-school education. Baseline: 40% rural in 2000. Target: 60% by 2006.

2. Primary school learning outcomes for children are improved
   • Indicator 1: % of Grade 6 students scoring 70% or better on standardized math and science tests. Baseline: in 2002, 75% scored 70% or better in math and 61% in science. Target: 80% in math and 67% in science by 2006.
   • Indicator 2: % of Grade 6 students scoring 70% or better on standardized math and science tests in comparison to baseline data. Baseline: the mean 2002 Grade 6 score was 68% in math and 53% in science. Target: a mean math score of 78% and a mean science score of 65% in 2006.
Now We Have a Results Framework

Note: This completed matrix becomes your results framework!
– It defines your goals and gives you a plan for how you will know whether you have been successful (or not) in achieving these goals
Step Six: Building a Monitoring System
Key Types of Monitoring

Results monitoring (results):
• Impact
• Outcome

Implementation monitoring (means and strategies):
• Output
• Activity
• Input
Implementation Monitoring Links to Results Monitoring

An outcome is pursued through several targets (Target 1, Target 2, Target 3), and each target is pursued through its own means and strategies (multi-year and annual work plans).
E.g. Linking Implementation Monitoring to Results Monitoring

• Goal: children’s mortality reduced
• Outcome: children’s morbidity reduced
• Target: reduce incidence of childhood gastrointestinal disease by 20% over 3 years
• Means and strategies:
  – Improve cholera prevention programs
  – Provision of vitamin A supplements
  – Use of oral re-hydration therapy
Achieving Results Through Partnership

A goal is supported by several outcomes; each outcome by one or more targets; each target by means and strategies; and each means and strategy is carried out jointly by multiple partners (Partner 1, Partner 2, Partner 3).
Every Monitoring System Needs:

• Ownership
• Management
• Maintenance
• Credibility
Step Seven: The Role of Evaluations
Uses of Evaluation

• To make resource decisions
• To re-think the causes of a problem
• To identify issues around an emerging problem, e.g. children dropping out of school
• To support decision-making on the best alternatives
• To support public sector reform and innovation
• To help build consensus among stakeholders on how to respond to a problem
Evaluation Means Information On:

• Strategy: whether we are doing the right things
  – Rationale/justification
  – Clear theory of change
• Operation: whether we are doing things right
  – Effectiveness in achieving expected outcomes
  – Efficiency in optimizing resources
  – Client satisfaction
• Learning: whether there are better ways of doing it
  – Alternatives
  – Best practices
  – Lessons learned
Characteristics of Quality Evaluations

• Impartiality
• Usefulness
• Technical adequacy
• Stakeholder involvement
• Feedback/dissemination
• Value for money
When Is It Time to Make Use of Evaluation?

• When regular results measurement suggests that actual performance diverges sharply from planned performance

(Chart: actual performance trend line diverging from the planned trend line.)
When Is It Time to Make Use of Evaluation? (cont.)

• When you want to determine the roles of both design and implementation in project, program, or policy outcomes

(2×2 matrix: strength of design (high/low) crossed with strength of implementation (high/low), yielding four cases.)
Step Eight: Reporting Your Findings
“If You Do Not Measure Results, You Cannot Tell Success From Failure”

Analyzing and reporting data:
• Gives information on the status of projects, programs, and policies
• Provides clues to problems
• Creates opportunities to consider improvements in implementation strategies (for projects, programs, or policies)
• Provides important information over time on trends and directions
• Helps confirm or challenge the theory of change
Analyzing Your Results Data

• Examine changes over time
  – Compare present to past data to look for trends and other changes
  – The more data points you have, the more certain you are of your trends

(Charts: an “improving access to rural markets” indicator plotted over time; with few data points the trend is uncertain, with many it is clear.)
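The trend comparison described above can be sketched with a simple least-squares slope over yearly observations. The function and the sample series are illustrative assumptions, not from the deck:

```python
def trend_slope(values: list[float]) -> float:
    """Ordinary least-squares slope of a series observed at times 0, 1, 2, ...

    A positive slope suggests the indicator is rising over time
    (an improvement, for indicators where higher is better)."""
    n = len(values)
    t_mean = (n - 1) / 2                      # mean of times 0..n-1
    v_mean = sum(values) / n                  # mean of the observations
    num = sum((t - t_mean) * (v - v_mean) for t, v in enumerate(values))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

# Hypothetical yearly values of an access-to-rural-markets indicator:
print(trend_slope([40.0, 42.0, 44.0, 46.0]))  # → 2.0 (points per year)
```

With only two data points the slope is exact but meaningless as a trend; as the slide notes, confidence in the direction of change grows with the number of observations.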
What Happens If the Results News Is Bad?

• A good results measurement system is intended to surface problems (an early warning system)
• Reports on performance should include explanations about poor outcomes and identify steps taken or planned to correct problems
• Protect the messenger

Adapted from the Urban Institute, 1999

Outcomes Reporting Format:
Actual Outcomes Versus Targets

Outcome Indicator                                             Baseline (%)  Current (%)  Target (%)  Difference (%)
Rates of hepatitis (N=6,000)                                       30            25           20          -5
Percentage of children with improved
overall health status (N=9,000)                                    20            20           24          -4
Percentage of children who show 4 out of 5
positive scores on physical exams (N=3,500)                        50            65           65           0
Percentage of children with improved
nutritional status (N=14,000)                                      80            85           83          +2

Note: the difference is measured in the desired direction of change. Hepatitis rates should fall, so a current value 5 points above target appears as -5.

Source: Made-up data, 2003
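One way to reproduce the Difference column is to compute the gap between current and target performance, signed in the desired direction of change. This reading of the table is an assumption (the deck does not state the convention), sketched below:

```python
def target_difference(current: float, target: float, higher_is_better: bool) -> float:
    """Signed gap between current performance and target, in the desired direction.

    Positive means the target was met or exceeded; negative means a shortfall.
    For indicators that should fall (e.g. disease rates), the sign is flipped."""
    gap = current - target
    return gap if higher_is_better else -gap

# Rows from the reporting table (made-up data, as in the deck):
print(target_difference(25, 20, higher_is_better=False))  # hepatitis rate → -5
print(target_difference(20, 24, higher_is_better=True))   # improved health → -4
print(target_difference(85, 83, higher_is_better=True))   # nutrition → 2
```

Recording the desired direction alongside each indicator prevents the most common reporting mistake: presenting a rising disease rate as progress simply because the number went up.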
Step Nine: Using Your Findings
Using Your Findings: 10 Uses of Results Findings

1. Responds to elected officials’ and the public’s demands for accountability
2. Helps formulate and justify budget requests
3. Helps in making operational resource allocation decisions
4. Triggers in-depth examinations of what performance problems exist and what corrections are needed
Using Your Findings (cont.)

5. Helps motivate personnel to continue making program improvements
6. Monitors the performance of contractors and grantees
7. Provides data for special, in-depth program evaluations
8. Helps provide services more efficiently
9. Supports strategic and other long-term planning efforts (by providing baseline information and later tracking progress)
10. Communicates better with the public to build public trust
Nine Strategies for Sharing Information

• Empower the media
• Enact “freedom of information” legislation
• Institute e-government
• Add information on internal and external internet sites
• Publish annual budget reports
• Engage civil society and citizen groups
• Strengthen parliamentary oversight
• Strengthen the Office of the Auditor General
• Share and compare results findings with development partners
Step Ten: Sustaining the M&E System Within Your Organization
Six Critical Components of Sustaining Monitoring & Evaluation Systems

1. Demand
2. Clear roles and responsibilities
3. Trustworthy and credible information
4. Accountability
5. Capacity
6. Incentives
Critical Component One: Demand

• If demand is episodic or haphazard, results-based M&E systems are not going to be used and sustained
• Structured requirements for reporting results help sustain demand, including legislation, regulations, and international development requirements
Critical Component Two: Clear Roles and Responsibilities

• Clear roles and responsibilities and formal organizational and political lines of authority must be established
• The organizations and people in charge of collecting, analysing, and reporting performance information must be clearly defined
• Guidance is necessary
• Internal political coordination is key
Critical Component Three: Trustworthy and Credible Information

• The M&E system must be able to produce results information that brings both good and bad news
• Performance information should be transparent and made available to all key stakeholders
• The producers of results information need protection from political reprisals
Critical Component Four: Accountability

• No part of the government should be exempt from accountability to stakeholders
• Civil society organizations and NGOs can play a key role in encouraging transparency and accountability
• The media, private sector, and parliament also have roles in ensuring that the information produced is timely, accurate, available, and addresses government performance
• It is also important not to reward failure
• Accountability means that problems should be acknowledged and addressed
Critical Component Five: Capacity

• Sound technical skills in data collection and analysis are necessary for sustainability
• Managerial skills in strategic goal setting and organizational development are also needed
• Data collection and retrieval systems must be up, running, and modernized
• Governments will need to commit continuing financial resources to the upkeep and management of results-based M&E systems
• Institutional experience and memory also help sustain systems over the long term
Critical Component Six: Incentives

• Incentives need to be introduced to encourage the use of performance information:
  – Success is acknowledged and rewarded
  – Problems are addressed
  – Messengers are not punished
  – Organizational learning is valued
  – Budget savings are shared
  – Others?
Last Reminders!

• The demand for capacity building never ends
• Keep your champions on your side and help them
• Establish the understanding with the Ministry of Finance and the Parliament that an M&E system needs sustained resources
• Look for every opportunity to link results information to budget and resource allocation decisions
• Begin with pilot efforts to demonstrate effective results-based monitoring: start with an enclave strategy (e.g. islands of innovation) rather than a whole-of-government approach
• Monitor both implementation progress and results achievements
• Complement performance monitoring with evaluations to ensure better understanding of public sector results
Thank You!
Foot Bridge: Social Hazard
