
Impact evaluation ‘elective’ course

PADM7239A

Lesson 1
Contextualising impact evaluation

Kennedy Manduna, PhD


▪ Presentation outline

1. Some important facts about IE in Africa
2. Key objectives of impact evaluation
3. Two important evaluation questions IE answers
4. Intractable questions for IE contextualisation
5. The problem of defining monitoring and evaluation
   ▪ The contextual words
   ▪ The key words
6. Towards a framework for institutionalising monitoring and evaluation
   ▪ The what?
   ▪ The how?
7. Components and processes of or in monitoring and evaluation

Some important facts about IE in Africa

• Challenge: researchers have limited technical capacity to access, appraise, interpret, synthesise and utilise research evidence in decision-making.
• We have personnel who know either theory or practice; we want a delicate dance/balance between the two.

Key Objectives of Impact Evaluation

• To answer the cause-and-effect question: effectiveness
• To answer a question about the change brought about by implementation: what is the impact (causal effect) of a project/programme on a selected outcome? (This is sketched formally below.)
• Implemented to determine the changes that are directly attributable to the project/programme implementation
• The focus on the change directly attributable to the implementation, as well as on the causal relationship between implementation and change, is the hallmark of IE
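The causal logic above can be written down compactly. As a minimal formal sketch in standard potential-outcomes notation (the Y1/Y0 labels reappear later in these slides, but this formulation is not itself part of the slide text):

  Impact for a unit:        Δ = Y1 − Y0
  Average treatment effect: ATE = E[Y1] − E[Y0]

where Y1 is the outcome with the intervention and Y0 is the counterfactual outcome without it. Because Y0 is never observed for units that actually received the intervention, impact evaluation designs construct a credible comparison group to estimate it.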

Two important evaluation questions IE answers

• It can be used to estimate the effectiveness of an intervention compared to non-intervention: “What is the effectiveness of an intervention when compared to the absence of an intervention?”
• IE can also be used to identify the best intervention among several interventions by answering the question: “What is the best intervention among several interventions?” (An illustrative sketch of both comparisons follows below.)
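Both questions reduce to the same comparison logic. The sketch below is purely illustrative: the outcome scores and intervention names are invented, and a simple difference in means stands in for whatever estimator a real IE design would justify.

```python
# Illustrative sketch only: outcome data and intervention names are invented.
# Q1: how effective is an intervention compared with no intervention?
# Q2: which of several interventions is best?

def mean(values):
    return sum(values) / len(values)

# Hypothetical outcome scores for a comparison group and two interventions.
outcomes = {
    "no_intervention": [52, 55, 49, 61, 58],
    "intervention_A": [63, 67, 70, 61, 66],
    "intervention_B": [58, 60, 57, 64, 59],
}

comparison_mean = mean(outcomes["no_intervention"])

# Estimated effect = mean outcome under the intervention minus the
# no-intervention mean (a real IE design must also ensure the groups are comparable).
effects = {
    name: mean(scores) - comparison_mean
    for name, scores in outcomes.items()
    if name != "no_intervention"
}

for name, effect in effects.items():
    print(f"{name}: estimated effect = {effect:+.1f}")

best = max(effects, key=effects.get)
print(f"Best intervention on this outcome: {best}")
```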

Intractable questions for IE contextualisation

• The politics of (impact) evaluation: i.e., the politics of conducting evaluations and reporting findings.
• Political economy questions: how, where and why does IE intersect with politics and the economy?
• Public policy questions: public policies generate contemporary complex, interconnected, wicked and sticky public/social problems; what influence does IE have in addressing, or worsening, them?
• Political questions: IE is not done in a political vacuum. How, then, do you navigate the contested and crocodile-infested political environment?
• IE evidence questions: whose evidence matters? To whom? Why? This is the politics of evidence.
• Development questions:

▪ Bakewell (2003)
▪ Monitoring is the systematic and
continuous assessment of the progress of a
piece of work over time
▪ Checks that things are 'going to plan' and
enables informed adjustments to be made
▪ Integral part of management
▪ Concerned with goals, objectives, outputs,
activities, and inputs of the projects

The problem of defining monitoring and evaluation

▪ OECD (2002) in Kusek and Rist (2004)
▪ A continuous systematic collection of data
on specified indicators
▪ Provides management and main stakeholders
of an on-going development intervention
with
▪ Progress and achievement of objectives
▪ Progress in the use of allocated funds

The problem of defining monitoring and evaluation

▪ The Presidency (2007)
▪ Monitoring is an effective management tool
▪ Monitoring provides managers, decision
makers, and other stakeholders with
regular feedback on progress in
implementation and results as well as
early indication of problems that need to
be corrected
▪ Monitoring reveals actual performance
against what was planned or expected
▪ Monitoring involves collecting, analysing,
and reporting data on inputs, activities,
outputs, outcomes and impacts as well as
external factors

The problem of defining monitoring and evaluation

▪ Bakewell (2003), on evaluation
▪ Periodic assessment of the relevance,
performance, efficiency, and impact of an
intervention with respect to its stated
objectives
▪ Usually carried out at some significant
stage in the intervention
▪ At the end of a planning period
▪ As the intervention moves to a new phase
▪ In response to a particular critical
issue

The problem of defining monitoring and evaluation

▪ OECD (2002) in Kusek and Rist (2004), on evaluation
▪ Systematic and objective assessment of an on-going
or completed intervention, for
▪ Design/formative
▪ Implementation/process
▪ Results/outcomes/summative
▪ To determine relevance and fulfilment of
objectives, development efficiency, effectiveness,
and sustainability
▪ Should provide information that is credible and
useful, enabling the incorporation of lessons
learned into the decision-making process of both
recipients and funders

The problem of defining monitoring and evaluation

▪ The Presidency (2007)
▪ Evaluation is a time-bound and periodic exercise
that seeks to provide credible and useful
information to answer specific questions to guide
decision-making by staff, managers and policy
makers
▪ Evaluations may assess relevance, efficiency,
effectiveness, impact, and sustainability
▪ Impact evaluations examine whether underlying
theories and assumptions were valid, what worked,
what did not and why
▪ Evaluation can also be used to extract cross-cutting lessons from operating unit experiences and to determine the need for modifications to strategic results frameworks

The problem of defining monitoring and evaluation

▪ We seem to understand the definitions of monitoring and evaluation, but once you reflect on each of the words in those definitions you notice several ‘contextual words’ and ‘key words’
Contextual terms: Development, Intervention, Strategy, Plan, Policy, Programme, Project; Design, Formative, Implementation, Management, Process, Summative, Performance

Key terms: Impact, Effectiveness, Relevance, Sustainability, Efficiency, Stakeholders, Decision-makers; Activities, Goals, Aims, Outcomes, Objectives, Purposes, Outputs; External factors, Inputs, Indicators, Baselines, Targets, Results, Assumptions; Risks, Data, Information, Assessment, Measurement, Progress

The problem of defining monitoring and evaluation

▪ Presentation outline

1. The problem of defining monitoring and evaluation
   ▪ The contextual words
   ▪ The key words
2. Towards a framework for institutionalising monitoring and evaluation
   ▪ The what?
   ▪ The how?
3. Components and processes of or in monitoring and evaluation

Contextualising impact evaluation

▪ The what?

▪ … is actually development, so then:

1. What is development?
2. What is the purpose of development studies?
3. What are the established facts in development studies?
4. What are the issues and debates in development studies?
5. What are the components of development?
6. What are the development processes?

Towards a framework for institutionalising M&E – the what?

▪ What are the components of development?

▪ Cultural development
▪ Religion, arts

▪ Political development
▪ Legal and institutional structures, governance,
participation in the electoral process

▪ Economic development
▪ Infrastructural, technology, industrialisation

▪ Social development
▪ Health, education

▪ Environmental development

Towards a framework for institutionalising M&E – the what?

▪ Development in context … (UNECA 2014, page 21)

Towards a framework for institutionalising M&E – the what?

▪ What are the development processes?

▪ Immanent [unintentional]
▪ Street vending
▪ Informal economy

▪ Imminent [intentional]
▪ Development interventions that have emerged since the Second World War

Towards a framework for institutionalising M&E – the what?

▪ Development in context …

[Diagram: development spans cultural, political, economic, social and environmental components, arising through both imminent processes (Y1, deliberate development interventions) and immanent processes (Y0).]

Towards a framework for institutionalising M&E – the what?

▪ Presentation outline

1. The problem of defining monitoring and evaluation
   ▪ The contextual words
   ▪ The key words
2. Towards a framework for institutionalising monitoring and evaluation
   ▪ The what?
   ▪ The how?
3. Components and processes of or in monitoring and evaluation

Contextualising impact evaluation

▪ The how?

▪ … introduces us to development interventions

Towards a framework for institutionalising M&E – the how

▪ The how?

▪ However, to understand development interventions, one needs to understand public policy

1. What is public policy?
2. What is the purpose of public policy studies?
3. What are the established facts in public policy studies?
4. What are the issues and debates in public policy studies?
5. What are the components of public policy?
6. What are the public policy processes?

Towards a framework for institutionalising M&E – the how

▪ Public policy in context …

Towards a framework for institutionalising M&E – the how

▪ What are the components of public policy?

▪ Of interest is Lasswell’s public policy cycle or framework:

1. Intelligence
2. Promotion
3. Prescription
4. Invocation
5. Application
6. Termination
7. Appraisal

Towards a framework for institutionalising M&E – the how

▪ What are the components of public policy?

▪ More recently summarised into four steps:

1. Agenda setting
2. Formulation
3. Implementation
   ▪ Management
   ▪ Monitoring
4. Evaluation

Towards a framework for institutionalising M&E – the how

▪ … and finally our entire discussion framework …

▪ … and more specifically
[Diagram: a socio-economic problem is moved from the current position towards a desired position along a timeline, through a planning stage, an implementing stage and a stocktaking stage. The planning stage covers diagnostic work (needs assessment, problem analysis, setting analysis) and formulation (objective analysis, alternative analysis, the results chain and framework); the implementing stage covers management, monitoring, outcomes and impact; the stocktaking stage covers evaluation. Formative evaluation aligns with the planning stage, process evaluation with the implementing stage and summative evaluation with the stocktaking stage; the percentages in the original figure indicate the relative weight of diagnostic, formulation, implementation and evaluation effort at each stage. A small illustrative sketch of a results chain follows below.]
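The figure’s planning stage mentions the results chain and framework. As a purely illustrative sketch (the programme and entries below are invented, not from the course), a results chain simply links what an intervention uses and does to what it hopes to change:

```python
# Illustrative results chain only: the entries are invented, not from the course.
# A results chain links what a programme uses and does to what it aims to change.

results_chain = {
    "inputs":     ["budget", "staff", "borehole drilling equipment"],
    "activities": ["drill boreholes", "train village water committees"],
    "outputs":    ["120 boreholes functioning", "60 committees trained"],
    "outcomes":   ["households use safe water daily"],
    "impact":     ["reduced incidence of waterborne disease"],
}

# Monitoring tracks the upper links (inputs to outputs); impact evaluation asks
# whether the final links (outcomes and impact) are causally attributable
# to the intervention.
for level, examples in results_chain.items():
    print(f"{level:>10}: {', '.join(examples)}")
```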

Towards a framework for institutionalising M&E – the how

▪ Presentation outline

1. The problem of defining monitoring and evaluation
   ▪ The contextual words
   ▪ The key words
2. Towards a framework for institutionalising monitoring and evaluation
   ▪ The what?
   ▪ The how?
3. Components and processes of or in monitoring and evaluation

Contextualising impact evaluation

▪ Components and process of monitoring and evaluation

Towards a framework for institutionalising M&E – the how

▪ The impact and outcomes of monitoring and evaluation

Towards a framework for institutionalising M&E – the how

▪ So what should training institutions be teaching to produce effective monitoring and evaluation personnel?

Module 1 – Systems thinking: By its very nature, monitoring and evaluation is logical and therefore requires a module that captures this aspect, especially to cater for formative evaluation, theory of change, and the results chain and framework.
Module 2 – Introduction to development interventions: The focus here should be cultural, political, economic, social and environmental interventions rather than thick debates in development. This module should provide attributes and variables for measuring these interventions.
Module 3 – Introduction to public policy: With a bias towards the public policy cycle framework, so that how development interventions should be operationalised is appreciated.
Module 4 – Strategic planning and management: This should cater for a detailed understanding of the planning process, especially diagnostics and the formulation of development interventions.
Module 5 – Programme and project management: This should cater for a detailed understanding of operationalising development interventions.
Module 6 – Operations management and monitoring: This should cater for a detailed understanding of implementing development interventions and, more importantly, monitoring data management: collection and storage, processing and analysis, reporting, and integration into management decision-making (a minimal illustrative sketch follows the notes below).
Module 7 – Applied research and evaluation: This should cater for the qualitative and quantitative skills set needed for formative, process and summative evaluation. This module should strike a balance between rigorous research approaches and relevant research, even though research and evaluation are different.
Module 8 – Monitoring and evaluation practice: This should put everything together and should involve skills for developing and managing an implementation framework and plan for specific development interventions.

Notes: Patton (2008) in Porter and Goldman (2013) has pointed out that evaluation differs from research only because it supports developmental efforts through the provision of practical and specific answers to its challenges.

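To make Module 6’s monitoring data management concrete, here is a minimal, hypothetical sketch (not part of the original slides): it computes how far each indicator has moved from its baseline towards its target once monitoring data have been collected. The indicator names and figures are invented for illustration.

```python
# Minimal, hypothetical sketch of indicator tracking for monitoring.
# Indicator names and numbers are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    baseline: float   # value before the intervention
    target: float     # value the intervention aims to reach
    current: float    # latest value from monitoring data

    def progress(self) -> float:
        """Share of the baseline-to-target distance achieved so far."""
        span = self.target - self.baseline
        if span == 0:
            return 1.0
        return (self.current - self.baseline) / span

indicators = [
    Indicator("Households with access to clean water (%)", baseline=40, target=80, current=55),
    Indicator("Children completing primary school (%)", baseline=65, target=90, current=72),
]

for ind in indicators:
    print(f"{ind.name}: {ind.progress():.0%} of the way from baseline to target")
```

Output would report, for example, 38% progress on the first (invented) indicator, the kind of figure a monitoring system feeds back to managers and decision-makers.
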
▪ Presentation outline

1. The problem of defining monitoring and evaluation
   ▪ The contextual words
   ▪ The key words
2. Towards a framework for institutionalising monitoring and evaluation
   ▪ The what?
   ▪ The how?
3. Components and processes of or in monitoring and evaluation

Contextualising impact evaluation

▪ Lesson 01 Recording

▪ https://witscloud-my.sharepoint.com/:v:/g/personal/a0006907_wits_ac_za/ETXTJ5sIMRFLvfBWXkmudjcB94t17po7tE92AveHcRBGAw

Contextualising impact evaluation
