Using Research Evidence in the Humanitarian Sector: A practice guide
CITATION
This document should be cited as: Blanchet K, Allen C, Breckon J, Davies P, Duclos D, Jansen J, Mthiyane H, Clarke M. (2018) Using Research Evidence in the Humanitarian Sector: A practice guide. London, UK: Evidence Aid, London School of Hygiene and Tropical Medicine and Nesta (Alliance for Useful Evidence).

FUNDING
Funding for this document was provided by the UK Science & Innovation Network (represented by SIN Switzerland, British Embassy Berne).

AUTHORS
This document was written by Karl Blanchet (a), Claire Allen (b), Jonathan Breckon (c), Phil Davies (b), Diane Duclos (b), Jeroen Jansen (b), Helen Mthiyane (c) and Mike Clarke (b) (a: Health in Humanitarian Crisis Centre, London School of Hygiene and Tropical Medicine; b: Evidence Aid; c: Nesta, Alliance for Useful Evidence).

CORRESPONDENCE
Correspondence about this document should be sent to Karl Blanchet, Director of the Health in Humanitarian Crises Centre, London School of Hygiene & Tropical Medicine, Tavistock Place, London, UK; and Mike Clarke, Research Director of Evidence Aid, Centre for Public Health, Queen's University Belfast, ICS Block A, Royal Hospitals, Belfast, UK.
Karl: Karl.Blanchet@lshtm.ac.uk
Mike: mclarke@qub.ac.uk
Introduction
Evidence coming from research and evaluation can help
you understand what works, where, why and for whom.
It can also tell you what does not work, and help you
avoid repeating the failures of others by learning from
evaluations of unsuccessful humanitarian programmes.
Evidence can also guide the design of the most effective
ways to deliver specific interventions.
We have created this guide to help you make best use of research evidence when you are in a humanitarian emergency or when you are planning for the next emergency. Our intention is to help you find and use evidence on interventions, actions and strategies that might help you make informed choices and decisions. This guide is not about how to generate more research evidence. It is about using and understanding what evidence exists and recognising when good evidence is lacking. It should help you build your confidence in compiling, assimilating, distilling and interpreting a strong evidence base of existing research, and think about how you might go on to evaluate your own projects and commission research or evaluation.

WHO MIGHT USE THIS GUIDE?

This practice guide is primarily aimed at humanitarian decision makers and practitioners working in the field or in the headquarters of donor, international, national, or non-governmental organisations. It will help with decisions about the financing, supervision, delivery or evaluation of humanitarian interventions. It is not aimed at trained evaluators and researchers, but instead seeks to foster demand for research evidence from wider audiences in the humanitarian sector.

HOW TO USE THIS GUIDE

The guide is divided into four main sections:

SECTION A: What is evidence-informed decision making, and why focus on research?
This section discusses what we mean by evidence-informed decision making, and why research is an essential element of it.

SECTION B: When can evidence help you?
This section explores different scenarios in which using evidence can help you, as well as the types of evidence you might need at different stages of developing or implementing a new intervention or policy.

SECTION C: What evidence should you choose?
This section looks at different types of evidence and examines how to choose the most appropriate for your case. It also discusses how to judge the quality of evidence.

SECTION D: Where should you look for evidence?
This section offers advice and resources to help you find the right evidence to support your case.
SECTION A

What is evidence-informed decision making, and why focus on research?

This section discusses what we mean by evidence-informed decision making, and why research is an essential element of it.
To begin, let us be clear about what we do not mean. We are not talking about making decisions and choices by slavishly following rigid research conclusions. Professional judgement and other sources of information – such as feedback from your stakeholders – will always be important. This practice guide is not about replacing professional judgement but increasing evidence use in humanitarian action. A good start in defining what we mean is borrowed from medicine. More than two decades ago, David Sackett and his colleagues proposed the following definition that has stood the test of time:

“Evidence-based medicine is the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients. The practice of evidence-based medicine means integrating individual clinical expertise with the best available external clinical evidence from systematic research”.4

This attempt to define evidence-based medicine was not the first,5 but it has been influential and is just as relevant to the humanitarian sector as it is to other sectors. It stresses how research can complement professional judgement or other sources of information, and recognises the importance of evidence on issues such as feasibility, preference and culture. In a field such as the humanitarian sector, where more and better evidence is required,6 any model of good decision making should be wary of relying solely on professional judgement that is not supported by scientific evidence. Later in this section, you will read about how we can all be ‘predictably irrational’ and – consciously or unconsciously – make errors in important judgements. We explore how to mitigate these errors of judgement in subsequent sections. However, other decision making models have also stressed the importance of blending knowledge of evidence with judgement. The humanitarian sector is a sensitive area in which we need to be aware of international and local politics and of the dynamics between the various actors involved in the delivery of humanitarian aid. These will sometimes determine access to evidence and information, and also how humanitarian aid is delivered. Nevertheless, the importance of evidence remains and, as noted in an ALNAP report in 2014, “the failure to generate and use evidence in policy and response makes humanitarian action less effective, less ethical and less accountable”.7
WHAT IS ‘EVIDENCE’ AND WHY DO WE FOCUS ON RESEARCH?

The Oxford English Dictionary defines ‘evidence’ as “the available body of facts or information indicating whether a belief or proposition is true or valid”,8 and, similarly, in their ALNAP report on the state of the evidence in the humanitarian sector, Paul Knox Clarke and James Darcy defined it as “information that helps to substantiate or prove/disprove the truth of a specific proposition”.7 We follow these definitions because many other definitions tend to be rather unhelpful by being overly inclusive (sometimes including almost all types of information) or by being too abstract and vague.

Figure A.1 shows the different elements that should be part of evidence-informed decision making. Our focus in this practice guide is on the top circle of the diagram: research and evaluation.

As the authors of the Alliance for Useful Evidence’s ‘What Counts as Good Evidence?’ report state: “The conduct and publication of research involves the explicit documentation of methods, peer review and external scrutiny, resulting in rigour and openness. These features contribute to its systematic nature and help provide a means to judge the trustworthiness of findings. They also offer the potential to assess the validity of one claim compared to another”.9
[Figure A.1: Elements of evidence-informed decision making. The decision sits at the centre of overlapping elements: research and evaluation; practitioner experience and judgements; stakeholders (e.g. employees), preferences or values; and the context, organisation, actors and circumstances. Based on: Barends E, Rousseau DM, Briner RB. (2014) Evidence-based Management: The Basic Principles. Amsterdam: Center for Evidence-Based Management. www.cebma.org/wp-content/uploads/Evidence-Based-Practice-The-Basic-Principles.pdf]
This practice guide focuses on research, but there are many overlaps with the field of evaluation and we discuss some approaches to evaluating impact and process in Section C. We also give most attention to research that deals with impact – whether something has had positive or negative results – because questions on impact are vital to those involved in humanitarian action. These actors are concerned about showing their ‘impact’ on populations, their ‘results’ in international terms or ‘what works’ for governments and local and international providers. The language may change, but the idea for their research stays the same: to see if they have really made a difference. Therefore, our aim with this guide is to help you decide how that research might help you choose interventions, actions and strategies and adopt policies that are most likely to make a positive difference. We provide illustrative examples throughout the guide, and further examples of the use of evidence in the humanitarian sector are available in other collections of case studies.10

We give prominence to research and evaluation that is ready-made, with no need to run a brand-new study. Decision makers have limited time and resources and many simply cannot afford to commission such a study and to wait for its results to become available, which may take years. Someone needing to make a decision now needs the evidence now, if not yesterday – not in a year or more. So, decision makers require evidence that can be taken ‘off the shelf’ and combined with information on the local context to inform their choice. Fortunately, it is possible to find such evidence and we cover some of the ways to do so in Section D.

Research is a process engaged in for learning purposes. It seeks to answer questions such as ‘What was the commonest type of injury after an earthquake?’, ‘What are the effects on gender-based violence of different ways to protect women and children?’ or ‘How waterproof is a particular material when used for shelter?’

Evaluation is a process involving the assessment of findings and observations against standards, for the purpose of making decisions. Evaluations ask questions such as ‘Which types of first aid should first responders be trained in?’, ‘Which is the best way to protect women and children from gender-based violence?’ or ‘What material should be used for making tents in a setting with heavy rainfall?’

Research does not necessarily require evaluation. However, doing evaluation always requires doing research. An evaluation relates to an intervention that was actually implemented, while research is more comprehensive and, as well as including evaluations, it can also seek to answer conceptual questions, such as when planning for the needs that are likely after a disaster or developing
Optimism bias about both old and new interventions is often due to a lack of evidence about their true effects. Research and evidence from evaluations of these interventions, or similar ones, can help to reduce this uncertainty.

Just as in other sectors, there are also many other biases relating to how people think that can afflict those working in the humanitarian sector. These include:

Hindsight bias: the tendency to see past events as being more predictable than they were before the event occurred.

Loss aversion: the tendency to prefer avoiding losses over acquiring gains.

Framing effect: drawing different conclusions from the same information presented in different ways (e.g. would you prefer that ‘95% returned to work’ or that ‘5% did not return to work’?).

The ‘availability heuristic’: when people relate the size, frequency or probability of a problem to how easy it is to remember or imagine.

The ‘representativeness heuristic’: when people overestimate the probability of vivid events.

The ‘need for coherence’: the urge to establish patterns and causal relationships when they may not exist.

Meta-cognitive bias: the belief that we are immune from biases!

This is not to say that professional judgement is always wrong. Researchers such as Gary Klein have sung the praises of intuitive expert judgement, for instance in his work on ‘naturalistic decision making’.14 Professional views and gut instincts can be highly valuable, but we must be aware of their downsides. As Daniel Kahneman asserted in a joint article with Professor Klein in American Psychologist, “professional intuition is sometimes marvellous, and sometimes flawed”.15
[Figure: the humanitarian programme cycle – needs assessment and analysis; strategic planning; coordination; information management; operational peer review and evaluation.]
[Figure: ShelterBox theory of change, reproduced from ‘Outcomes and impacts – measuring the difference we make’: https://cwarham.wordpress.com/2017/02/02/shelterbox-theory-of-change. The diagram links activities (rapid and in-depth needs assessments, monitoring and reflection, coordination and reporting, community engagement, logistics, and safety and security) to the output of effective and timely shelter provision, and on to short and medium term outcomes (physical and psychological protection from weather and environmental extremes, increased personal safety and security of possessions, intact households and communities, and the knowledge and skills to utilise the provided materials) and longer term outcomes (improved psychological health and wellbeing, improved access to and retention in education, reduced morbidity and mortality, security of tenure, and improved resilience and wellbeing). These links rest on stated assumptions and preconditions, including aid selection appropriate to context, the ‘do no harm’ principle being followed, shelter items used for their intended purpose, needs assessments accounting for vulnerable groups, sensitization of non-beneficiary groups, beneficiaries willing to participate, local leadership structures in place, stakeholder support for the intervention, functional logistics, coordination and communication, and an accountable ShelterBox response.]
The type of research you choose as the source of evidence to help in your decision making needs to fit the challenge that you face.39
Table C.1 Different designs, methods and approaches to research evidence – a brief overview
[Table: columns ‘Types of research and evaluation’, ‘What is it?’, ‘Pros’ and ‘Cons’.]
Without a counterfactual (i.e. a comparison with what would have happened without the intervention), any observed impact on outcomes may be due to factors other than the intervention. Quasi-experimental designs have a greater risk of bias than well-conducted randomised trials, but they might still allow strong causal inferences to be made in circumstances where a randomised trial would not be possible or acceptable.54
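To make the counterfactual logic concrete, here is a minimal sketch (ours, with invented numbers) of the difference-in-differences calculation that many quasi-experimental studies rely on: the change observed in a comparison group stands in for what would have happened to the intervention group anyway.

```python
# Difference-in-differences sketch with hypothetical data: the comparison
# group's change over time serves as the counterfactual for the
# intervention group.

# Mean outcome scores before and after a programme (invented numbers).
intervention_before, intervention_after = 42.0, 55.0
comparison_before, comparison_after = 41.0, 46.0

# A naive before/after comparison mixes the programme's effect with
# background trends (recovery, seasonality, other aid).
naive_change = intervention_after - intervention_before      # 13.0

# The comparison group's change estimates those background trends.
background_change = comparison_after - comparison_before     # 5.0

# The difference-in-differences estimate isolates the intervention's
# impact, assuming both groups would otherwise have moved in parallel.
impact = naive_change - background_change                    # 8.0
print(f"Estimated impact of the intervention: {impact:.1f} points")
```

The strength of the inference rests entirely on how closely the comparison group matches the intervention group; if their trends would have differed anyway, the estimate is biased.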
‘Grey literature’ is defined in various ways, but it usually refers to documents that are unpublished or have been published without peer review. It can also refer to research that is still underway or being prepared for publication. Government reports, policy statements and briefs, and conference proceedings are also types of grey literature. Grey literature is important because it may contain evidence of negative outcomes and unsuccessful interventions, which is important for the balance of evidence-informed decision making. Grey literature can be searched using electronic databases such as Open Grey (www.opengrey.eu), conference proceedings and the procurement records of research funders. Websites of organisations that have an interest or expertise in a topic are another source of grey literature. These organisations can be contacted to identify researchers and decision makers who have expertise in a topic or issue.

A good start in trying to appraise the quality of evidence is defining it. One of the problems, however, is that phrases such as ‘quality’, ‘standards’, ‘robustness’, ‘bias’ and ‘strength’ are often used as if they were interchangeable, and without clearly defining what they mean. This makes for a lot of misunderstanding. For instance, in some guidance,67 research ‘quality’ means using particular types of design and method – such as a randomised trial. This focus on minimising bias as a means of ensuring quality arises from some of the formal clinical and health approaches to assessing evidence quality, such as the GRADE68 or Maryland Scientific Methods Scale69 systems. These approaches to quality assessment for experimental evaluations are usually based on the studies’ internal validity, quality of reporting and external validity. Quality refers to how well studies have been conducted, reported and analysed,70 as well as the researchers’ integrity in not distorting or falsifying their data.71 Some people also link quality to how relevant the study is to policy and practice.72

When trying to answer a causal question, you need to consider whether the research design used for a study is appropriate for determining causality and whether the design was implemented properly in the study. High-quality impact evaluations will answer questions of attribution: showing that the intervention caused the outcomes. This requires a comparison or control group which is as similar as possible to the intervention group in all regards except the actual intervention. If this is true and the study has been well conducted, you can be more confident that, for example, the effects on the prevention of violence, reduction in family stress or faster return to work are due to the intervention. It is also important to consider whether the effects found in the study will be replicated in other places. This drives the demand for mixed methods of research and evaluation and might also require information from qualitative research.
[Figure C.1: Forest plot from the review, showing each study’s effect estimate, 95% confidence interval and weight; the visible entries include Berger 2009 (-1.269), Catani 2009 (0.251), Chen 2014 (-1.277; -0.138), Cluver 2015 (0.59), Dybdahl 2001 (-0.137), Ertl 2011 (-0.457; -0.035), Gordon 2008 (-1.116), Jordans 2010 (-0.18) and Khamis 2004 (0.205; 0.066).]

A meta-analysis is usually represented by a forest plot,81 such as that in Figure C.1, which is taken from a recent systematic review on the impact of support programmes
for populations affected by humanitarian emergencies,82 which measured post-traumatic stress disorders (PTSD) as a continuous variable. For each study in this forest plot, the red dot represents the average treatment effect of the intervention, and the parallel lines either side of the red dot represent the confidence interval for that study. The solid black vertical line running from 0 on the horizontal axis indicates no difference between using and not using the programme, and the results of all the studies are pooled to provide the overall estimate of the effects of the programmes. This new, summary statistic is the black diamond (circled in red) at the bottom of the forest plot. It represents the cumulative estimate of effect from pooling and aggregating the average effect sizes and the variances of all 28 impact evaluations included in the review. It allows us to conclude that, on average, the mental health and psychosocial support programmes have a small, positive effect on PTSD compared with not using these interventions.
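For readers who want to see the arithmetic behind this pooling, here is a minimal sketch of fixed-effect, inverse-variance meta-analysis in Python. It uses only a handful of the study estimates visible in Figure C.1, so its pooled result is illustrative rather than the review’s own finding (the review pooled all 28 effect estimates, and systematic reviews often use random-effects models instead).

```python
# Sketch of the inverse-variance pooling that a forest plot summarises.
# Effect estimates with 95% confidence intervals for a subset of the
# studies shown in Figure C.1 (estimate, lower CI, upper CI).
studies = {
    "Berger 2009":  (-1.269, -1.607, -0.932),
    "Catani 2009":  ( 0.251, -0.470,  0.971),
    "Dybdahl 2001": (-0.137, -0.558,  0.284),
    "Gordon 2008":  (-1.116, -1.595, -0.637),
    "Khamis 2004":  ( 0.205,  0.006,  0.404),
}

weighted_sum = total_weight = 0.0
for effect, lower, upper in studies.values():
    se = (upper - lower) / (2 * 1.96)  # standard error recovered from the CI
    weight = 1 / se ** 2               # more precise studies carry more weight
    weighted_sum += weight * effect
    total_weight += weight

pooled = weighted_sum / total_weight   # the 'black diamond' of the plot
pooled_se = total_weight ** -0.5
print(f"Pooled effect: {pooled:.3f} "
      f"(95% CI {pooled - 1.96 * pooled_se:.3f}"
      f" to {pooled + 1.96 * pooled_se:.3f})")
```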
RAPID EVIDENCE ASSESSMENTS

The preparation of systematic reviews and meta-analyses can be time-consuming. This means that if an up-to-date systematic review is not available, people needing to make an urgent decision might need to conduct their own searches for the relevant pieces of evidence and then appraise and synthesise these faster than would happen in a formal systematic review. Fortunately, the already large number of systematic reviews is continuing to grow rapidly, and we describe how to find them, and several collations of them, in Section D. However, if you are unable to find what you are looking for amongst the existing systematic reviews, or the reviews you find are out of date, you might need to think about commissioning a ‘pared-down systematic review’, such as a rapid evidence assessment.83 These rapid reviews normally take 1-3 months and are timed to meet the needs of policy makers and practitioners who cannot wait for a full systematic review. They use the same basic structure and stages as a full systematic review, but are not as intensive, exhaustive or comprehensive. They will take more short cuts with the searching, critical appraisal, data extraction and gathering, and statistical analysis of included studies. The findings are also presented in a shorter and less detailed form than a full systematic review, and might be no longer than 25 pages, with a three-page executive summary and a one-page briefing document for decision makers. The limitations of rapid evidence assessments are that they are not as comprehensive or exhaustive as systematic reviews and are more likely to be subject to bias. Consequently, greater caution is needed when basing a decision on evidence from a rapid evidence assessment than from a full systematic review. Notwithstanding these limitations, they are frequently commissioned and used by policy makers and programme implementers, especially where time is of the essence and no systematic reviews are available.

A particular type of rapid review, called a Rapid Research Needs Assessment, can also be used to quickly identify evidence gaps. The UK’s Public Health Rapid Support Team for disease outbreaks includes a plan to conduct these assessments with Evidence Aid, to identify important uncertainties that could be tackled by research in the early stages of a humanitarian emergency associated with a disease outbreak.

THE IMPORTANCE OF REPETITION AND CORROBORATION

Thinking about the concept of evidence gaps brings us to one of the other things that needs to be considered when assessing the quality of a summary of research studies: the number of studies that need to be included for you to be comfortable that the body
And, finally, you need to decide whether you will apply any restrictions based on language and the time period in which the research was conducted or published.

The International Initiative for Impact Evaluation (3ie) was established in 2008 and now offers four searchable databases online (www.3ieimpact.org). Two of these, the 3ie Database of Systematic Reviews and the Database of Impact Evaluations, catalogue evidence of the effectiveness of interventions in the humanitarian sector. These databases also include
We are maintaining an up-to-date, fuller list of these types of resources online, at:
www.evidenceaid.org/online-collections-of-research-for-the-humanitarian-sector

The list provides a wide range of online research resources, many of which are free and easy to access. These should be useful to any policy maker, NGO or frontline professional in the humanitarian sector, providing easy access to reliable, high-quality evidence on the effectiveness of interventions. If you would like to suggest additional resources for this list, please contact Evidence Aid: info@evidenceaid.org
CONCLUSION

To conclude this guide on the use of evidence in the humanitarian sector, we encourage you to take advantage of the freely available, accessible and actionable summaries of research, such as the systematic reviews on the websites we have listed. These will help you move quickly to sources of evidence that should inform your policy and practice.
Endnotes

1. Collins S, Sadler K. (2002) Outpatient care for severely malnourished children in emergency relief programmes: a retrospective cohort study. Lancet 360:1824-30.
2. WHO, World Food Programme, UN System Standing Committee on Nutrition, and UNICEF. (2007) Community-based management of severe acute malnutrition: a joint statement by the WHO, World Food Programme, UN System Standing Committee on Nutrition and UNICEF. New York: UNICEF.
3. Price AI, Djulbegovic B. (2017) What does evidence mean? Most languages translate “evidence” into “proof”. Journal of Evaluation in Clinical Practice 23(5):971-3.
4. Sackett D, et al. (1996) Evidence based medicine: what it is and what it isn’t. British Medical Journal 312:71. See www.bmj.com/content/312/7023/71.
5. The definition of evidence-based medicine came at a time when most medical decision making was based on experience, authority and eminence. Medical practice was not informed by the best available scientific evidence. Some commentators and researchers have argued that social policy is in the same place as medicine was 20 or 30 years ago, namely that authority, rather than research evidence, dominates decision making.
6. Allen C, et al. (2016) Evidence Aid. Oxford Public Health August:51-54; Blanchet K, et al. (2017) Evidence on public health interventions in humanitarian crises. Lancet 390:2287-96; and Christoplos I, et al. (2017) Strengthening the quality of evidence in humanitarian evaluations. ALNAP Method Note. London: ALNAP/ODI (see www.alnap.org/system/files/content/resource/files/main/alnap-eha-method-note-5-2017.pdf).
7. Knox Clarke P, Darcy J. (2014) Insufficient evidence? The quality and use of evidence in humanitarian action. ALNAP Study. London: ALNAP/ODI. See www.alnap.org/system/files/content/resource/files/main/alnap-study-evidence.pdf.
8. www.oxforddictionaries.com/definition/english/evidence.
9. Nutley S, et al. (2013) What Counts as Good Evidence? London, UK: Alliance for Useful Evidence.
10. Hallam A, Bonino F. (2013) Using Evaluation for a Change: Insights from humanitarian practitioners. ALNAP Study. London: ALNAP/ODI. See www.alnap.org/system/files/content/resource/files/main/alnap-study-using-evaluation-for-a-change.pdf.
11. Ariely D. (2009) Predictably Irrational: The Hidden Forces that Shape Our Decisions. London, UK: HarperCollins.
12. HM Treasury. (2011) The Green Book: Appraisal and Evaluation in Central Government. London, UK: HM Treasury.
13. Christoplos I. (2006) Links between Relief, Rehabilitation and Development in the Tsunami Response: A Synthesis of Initial Findings. Stockholm, Sweden: Swedish International Development Cooperation Agency.
14. Zsambok CE, Klein G (editors). (2014) Naturalistic Decision Making. New York, USA: Psychology Press.
15. Kahneman D, et al. (2009) Conditions for intuitive expertise: a failure to disagree. American Psychologist 64:515-26. See www.ncbi.nlm.nih.gov/pubmed/19739881.
16. These estimates come from Cash Learning Partnership (CaLP). (2018) The State of the World’s Cash Report. Oxford: CaLP; Doing Cash Differently: Report of the High Level Panel on Cash Transfers. (2015); and Development Initiatives. (2017) Global Humanitarian Assistance Report.
17. ODI, Development Initiatives. (2016) Counting Cash: Tracking Humanitarian Expenditure on Cash-Based Programming.
18. Oxfam. (2006) Good Practice Review 11: Cash Transfer Programming in Emergencies. Oxford: Oxfam.
19. Jackson R (Save the Children UK), Kukrety N (Oxfam GB). (2012) Institutionalising cash transfer programming. See https://odihpn.org/magazine/institutionalising-cash-transfer-programming/.
20. See www.humanitarianresponse.info/en/programme-cycle/space.
21. Discussion of the challenges of making decisions following the rare circumstances of a major radiation emergency is available in Carr Z, et al. (2016) Using the GRADE approach to support the development of recommendations for public health interventions in radiation emergencies. Radiation Protection Dosimetry 171:144-55; and Ohtsuru A, et al. (2015) Nuclear disasters and health: lessons learned, challenges, and proposals. Lancet 386:489-97.
22. Bradley DT, et al. (2014) The effectiveness of disaster risk communication: a systematic review of intervention studies. PLOS Currents Disasters August 22, Edition 1.
23. A discussion of the importance of paying attention to how donors gather, use and share evidence and information is available in Obrecht A. (2017) Using Evidence to Allocate Humanitarian Resources: Challenges and Opportunities. ALNAP Working Paper. London: ALNAP/ODI. See https://reliefweb.int/sites/reliefweb.int/files/resources/alnap-eaar-resource-allocation-2017.pdf.
24. Nesta. (2013) Understand how innovation works. Video available at www.nesta.org.uk/resources/understand-how-innovation-works.
25. Obrecht A, Warner AT. (2016) More than just luck: Innovation in humanitarian action. HIF/ALNAP Study. London: ALNAP/ODI.
26. Jones G, et al. (2003) How many child deaths can we prevent this year? Lancet 362:65-71. See https://linkinghub.elsevier.com/retrieve/pii/S0140-6736(03)13811-1.
27. Nyhan B, Reifler J. (2010) When corrections fail: The persistence of political misperceptions. Political Behavior 32:303-30.
28. White H. (2009) Theory-based impact evaluation: principles and practice. Working paper 3. Delhi, India: International Initiative for Impact Evaluation, page 4.
29. Waddington H, et al. (2009) Water, sanitation and hygiene interventions to combat childhood diarrhoea in developing countries: a systematic review. Delhi, India: International Initiative for Impact Evaluation.
30. Nesta and TSIP. (2014) Guidance for Developing a Theory of Change for Your Programme. See www.nesta.org.uk/sites/default/files/theory_of_change_guidance_for_applicants_.pdf.
31. Stern E. (2015) Impact Evaluation: A Design Guide for Commissioners and Managers of International Development Evaluations in the Voluntary and Community Sector. London, UK: Big Lottery Fund, Bond, Comic Relief and the Department for International Development.
32. Gerdin M, et al. (2014) Optimal evidence in difficult settings: improving health interventions and decision making in disasters. PLoS Medicine 11(4):e1001632.
33. Mulgan G. (2015) The six Ws: a formula for what works. London: Nesta. See www.nesta.org.uk/blog/six-ws-formula-what-works.
34. www.ebola-anthropology.net/about-the-network.
35. Faye SL. (2015) L’“exceptionnalité” d’Ebola et les “réticences” populaires en Guinée-Conakry. Réflexions à partir d’une approche d’anthropologie symétrique. Anthropologie & Santé. See https://journals.openedition.org/anthropologiesante/1796.
36. http://pubman.mpdl.mpg.de/pubman/item/escidoc:2096578/component/escidoc:2103624/AAA-Ebola-Report-1.pdf.
37. Fairhead J. (2016) Understanding Social Resistance to the Ebola Response in the Forest Region of the Republic of Guinea: An Anthropological Perspective. African Studies Review 59:7-31. doi:10.1017/asr.2016.87.
38. Abramowitz S, et al. (2015) Social intelligence in the global Ebola response. Lancet 385:330.
39. Petticrew M, et al. (2003) Evidence, hierarchies and typologies: horses for courses. Journal of Epidemiology and Community Health 57:527-9.
40. An authoritative and exhaustive list of social science research frameworks and methods is available in Luff R, et al. (2015) Review of the Typology of Research Methods within the Social Sciences. London, UK: ESRC/National Centre for Research Methods. See http://eprints.ncrm.ac.uk/3721.
41. Department for International Development. (2014) How to Note: Assessing the Strength of Evidence. See www.gov.uk/government/uploads/system/uploads/attachment_data/file/291982/HTN-strength-evidence-march2014.pdf.
42. Adapted from HM Treasury, DECC and DEFRA. (2012) Quality in policy impact evaluation: understanding the effects of policy from other influences. London, UK: HM Treasury/DEFRA/DECC; Frost S, et al. (2006) The Evidence Guide: Using Research and Evaluation in Social Care and Allied Professions. London, UK: Barnardo’s; Petticrew M, Roberts H. (2003) Evidence, hierarchies and typologies: horses for courses. Journal of Epidemiology and Community Health 57:527-9; and Stern E. (2015) Impact Evaluation: A Design Guide for Commissioners and Managers of International Development Evaluations in the Voluntary and Community Sector. London, UK: Big Lottery Fund, Bond, Comic Relief and the Department for International Development, Table 2, page 18.
43. Odgaard-Jensen J, et al. (2011) Randomisation to protect against selection bias in health care trials. Cochrane Database of Systematic Reviews (4):MR000012.
44. White H. (2013) An introduction to the use of randomised control trials to evaluate development interventions. Journal of Development Effectiveness 5(1):30-49.
45. See for example: Schulz KF, et al. (2010) CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. PLoS Medicine 7(3):e1000251; and Higgins JP, et al. (2011) The Cochrane Collaboration’s tool for assessing risk of bias in randomised trials. BMJ 343:d5928.
46. Puri J, et al. (2015) What methods may be used in impact evaluations of humanitarian assistance? Working paper 22. Delhi, India: International Initiative for Impact Evaluation. See www.3ieimpact.org/media/filer_public/2014/12/08/wp_22_humanitarian_methods_working_paper-top.pdf.
47. Doocy S, Burnham G. (2006) Point-of-use water treatment and diarrhoea reduction in the emergency context: an effectiveness trial in Liberia. Tropical Medicine and International Health 11:1542-52.
48. White H, et al. (2014) Randomised Controlled Trials (RCTs). Methodological Briefs. Impact Evaluation No. 7. Florence, Italy: Unicef. See www.unicef-irc.org/publications/pdf/brief_7_randomised_controlled_trials_eng.pdf.
49. Gertler P. (2000) Final Report: The Impact of PROGRESA on Health. Washington, DC, USA: International Food Policy Research Institute (IFPRI). See www.ifpri.org/publication/impact-progresa-health.
50. Hainmueller J, et al. (2017) Catalyst or crown: does naturalization promote the long-term social integration of immigrants? American Political Science Review 111(2):256-76. doi:10.1017/S0003055416000745.
51. Bozorgmehr K, et al. (2015) Effect of restricting access to health care on health expenditures among asylum-seekers and refugees: A quasi-experimental study in Germany, 1994–2013. PLoS ONE 10:e0131483.
52. Rossi R, et al. (2016) Vaccination coverage cluster surveys in Middle Dreib - Akkar, Lebanon: comparison of vaccination coverage in children aged 12-59 months pre- and post-vaccination campaign. PLoS ONE 11(12):e0168145.
53. Alexander J, Bonino F. (2015) A discussion on designs, approaches and examples. ALNAP Discussion Series: Improving the quality of EHA evidence, Method Note 4, January 2015. See www.alnap.org/system/files/content/resource/files/main/alnap-eha-method-note-addressing-causation-jan2015.pdf.
54. Waddington H, et al. (2017) Quasi-experimental study designs series – paper 6: risk of bias assessment. Journal of Clinical Epidemiology 89:43-52; and Bärnighausen T, et al. (2017) Quasi-experimental study designs series – paper 7: assessing the assumptions. Journal of Clinical Epidemiology 89:53-66.
55. Nielsen NS, et al. (2013) The Contribution of Food Assistance to Durable Solutions in Protracted Refugee Situations; its impact and role in Bangladesh: A Mixed Method Impact Evaluation, Volume I – Evaluation Report. Geneva, Switzerland: World Food Program/UNHCR.
56. Carter R. (2012) Helpdesk Research Report: Theory-based evaluation approach. Birmingham, UK: University of Birmingham, Governance and Social Development Resource Centre. See www.gsdrc.org/docs/open/hdq872.pdf.
57. Westhorp G. (2014) Realist Evaluation: An Introduction. London, UK: Methods Lab, Overseas Development Institute. See www.odi.org/sites/odi.org.uk/files/odi-assets/publications-opinion-files/9138.pdf.
58. Mayne J. (2012) Contribution Analysis: Coming of Age? Evaluation 18(3):270-80.
59. For network analysis and process tracing see: Befani B, et al. (2014) Process tracing and contribution analysis: a combined approach to generative causal inference for impact evaluation. IDS Bulletin 45(6):17-36.
60. Baptist C, et al. (2015) Coffey How To Guide: Qualitative Comparative Analysis – A Rigorous Qualitative Method for Assessing Impact. See www.coffey.com/assets/Ingenuity/Qualitative-Comparative-Analysis-June-2015.pdf.
61. Stern E, et al. (2012) Broadening the range of designs and methods for impact evaluations. London, UK: Department for International Development.
62. For example, see Carol Weiss’s guide:
63. Ioannidis JA. (2005) Contradicted and initially stronger effects in highly cited clinical research. Journal of the American Medical Association 294(2):218-28.
64. For a systematic review of publication bias see Hopewell S, et al. (2009) Publication bias in clinical trials due to statistical significance or direction of trial results. Cochrane Database of Systematic Reviews (1):MR000006. doi:10.1002/14651858.MR000006.pub3.
65. Franco A, et al. (2014) Social science. Publication bias in the social sciences: unlocking the file drawer. Science 345:1502-5.
66. Sense about Science. (2006) I don’t know what to believe: Making sense of science stories. See www.senseaboutscience.org/resources.php/16/i-dont-know-what-to-believe.
67. See, for instance, the supplement to the Magenta Guide: HM Treasury, DECC and DEFRA. (2012) Quality in policy impact evaluation: understanding the effects of policy from other influences. The guidance shows how ‘higher quality research designs can help meet the challenge of attributing measured outcomes to the policy in question (as opposed to other influences), whereas lower quality designs reduce confidence in whether it was the policy that achieved those outcomes’ (page 5).
68. http://gradeworkinggroup.org.
69. Farrington RP, et al. (2002) The Maryland Scientific Methods Scale. In: Farrington DP, et al. Evidence-based crime prevention. London, UK: Routledge, chapter 2.
70. For a model of different approaches on quality that includes the four dimensions of (1) methodological quality, (2) quality in reporting, (3) appropriateness and (4) relevance to policy and practice, see Boaz A, Ashby D. (2003) Fit for purpose? Assessing research quality for evidence based policy and practice. Working Paper 11. London, UK: ESRC UK Centre for Evidence Based Policy and Practice.
71. Callaway E. (2011) Report finds massive fraud at Dutch universities. Nature 479:15.
72. Boaz A, et al. (2003) Fit for purpose? Assessing research quality for evidence based policy and practice. Working Paper 11. London, UK: ESRC UK Centre for Evidence Based Policy and Practice.
73. Qualitative research explores and tries to understand people’s beliefs, experiences, attitudes, behaviour and interactions. It generates non-numerical data which might be gathered through, for example, in-depth interviews, focus groups, documentary analysis and participant observation.
74. Spencer L, et al. (2002) Quality in Qualitative Evaluation: A framework for assessing research evidence. London, UK: Cabinet Office; CASP. (2018) CASP Checklist for Qualitative Research. Oxford, UK: Critical Appraisal Skills Programme; National Institute for Health and Clinical Excellence. (2012) The guidelines manual: Appendix H: Methodology checklist: qualitative studies. London, UK: National Institute for Health and Clinical Excellence; and O’Brien BC, et al. (2014) Standards for reporting qualitative research: a synthesis of recommendations. Academic Medicine 89:1245-51.
75. Whitty CJM. (2015) What makes an academic paper useful for health policy? BMC Medicine 13:301.
76. See, for example, the discussion of the importance of using existing evidence when designing new studies in Clarke M. (2004) Doing new research? Don’t forget the old: nobody should do a trial without reviewing what is known. PLoS Medicine 1:100-2; the history of systematic reviews in Chalmers I, et al. (2002) A brief history of research synthesis. Evaluation and the Health Professions 25:12-37; and Clarke M. (2016) History of evidence synthesis to assess treatment effects: personal reflections on something that is very much alive. JLL Bulletin: Commentaries on the history of treatment evaluation. Journal of the Royal Society of Medicine 109:154-63.
77. Allen C. (2014) A resource for those preparing for and responding to natural disasters, humanitarian crises, and major health care emergencies. Journal of Evidence-Based Medicine 7:234-7; and Gurevitch J, et al. (2018) Meta-analysis and the science of research synthesis. Nature 555:175-82.
78. Oxman AD. (1994) Checklists for review articles. BMJ 309:648-51.
79. For a critique of applying research syntheses into policy, see Pawson R. (2001) Evidence Based Policy: In Search of a Method. Working Paper 3. London, UK: ESRC UK Centre for Evidence Based Policy and Practice.
80. Donnelly CA, et al. (2018) Four principles for synthesizing evidence. Nature 558:361-4.
81. Lewis S, Clarke M. (2001) Forest plots: trying to see the wood and the trees. BMJ 322:1479-80.
82. Bangpan M, et al. (2017) The impact of mental health and psychosocial support programmes for populations affected by humanitarian emergencies. Oxford, UK: Oxfam.
83. Ganann R, et al. (2010) Expediting systematic reviews: methods and implications of rapid reviews. Implementation Science 5:56; and Tricco AC, et al. (2015) A scoping review of rapid review methods. BMC Medicine 13:224.
84. Begley CG, et al. (2012) Drug development: Raise standards for preclinical cancer research. Nature 483:531-3.
85. A review of research that had included one or more cumulative meta-analyses found many examples showing that stable results (beneficial, harmful and neutral) would have been seen had a meta-analysis of existing research been done before a new randomised trial began, which would have led to earlier uptake of effective interventions: Clarke M, et al. (2014) Accumulating research: a systematic account of how cumulative meta-analyses would have provided knowledge, improved health, reduced harm and saved resources. PLoS ONE 9(7):e102670.
86. Chalmers I, et al. (2014) How to increase value and reduce waste when research priorities are set. Lancet 383:156-65.
87. Shea BJ, et al. (2007) Development of AMSTAR: a measurement tool to assess the methodological quality of systematic reviews. BMC Medical Research Methodology 7:10; and Shea BJ, et al. (2017) AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of health care interventions, or both. BMJ 358:j4008.
88. Smith V, et al. (2011) Methodology in conducting a systematic review of systematic reviews of health care interventions. BMC Medical Research Methodology 11:15.
89. See, for example: Brennan RJ, et al. (2005) Rapid health assessment in Aceh Jaya district, Indonesia, following the December 26 tsunami. Emergency Medicine Australasia 17:341-50; and Beebe J. (2014) Rapid Qualitative Inquiry: a Field Guide to Team-based Assessment, second edition. Lanham, Maryland, USA: Rowman & Littlefield.