
Evaluation and Program Planning 58 (2016) 1–19

Contents lists available at ScienceDirect

Evaluation and Program Planning

journal homepage: www.elsevier.com/locate/evalprogplan

Narrative review of strategies by organizations for building evaluation capacity

Sophie Norton a, Andrew Milat b,c, Barry Edwards b, Michael Giffin b,*

a Western Sydney Public Health Unit, New South Wales Health, 5 Fleet Street, North Parramatta, New South Wales, 2151, Australia
b Evidence and Evaluation Unit, Centre for Epidemiology and Evidence, New South Wales Ministry of Health, 73 Miller Street, North Sydney, New South Wales, 2059, Australia
c Adjunct Associate Professor, Sydney Medical School, University of Sydney, New South Wales, 2006, Australia

* Corresponding author.
E-mail addresses: sophie.norton@health.nsw.gov.au (S. Norton), amila@doh.health.nsw.gov.au (A. Milat), baedw@doh.health.nsw.gov.au (B. Edwards), mgiff@doh.health.nsw.gov.au (M. Giffin).

http://dx.doi.org/10.1016/j.evalprogplan.2016.04.004

A R T I C L E  I N F O

Article history:
Received 24 September 2015
Received in revised form 23 March 2016
Accepted 27 April 2016
Available online 11 May 2016

Keywords:
Evaluation
Program planning
Program evaluation
Capacity building
Evidence-based practice

A B S T R A C T

Program evaluation is an important source of information to assist organizations to make evidence-informed decisions about program planning and development. The objectives of this study were to identify evaluated strategies used by organizations and program developers to build the program evaluation capacity of their workforce, and to describe success factors and lessons learned. Common elements for successful evaluation capacity building (ECB) include: a tailored strategy based on needs assessment, an organizational commitment to evaluation and ECB, experiential learning, training with a practical element, and some form of ongoing technical support within the workplace. ECB is a relatively new field of endeavor, and, while existing studies in ECB are characterized by lower levels of evidence, they suggest the most successful approaches to ECB are likely to be multifaceted. To build the level of evidence in this field, more rigorous study designs need to be implemented in the future.

Crown Copyright © 2016 Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

1. Introduction

Evaluation is an important tool to assist organizations to test new ideas and learn what does and does not work. In an environment of limited resources, program evaluation is an essential source of information to assist an organization in making informed decisions (NSW Department of Premier and Cabinet, 2013, 2016). Decisions on resource allocation, based on the rigorous evaluation of individual projects and initiatives, can also foster greater organizational accountability (Carman, 2013). At the program level, evaluation allows program managers to: understand the nuances of a program, inform and make necessary modifications, and communicate this information to stakeholders (Brazil, 1999; Danseco, 2013; Heider, 2011; Suarez-Balcazar & Taylor-Ritzler, 2013; Suárez-Herrera, Springett, & Kagan, 2009; Roe & Roe, 2004). Evaluation is therefore a necessary component of programs or initiatives carried out by an organization.

Increasingly, government organizations are recognizing evaluation as an integral part of managing government programs at every stage of the program lifecycle. In Australia, the New South Wales Government Evaluation Framework August 2013 defines evaluation as "a systematic and objective process to make judgments about the merit or worth of 1 or more programs, usually in relation to their effectiveness, efficiency and appropriateness" (2). This framework mandates that evaluation takes place across the program lifecycle. While there is information available on how to conduct an evaluation, evidence of how to build an organization's evaluation capacity, so it becomes a part of everyday practice, is more elusive (Adams & Dickinson, 2010; Fleischer, Christie, & LaVelle, 2008; Labin, Duffy, Meyers, Wandersman, & Lesesne, 2012; Naccarella et al., 2007).

There are differing opinions about what evaluation capacity and evaluation capacity building (ECB) in an organization entails. Naccarella et al. (2007) define evaluation capacity building as equipping staff within organizations with the appropriate skills to conduct rigorous evaluations, and doing so in a manner that acknowledges and ensures that such evaluations become part of routine practice. Prior literature in the field of ECB depicts the variety of ECB methods used by organizations. This literature describes strategies and organizational features purported to support an increase in ECB (Connolly & York, 2002; Cousins, Goh, Clark, & Lee, 2004; Nielsen, Sebastian, & Majbritt, 2011; Preskill & Boyle, 2008b). While the literature offers expert knowledge, which
provides a basis on which to construct ECB activities, rigorous evaluation of these activities is often absent.

Our study uses a meta-narrative approach to review and synthesize the literature describing strategies, models and frameworks for organizational evaluation capacity building. This approach, which has emerged over the last decade, helps to characterize and conceptualize information in a specific topic area taken from heterogeneous literature (Greenhalgh et al., 2005; Wong, Greenhalgh, Westhorp, Buckingham, & Pawson, 2013). The objectives of our study were to identify ECB strategies employed by organizations and program developers, focus on those efforts that have concurrently or subsequently undergone evaluation, and describe key success factors and lessons learned.

2. Methods

2.1. Literature review

Our literature review included publications that focused on strategies, frameworks, and processes that build program evaluation capacity on an individual, program, and organizational level. The study design, search terms, and databases covered in our literature search are described in Table 1.

The literature review was conducted in 2 phases in January 2015. In Phase 1, abstracts were retrieved and reviewed against the review criteria. In Phase 2, full papers were retrieved for abstracts that met the review criteria in Phase 1. These were assessed against the review criteria.

Studies included in the review met the following criteria:

• were published in English from January 1990 to January 2015, and
• clearly articulated a process, theory, model, or framework for building the capacity to execute program evaluation within an organization, or
• described a tool or method developed to assist individuals within organizations with the process of executing evaluations, or
• were theoretical and opinion pieces, case studies, descriptive studies, or intervention studies.

Studies excluded from the review after Phases 1 and 2 were those that:

• evaluated capacity building programs not related to evaluation capacity building (for example, those that described management capacity building),
• described an evaluation of a clinical intervention or specific health program (including systematic reviews of such studies), or
• were dissertations that could not be accessed.

The search process is summarized in Fig. 1. During the search, adjacency of the search terms was used to connect the terms more closely. Capacity and capability were searched for adjacency to the words building and developing by 3 words. This is depicted as adj3. The resultant combinations of search terms were then searched for adjacency to evaluation by 10 words, depicted as adj10. The symbol $ was then used to truncate develop and evaluate to include variations on both of these words in the search. Finally, the first 20 pages in the Google search were reviewed for appropriate inclusions.

2.2. Thematic analysis

A third phase involved a team discussion of included publications to identify key themes. The team refined the inclusion criteria to include only studies that used qualitative and/or quantitative research methods to assess the reach, acceptability, and/or effectiveness of evaluation capacity building activities and strategies. The first author conducted the thematic analysis according to the refined inclusion criteria and key themes. Information about the included studies is summarized in Appendix A, which provides details of: study type, level of evidence, target audience for the ECB activity such as the sector or staff type trained, country the strategy or study occurred in, elements/activities of the ECB strategy/framework/effort, how the ECB activity/strategy/framework was evaluated, and important findings or success factors/barriers. The first author also recorded the frequency of mentions of success factors and barriers. These are provided in Appendix B.

2.3. Levels of evidence

A cornerstone of evidence based practice across health and human services is the hierarchical system of classifying evidence (Centre for Evidence-Based Medicine, 2009). This hierarchy is known as the levels of evidence. There is a range of evidence hierarchies. This review used an adaptation of the Australian National Health and Medical Research Council's 4 levels of evidence (National Health and Medical Research Council, 2000), which is comparable to other internationally accepted hierarchies. The authors added a 5th level which includes qualitative perspectives and expert opinions. These 5 levels are described in greater detail below:

• Level I: evidence obtained from a systematic review of all relevant randomized controlled trials (RCTs).
• Level II: evidence obtained from at least 1 properly designed RCT.
• Level III-1: evidence obtained from well-designed pseudo RCTs (alternate allocation or some other method).
• Level III-2: evidence obtained from comparative studies with concurrent controls and allocation not randomized, cohort studies, case-control studies, or interrupted time series with a control group.
• Level III-3: evidence obtained from comparative studies with historical control, 2 or more single arm studies, or interrupted time series without a parallel control group.
• Level IV: evidence obtained from case series, either post-test or pre-test/post-test.
• Level V: qualitative perspectives and expert opinions.

Consistent with other levels of evidence in medicine (Centre for Evidence-Based Medicine, 2009), Level V was added to the evidence hierarchy in order to capture relevant qualitative perspectives and expert opinions in the narrative review.

Table 1
Study designs, search terms, and databases included in the literature search of program and organizational evaluation capacity building.

Study descriptions: theoretical and opinion pieces; case studies; descriptive studies; intervention studies; frameworks; systematic reviews.
Review search terms: capacity building OR capability building OR capability development OR capacity development OR facility; AND evaluation.
Review databases: MEDLINE (general medicine): 1946 to present; OVID Nursing Database: 1946 to present; EMBASE: 1980 to 2014; AMED (Allied and Complementary Medicine): 1985 to Jan 2015; EBM Reviews: Cochrane Database of Systematic Reviews: 2005 to Nov 2014; PsycINFO (psychology and related behavioral and social sciences): 1987 to week 5 Dec 2014; Google; Google Scholar.

No limits: 1990–January 2015
capacity adj3 building n = 9,611
capability adj3 building n = 205
capacity adj3 develop$ n = 9,328
capability adj3 develop$ n = 1,638
evaluat$ n = 6,069,007

Limited to English, Abstract and Human: 1990–Jan 2015
capacity adj3 building n = 7,379
capability adj3 building n = 138
capacity adj3 develop$ n = 5,180
capability adj3 develop$ n = 865
evaluat$ n = 3,540,946
Total n = 163,378

Evaluat$ and search terms combined with AND
1. capacity adj3 building adj10 evaluat$ n = 491
2. capability adj3 building adj10 evaluat$ n = 9
3. capacity adj3 develop$ adj10 evaluat$ n = 194
4. capability adj3 develop$ adj10 evaluat$ n = 20
Total n = 682

Excluded: duplicate records (n = 253); did not meet review criteria (n = 282).
Included: additional publications identified in Google search: grey literature (n = 3); peer reviewed articles (n = 7).

Phase 1: review of abstracts (n = 157). Excluded: did not meet review criteria after abstract review (n = 69); dissertations (n = 7).
Phase 2: review of titles and full papers (n = 81). Excluded: did not meet review criteria after review of full papers (n = 17).
Included in review, focus on concepts, theories and models (n = 64): ECB models of concepts (n = 9); ECB models/frameworks for practical application (n = 29); evaluation of ECB (n = 26). Excluded: did not include an evaluation of an ECB intervention or ECB framework implementation (n = 43).
Phase 3: inclusion of studies using research methods to assess reach, acceptability, effectiveness of ECB strategies or implementation of a framework (n = 21).

Fig. 1. Literature search process and numbers of papers identified, excluded and included in the literature search of program and organizational evaluation capacity building.

3. Results

3.1. Exclusion criteria

Of the 64 publications included after the Phase 1 and Phase 2 review processes, 43 were excluded for not formally assessing an ECB framework or activity. Of these excluded publications, 11 were commentaries or opinion pieces, 2 were frameworks written for specific organizations (gray literature), 22 were descriptive pieces
such as case studies without an element of ECB assessment, 3 surveyed expert opinion regarding an ECB framework or activity which was not based on implementing a specific intervention, 2 assessed an organization's capacity to evaluate using a capacity profiling framework, 1 was the development and validation of an evaluation capacity assessment instrument, 1 was a systematic review that did not focus specifically on evaluated studies, and 1 described a method for and conducted a needs assessment.

3.2. Inclusion criteria

The 21 publications included in the review were non-experimental intervention studies (Appendix A). Sixteen did not use a comparison group and 5 incorporated comparison of the intervention group over different time points of the ECB effort. This includes comparison of the quality and numbers of evaluation final reports being produced by the organization over a 4-year period (Fourney, Gregson, Sugerman, & Bellow, 2011); comparison of evaluation final report scores over 2 funding periods (Satterlund, Treiber, Kipke, Kwon, & Cassady, 2013), or comparison of evaluation final report scores at 2 time points (Treiber, Cassady, Kipke, Kwon, & Satterlund, 2011) after introduction of the ECB effort; baseline, follow up, and final surveying to assess an organization's ECB effort (Akintobi et al., 2012); and pre- and post-training workshop testing of knowledge and attitudes (McDuff, 2001).

3.3. Types of ECB strategies implemented

Implemented ECB intervention strategies had the following characteristics:

• 20 included training/workshops as a core ECB component (Akintobi et al., 2012; Bamberg, Perlesz, McKenzie, & Read, 2010; Campbell & Longo, 2002; Compton, MacDonald, Baizerman, Schooley, & Zhang, 2008; Cooke, Nancarrow, Dyas, & Williams, 2008; Dickinson & Adams, 2012; Fleming & Easton, 2010; Fourney et al., 2011; García-Iriarte, Suarez-Balcazar, Taylor-Ritzler, & Luna, 2011; Higa & Brandon, 2008; Kaye-Tzadok & Spiro, 2014; Levine, Russ-Eft, Burling, Stephens, & Downey, 2013; McDonald, Rogers, & Kefford, 2003; McDuff, 2001; NuMan, King, Bhalakia, & Criss, 2007; Preskill & Boyle, 2008a; Satterlund et al., 2013; Treiber et al., 2011; Sundar, Kasprzak, Halsall, & Woltman, 2011; Volkov, 2008).
• 14 included provision of technical assistance, mentoring, coaching, and/or individual or organizational consultancy (Akintobi et al., 2012; Compton et al., 2008; García-Iriarte et al., 2011; Levine et al., 2013; Preskill & Boyle, 2008a; Satterlund et al., 2013; Treiber et al., 2011; Bamberg et al., 2010; Campbell & Longo, 2002; Cooke et al., 2008; Dickinson & Adams, 2012; Higa & Brandon, 2008; Sundar et al., 2011; Fourney et al., 2011).
• 12 incorporated partnership development or collaboration with another body to support evaluation (Akintobi et al., 2012; Cooke et al., 2008; Compton et al., 2008; Fleming & Easton, 2010; García-Iriarte et al., 2011; Lennie, 2005; Levine et al., 2013; McDonald et al., 2003; McDuff, 2001; Preskill & Boyle, 2008a; Treiber et al., 2011; Volkov, 2008).
• 11 included development or provision of evaluation tools and/or written materials such as guides (Bamberg et al., 2010; Fourney et al., 2011; Lennie, 2005; Preskill & Boyle, 2008a; Satterlund et al., 2013; Treiber et al., 2011; Volkov, 2008; Higa & Brandon, 2008; Compton et al., 2008; McDonald et al., 2003; Sundar et al., 2011).
• 3 included either a needs assessment (McDuff, 2001; NuMan et al., 2007) or an evaluation capacity assessment (Akintobi et al., 2012) of the organization as an important component of the study.
• 3 included having a dedicated evaluation team or internal evaluation staff (Cooke et al., 2008; McDonald et al., 2003; Volkov, 2008), with 1 of these strategies highlighting the importance of having protected time to carry out research (Cooke et al., 2008).

3.4. Settings and organizational characteristics

Ten (48%) of the 21 studies targeted or were funded by government organizations, 7 (33%) by non-government organizations, 3 (14%) were cross sector, and 1 (5%) was unclear. Four of the non-government organizations were indicated as not-for-profit, and 3 were university-driven studies.

Thirteen (62%) studies were based in the United States of America, 3 (14%) in Australia, and the remaining 5 (24%) studies were based in the United Kingdom, New Zealand, Israel and Kenya. Seventeen (81%) of the studies were published since 2006.

3.5. Levels of evidence

The review has contrasted the ways in which researchers have studied ECB and highlighted that a large proportion of the available studies are descriptive in nature rather than intervention studies. Less than half of the studies that passed Phase 2 of the review process used either qualitative or quantitative research methods to assess ECB. Those studies included after Phase 3 were intervention studies which directly assessed an ECB effort, or some aspect of an ECB effort, including assessing documentation to retrospectively garner information about the ECB activities that organizations have implemented (Appendix A).

3.6. Success factors and barriers

Despite the lack of high quality evidence, the 21 papers included in the review were thematically analyzed to determine key success factors and barriers to evaluation capacity building. The papers were reviewed and a code frame developed by the first author in discussion with co-authors. Success factors and barriers were not always explicit, and in some studies success factors were the elements within the presented strategy or framework on which an intervention was assessed.

3.6.1. Multi-dimensional approach with experiential learning

This review highlights that a successful ECB effort requires a multi-dimensional approach, with all studies either explicitly or implicitly identifying this within their frameworks or ECB efforts. Evaluation training was presented as a key element across most of the studies (Dickinson & Adams, 2012; Cooke et al., 2008; Fleming & Easton, 2010; Fourney et al., 2011; García-Iriarte et al., 2011; Higa & Brandon, 2008; Kaye-Tzadok & Spiro, 2014; Levine et al., 2013; McDonald et al., 2003; McDuff, 2001; NuMan et al., 2007; Preskill & Boyle, 2008a; Satterlund et al., 2013; Compton et al., 2008; Sundar et al., 2011; Treiber et al., 2011; Volkov, 2008).

3.6.2. Organizational features and support

The importance of organizational structure and attitude to evaluation for ECB was a common theme in these studies (Cooke et al., 2008; Compton et al., 2008; Dickinson & Adams, 2012; Fleming & Easton, 2010; McDonald et al., 2003; Preskill & Boyle, 2008a; Sundar et al., 2011; Volkov, 2008). This included viewing evaluation as an organizational focus or priority, increasing communication about evaluation within the organization, and demonstrating strong evaluation leadership, which is seen to
influence the evaluation culture within an organization. Strong evaluation leadership could be demonstrated by recognizing and supporting evaluation in a number of ways: embedding evaluation into work processes through policy and procedures that uphold evaluation expectations (Compton et al., 2008; Fourney et al., 2011; Kaye-Tzadok & Spiro, 2014; Sundar et al., 2011; Dickinson & Adams, 2012; García-Iriarte et al., 2011; Volkov, 2008); having adequate program monitoring and information systems (Treiber et al., 2011; Compton et al., 2008; Fourney et al., 2011; Fleming & Easton, 2010; Levine et al., 2013; McDuff, 2001; McDonald et al., 2003; Volkov, 2008; Satterlund et al., 2013); ensuring there is adequate funding for evaluation in the program budget (Campbell & Longo, 2002; Compton et al., 2008; Lennie, 2005; Levine et al., 2013; Volkov, 2008; Preskill & Boyle, 2008a; NuMan et al., 2007); and taking concrete steps to provide support to staff in carrying out evaluations in whatever form that takes (expertise, supervision, time, extra staff).

An experiential learning approach to training, through inclusion of practical application of knowledge to the participants' work situation, was seen by a number of the studies as being crucial to effective training (Akintobi et al., 2012; Campbell & Longo, 2002; Preskill & Boyle, 2008a; NuMan et al., 2007; Kaye-Tzadok & Spiro, 2014; Dickinson & Adams, 2012; Fleming & Easton, 2010; McDonald et al., 2003; McDuff, 2001; Satterlund et al., 2013; Sundar et al., 2011; Volkov, 2008).

Provision of technical support in the workplace (through mentorship, coaching and external evaluators) was seen as a crucial element of many of the ECB strategies (Akintobi et al., 2012; Bamberg et al., 2010; Campbell & Longo, 2002; Compton et al., 2008; Dickinson & Adams, 2012; Fleming & Easton, 2010; Fourney et al., 2011; Higa & Brandon, 2008; Levine et al., 2013; McDonald et al., 2003; Preskill & Boyle, 2008a; Treiber et al., 2011; Volkov, 2008) and is an adjunct or alternative to practicing skills within a training program. Alternative or extra supports that were also favored in the review included: having access to evaluation tools such as guides, templates, and program logic (Akintobi et al., 2012; Bamberg et al., 2010; Compton et al., 2008; Fourney et al., 2011; García-Iriarte et al., 2011; Higa & Brandon, 2008; McDonald et al., 2003; Satterlund et al., 2013; Sundar et al., 2011; Treiber et al., 2011; Volkov, 2008); and connecting, networking and sharing ideas, for example, with other people or organizations evaluating similar programs (Campbell & Longo, 2002; Cooke et al., 2008; Lennie, 2005; Fleming & Easton, 2010; Preskill & Boyle, 2008a; Sundar et al., 2011). All of the ECB efforts included in this review use different combinations of these supports. Part of organizational support for evaluation, which fosters buy-in from staff, is having an organizational expectation and a mechanism to use evaluation outcomes in a timely way to inform ongoing practice (Akintobi et al., 2012; Fleming & Easton, 2010; Lennie, 2005; Sundar et al., 2011; Compton et al., 2008; Fourney et al., 2011; NuMan et al., 2007; Satterlund et al., 2013; Treiber et al., 2011; García-Iriarte et al., 2011; McDonald et al., 2003; Volkov, 2008).

3.6.3. Organizational evaluation capacity assessment or needs assessment

Carrying out organizational evaluation capacity assessment or needs assessment, as either a precursor to undertaking ECB activities or to inform ongoing efforts in this area, is another common theme found in this review (García-Iriarte et al., 2011; McDuff, 2001; NuMan et al., 2007; Satterlund et al., 2013; Treiber et al., 2011). Different departments in an organization may have different baseline levels of evaluation capability and different needs for evaluation depending on the programs and projects that they implement. McGeary (2009) suggests that part of a needs assessment includes first determining the ECB target status. A more targeted approach to implementing ECB strategies, tailored to the contextual background of the organization and the types of programs they evaluate, can then be taken (Akintobi et al., 2012; García-Iriarte et al., 2011; Lennie, 2005; McDuff, 2001; NuMan et al., 2007; Preskill & Boyle, 2008a; Satterlund et al., 2013; Treiber et al., 2011).

3.7. ECB implementation challenges

Challenges to implementation of ECB interventions and activities were not always explicit in the included studies. Factors mentioned as having posed a challenge to implementing ECB activities that were highlighted within this review are having: a lack of time to evaluate (Akintobi et al., 2012; Bamberg et al., 2010; Campbell & Longo, 2002; Dickinson & Adams, 2012; Fleming & Easton, 2010; Kaye-Tzadok & Spiro, 2014; Lennie, 2005; Levine et al., 2013; Compton et al., 2008; Volkov, 2008); a lack of dedicated financial support or support generally (Campbell & Longo, 2002; Compton et al., 2008; Lennie, 2005; Levine et al., 2013; Volkov, 2008); a lack of research and evaluation infrastructure (Akintobi et al., 2012; Campbell & Longo, 2002; Lennie, 2005; Levine et al., 2013); staff turnover (Akintobi et al., 2012; Compton et al., 2008; Levine et al., 2013; Satterlund et al., 2013); conflicting agendas of different stakeholder groups (Lennie, 2005; Compton et al., 2008; Volkov, 2008); an inability to apply new knowledge to practice (Bamberg et al., 2010; Dickinson & Adams, 2012); institutional resistance to evaluation (Fleming & Easton, 2010; Levine et al., 2013; Volkov, 2008); varying levels of staff evaluation expertise (Compton et al., 2008; Satterlund et al., 2013); a lack of staff involvement or buy-in (Akintobi et al., 2012); a lack of incentives and rewards for evaluation (Satterlund et al., 2013); a lack of a purposeful long-term ECB plan (Volkov, 2008); limited local staff recognition and use of ECB services (Satterlund et al., 2013); and increased demand for evaluation and requests for support from other divisions and agencies once evaluation capacity improved (McDonald et al., 2003).

3.8. ECB evaluation challenges

Challenges to evaluating ECB activities or efforts were less defined and often not discussed. Bamberg identified that the multiple elements or conditions required for a successful ECB effort were frequently interdependent and reliant on one another (Bamberg et al., 2010). This speaks to some of the difficulty related to teasing out and measuring which interventions and elements have the most impact on ECB. Evaluating the multidimensionality needed for an ECB effort is further complicated by the difficulty of empirically quantifying, and therefore measuring, beneficial elements (Fourney et al., 2011).

A number of studies' ECB efforts were incorporated within the implementation and evaluation of the program/s for which evaluation capacity is being built (Akintobi et al., 2012; Lennie, 2005; NuMan et al., 2007). As a result, ECB-specific evaluation was just one part of the evaluation process and often not well defined, considered or evaluated separately from the evaluation of the actual program.

According to Satterlund et al. (2013), ECB is an inexact science. While theirs was one of the few studies that paid attention to impact assessment (that is, of a Tobacco Control Evaluation Center), they recommend standardized pre-intervention measurement and evaluation assessment including skills, knowledge and attitude, but acknowledge that they did not carry out this pre-assessment, only carrying out some measurement during funding cycles after implementation had commenced. This is echoed by Akintobi et al. (2012), who also acknowledge that evaluability assessment gives a good baseline for the organization as it defines what is needed and how program success would be determined
and how collected data would be used to assess program success (Akintobi et al., 2012).

Other challenges were related to bias that would have been introduced through survey self-assessments (Kaye-Tzadok & Spiro, 2014) and also related to poor questionnaire response (Kaye-Tzadok & Spiro, 2014; Treiber et al., 2011). Avoiding interviewer bias, and ensuring the content validity of interviews, required effort in training of the interviewer (Higa & Brandon, 2008). Others noted that evaluation findings may not be translatable to other organizations and programs (Bamberg et al., 2010) or even within a similar organization if a different evaluation coordinator is used (García-Iriarte et al., 2011). There was also an acknowledgement that evaluation did not always address evaluation of retention (Treiber et al., 2011) or long term sustainability of an ECB effort (García-Iriarte et al., 2011).

4. Discussion

As 81% of reviewed studies were published since 2006, evaluation capacity building is a growing field of interest. We only reviewed 21 studies that assessed ECB interventions, with the majority of ECB papers published providing narrative accounts of ECB efforts and conceptualizations of ECB frameworks.

Many of the reviewed studies only used qualitative research methods to assess the impact of ECB efforts. Intervention studies often assessed aspects such as reach and acceptability, with only a small number of studies using a research design that would allow attribution of intervention effects, albeit weak. None of the reviewed studies could establish a causal link between ECB efforts and more effective evaluation practice. Applying the adapted National Health and Medical Research Council's Levels of Evidence, the evidence strength of most studies was Level IV and Level V, suggesting the field can strengthen methods used to assess ECB efforts. Increasingly, decision makers across fields are encouraged to find the highest level of evidence to answer policy questions (Centre for Evidence-Based Medicine, 2009) and ECB efforts should be no exception.

While many of the studies have lower levels of evidence, they still offer important insights, and an understanding of how organizations have attempted to build evaluation capability and how they have addressed challenges to ECB efforts. This provides an important pool of information about perceived barriers and enablers to ECB. This review highlights that a successful ECB effort requires a multi-dimensional approach. All reviewed studies explicitly or implicitly identified this in their frameworks or ECB efforts.

While building knowledge through training in the theory and methods of how to conduct evaluations is important where there are knowledge gaps in an organization, the reviewed studies suggest that taking this knowledge back to the workplace and applying it to practice is challenging. This challenge is often related to a lack of confidence in how to apply learnt techniques to evaluation practice (Bamberg et al., 2010; Dickinson & Adams, 2012).

The review studies suggest that experiential learning approaches can go some way to address these challenges (Kolb, 2012) by offering a more holistic approach to learning. The premise of this approach is about including concrete experiences which form the basis for observations and reflections. These reflections are assimilated and distilled into abstract concepts from which new implications for action can be drawn. These implications can be actively tested and serve as guides in creating new experiences (Kolb, 2012). The review studies have recognized this by employing multidimensional interventions so that learning goes beyond training to provide an ongoing experiential process of growth.

The review studies employed a variety of methods to incorporate experiential approaches into ECB efforts, such as: training workshops that are interactive (Satterlund et al., 2013); training developed based on workplace evaluability assessment or observed need (Akintobi et al., 2012; NuMan et al., 2007; Satterlund et al., 2013; Treiber et al., 2011; McDuff, 2001); incorporating within training project planning, logic model development and program tool development which is specific to participants' workplace programs (Dickinson & Adams, 2012; Fleming & Easton, 2010); conducting a workplace-based evaluation as a core element of the training course (Kaye-Tzadok & Spiro, 2014; McDuff, 2001); or offering web-based training that can potentially be completed at the most useful time for the participant in their learning process (Sundar et al., 2011). In essence, to be able to continue the evaluation growth journey, experiential learning must continue beyond training.

Technical assistance or support was overwhelmingly recognized by review studies as an important element of any ECB effort (Akintobi et al., 2012; Bamberg et al., 2010; Campbell & Longo, 2002; Compton et al., 2008; Dickinson & Adams, 2012; Fourney et al., 2011; Higa & Brandon, 2008; McDonald et al., 2003; Preskill & Boyle, 2008a; Treiber et al., 2011; Fleming & Easton, 2010; Levine et al., 2013; Volkov, 2008). Assistance is crucial in promoting the development of the new evaluator while transitioning from the classroom through to the early stages as an evaluation practitioner. It is at this point that technical assistance and support is particularly important to problem solve and build confidence.

The review studies used a variety of methods for providing technical support including: external or internal consultancy services (Akintobi et al., 2012; Dickinson & Adams, 2012; Higa & Brandon, 2008; Preskill & Boyle, 2008a); and mentoring in an individual or group format with a consultant (Bamberg et al., 2010). Others used an internal research and evaluation unit with an evaluation lead and a contracted consultant (Volkov, 2008), or an evaluation support team, which may include individual organizations choosing to contract out some parts of the evaluation such as data collection, using expert mentors, or submitting work for expert review (McDonald et al., 2003). Other ECB efforts offered a nationwide or state-based multi-modal, program-specific central office which offered site visits, telephone and email contact, and workshops and training (Compton et al., 2008; Treiber et al., 2011). Other models of technical support involved improving the ability to process and analyze research data by providing access to efficient infrastructure and expert biostatistical support (Campbell & Longo, 2002; Treiber et al., 2011; Compton et al., 2008; Fourney et al., 2011; Levine et al., 2013; Fleming & Easton, 2010; McDonald et al., 2003; McDuff, 2001; Satterlund et al., 2013; Volkov, 2008).

Organizational attitude and structure that promotes evaluation and ECB was also considered an important element in providing a working environment conducive to ongoing growth in evaluation learning. Vital to achieving a conducive environment, an organization must provide tangible support through provision of sufficient funding for evaluation to be incorporated adequately into programs (Campbell & Longo, 2002; Compton et al., 2008; Lennie, 2005; Levine et al., 2013; Volkov, 2008) and by preventing or removing technical barriers.

Ensuring that evaluation results are actually acknowledged and used to improve programs was thought to foster organizational learning in the program area being evaluated, which will ultimately contribute to a sense of the value of evaluation and, by doing so, will encourage future evaluative activity (Akintobi et al., 2012; Fleming & Easton, 2010; Lennie, 2005; Sundar et al., 2011; Compton et al., 2008; Fourney et al., 2011; NuMan et al., 2007; Satterlund et al., 2013; Treiber et al., 2011; García-Iriarte et al., 2011; McDonald et al., 2003; Volkov, 2008).

The issue of sustainability of evaluation capacity building within an organization past the initial effort is an area that merits further examination. Many of the included studies built their ECB efforts over a number of years. How sustainability is best attained in concrete terms is less apparent. Fourney et al. (2011) refer to the need to address large scale questions like dosage and type of intervention for ECB sustainability. This points to the need for rigorous, long term evaluation to ascertain what works initially and what will maintain the level of evaluation knowledge, skills, attitude and behaviors that actually generates evaluation activity within an organization, to inform organizational policy.

The review studies focused more on measuring individual and organizational ECB capacity and staff skills rather than rigorous assessment of the impact of ECB interventions on evaluation activity. A number of the studies present a retrospective evaluation of ECB activities (García-Iriarte et al., 2011; McDonald et al., 2003; Preskill & Boyle, 2008a; Compton et al., 2008; Volkov, 2008). Other studies measured whether evaluation activity actually increased within the organization as a result of ECB efforts. Activities measured included: completion of a program evaluation based on an evaluation plan developed in a course (Kaye-Tzadok & Spiro, 2014); increases in external presentations and publications by principal investigators (Levine et al., 2013); numbers of grant proposals submitted and funded (Campbell & Longo, 2002); numbers who had used program logic, developed an evaluation plan, criteria and standards, and had completed or were undertaking an evaluation (Dickinson & Adams, 2012); and numbers of people who started and completed an evaluation and used the results to improve their program (Fleming & Easton, 2010). Other included studies empirically measured evaluation activity by counting numbers of completed evaluation reports and in some cases scored the quality of the reports (Fourney et al., 2011; Satterlund et al., 2013; Treiber et al., 2011). The lack of rigorous ECB impact assessment on evaluation activity has meant that ultimately there has been less of a focus on the impact of ECB on informing policy and practice decision making, which should be a focus of future ECB research. This study highlights the importance of using approaches like meta-narrative reviews as a way of assessing qualitative and mixed methods studies until a stronger evidence base develops in the future (Greenhalgh et al., 2005).

5. Conclusion

ECB is a relatively new field, with a small number of experimental studies identified in the review. In light of this, it is difficult to draw strong conclusions about the best methods of building ECB because of the inability to attribute the effects of evaluation capacity on evaluation practice. More rigorous study designs will need to be implemented in the future to build the strength of evidence in this field of study. While existing studies in ECB are characterized by lower levels of evidence, they do suggest that the most successful approaches to ECB are likely to be multifaceted.

6. Lessons learned

Common elements for successful evaluation capacity building (ECB) include: a tailored strategy based on needs assessment; an organizational commitment to evaluation and ECB specifically; and experiential learning, training with a practical element, and some form of ongoing technical support within the workplace. ECB is a relatively new field of endeavor. While existing studies in ECB are characterized by lower levels of evidence, they suggest the most successful approaches to ECB are likely to be multifaceted. More rigorous study designs need to be implemented in the future, to build the level of evidence in this field.

Appendix A.

Table A1
Summary of characteristics of studies describing interventions for evaluation capacity building that have been evaluated. Each entry lists: the reference; study type or level of evidence; model or framework used; intervention/strategy or elements within the model/framework; target audience or setting and the country of study occurrence; how ECB was evaluated; and success factors/strengths and challenges/barriers.
Akintobi et al., 2012
Study type/level of evidence: Intervention study: non-randomized; Level IV evidence.
Model or framework used: Implementation of an evaluation capacity building strategy.
Intervention/strategy: A 3 partner collaborative was set up to identify and respond to evaluation capacity needs of organizations; an evaluability assessment was carried out with program implementers to identify program scope, objectives, data capabilities and evaluation capacity building needs; a program logic was developed for programs; technical assistance was provided; site-visits were carried out for relationship building and document review (for the assessment); evaluation capacity building activities such as didactic presentations, training workshops, web conferences and teleconferences were developed based on the evaluability assessment; feedback and discussion sessions from training were used to enhance evaluation capabilities and efforts.
Target audience/setting and country: Community based organizations, HIV primary and secondary prevention programs, funded by a non-profit organization; USA.
How ECB was evaluated: Mixed qualitative and quantitative methods were used, including: semi-structured interviews and document review in the evaluability assessments; baseline, follow-up and final assessments of organizations' evaluation capacity performed using a tool called the cross-site program assessment survey (C-PAS), which tested self-reported knowledge and skills related to key steps in evaluation.
Success factors: face-to-face teaching; practical hands-on learning opportunities (relevant to participants' current programmatic needs); electronic tools and templates; preferred content: data entry training, logic model and qualitative training; ongoing assessment to increase local buy-in; employing mixed methods for program evaluation to develop effective tailored strategies; facilitating ongoing technical support.
Challenges: related to data collection, entry and analysis; staff resources and time; staff involvement and buy-in.

Bamberg et al., 2010
Study type/level of evidence: Intervention study: non-randomized; Level V evidence.
Model or framework used: Ely's 8 Conditions for Change model.
Intervention/strategy: Conditions required for organizational change are: dissatisfaction with the status quo; knowledge and skills; available resources; available time; rewards or incentives; participation is expected and encouraged; commitment by those involved; leadership. Using a cooperative inquiry approach, the research and evaluation capacity strategies that were employed were: appointment of a research and evaluation consultant; development of a 10 hour research and evaluation training program; development of a site on the intranet dedicated to resources for evaluation; provision of mentoring and coaching of staff throughout projects; showcasing of staff research and evaluation initiatives.
Target audience/setting and country: Community health programs, a non-profit organization; Australia (Victoria).
How ECB was evaluated: Qualitative and quantitative methods were used to assess the strategy against Ely's change model, including: semi-structured interviews; observations and recording of relevant meetings; thematic analysis of data. Training participants were given a pre- and post-test questionnaire using a 7 point Likert scale. Measured: number of requests for research and evaluation assistance at 2 time points during the 2 years of data collection; access rate of website resources at 2 time points.
Success factors: ongoing support, mentoring and coaching of staff was identified as necessary to complement research and evaluation training; tools and resources for evaluation available via an intranet site; rewards and incentives through organizational recognition and presentation at conferences; ongoing staff consultation; ongoing management support of evaluation through funding, policy and procedure, inclusion in staff position descriptions and orientation and in program planning, and evaluation positions.
Challenges: training was well received and provided theoretical evaluation knowledge but lacked practical experience to promote staff confidence in conducting their own projects; staff time.

Campbell & Longo, 2002
Study type/level of evidence: Intervention study: non-randomized; Level IV evidence.
Model or framework used: Increasing the ability of general practices to secure research funding to increase research capacity.
Intervention/strategy: Activities included: holding a number of sessions over a 1 year period offering training in the design and writing of external grant proposals; support and mentoring via email, telephone, post and in person.
Target audience/setting and country: Family medicine, non-government university-driven study; USA.
How ECB was evaluated: Survey of 23 former participants including: rating the components on a scale of how helpful they were; and several open-ended questions. Surveys were mailed to participants over a 1 year period.
Strengths: receiving guidance and feedback; learning from the experiences of peers; development of grant writing skills (completing a mock study was particularly useful); participants were able to mentor and teach others in grant writing; participants were able to connect with other researchers.
Challenges: lack of a supportive research environment (financial, local mentoring, infrastructure and administrative); lack of time; lack of information re. funding opportunities.

Cooke et al., 2008
Study type/level of evidence: Intervention study: non-randomized; Level V evidence.
Model or framework used: The designated research team (DRT) approach.
Intervention/strategy: This initiative included awarding of funding for the following strategies: enabling protected time for research by small teams of aspiring researchers; inclusion of 1 novice researcher, and 1 researcher linked to an academic institution, in each team (encouraging an apprenticeship system). A range of research capacity building approaches were used: training in research skills; mentorship; supervision; partnership development; protected time from clinical work.
Target audience/setting and country: Primary care (GPs and practice staff, allied health professionals, community nurses, social workers and pharmacists), non-government, university-driven study; UK.
How ECB was evaluated: Qualitative and quantitative methods were used to evaluate the initiative by analyzing process and outcome indicators for 6 DRTs using: meeting minutes; written reports (provided 6 monthly by the DRT); notes from discussion with research coordinators; reflective sessions with teams; feedback from DRT leads. Numbers or evidence of the following were examined: projects completed; peer review publications; grant applications; training undertaken; conference presentations; local dissemination. Indicators were derived from a previously developed framework of constructs (Cooke, 2005).
Success factors: having team members who had expertise in or previous experience undertaking research; an organizational culture of support for research; developing linkages and collaborations that could impact sustainability; gaining skills through training.
Barriers: not having formulated a focused idea prior to receiving DRT funding.

Compton et al., 2008
Study type/level of evidence: Intervention study: non-randomized; Level V evidence.
Model or framework used: Building evaluation capacity through evaluation technical assistance (TA), and use of an ECB framework to analyze this work (Compton, Baizerman, & Stockdill, 2002).
Intervention/strategy: State tobacco control programs partnered with the Centers for Disease Control and Prevention to provide evaluation TA through: direct technical assistance; provision of evaluation-related publications, guides and training; supportive surveillance; data dissemination. The ECB framework used for analysis looked at: overall processes; actual practices; occupational orientation (practitioner role); site culture, structures, practices; core ECB themes; basic ECB concepts; relevant ECB knowledge; basic ECB skill competencies.
Target audience/setting and country: Tobacco control programs, government organizations; USA.
How ECB was evaluated: Two qualitative approaches were used to analyze 5 States involved in the ECB program: analysis of the transcripts of qualitative style telephone interviews; an ECB checklist with indicators (Compton et al., 2002).
Success factors: making quality evaluation routine through organizational practices, e.g. publications, support and tools, training; providing ongoing meaningful support with effective modes of delivery; infrastructure to sustain program evaluation and long-term surveillance activities; dedicated human and fiscal resources; using surveillance and evaluation data for program improvement and accountability; involving partners and stakeholders in ECB; linking evaluation goals to an ECB framework.
Challenges: inter-organizational relationships; staff turnover; varying levels of staff evaluation expertise; multiple responsibilities of staff; fluctuations in funds.

Dickinson & Adams, 2012
Study type/level of evidence: Intervention study: non-randomized; Level IV and V evidence.
Model or framework used: Theory-based approach.
Intervention/strategy: The ECB initiative constituted: a three-day evaluation workshop; a Planning and Program Logic 1 day workshop; a Building Your Evaluation 1 day workshop; and individual and organizational consultancy, mentoring, coaching and support.
Target audience/setting and country: Public and community health workers, government organizations; New Zealand.
How ECB was evaluated: Mixed qualitative and quantitative methods were used to evaluate the ECB training and support initiative, including: a post workshop feedback questionnaire; an online annual survey; face to face and telephone interviews with representatives from 8 health-based organizations that received the training. The evaluation incorporated a process and outcomes evaluation including: trainee reactions; trainee learning; trainee behavior; development of evaluative rubrics (criteria and performance standards) for each outcome of interest.
Success factors: evaluation learning is embedded in organizational philosophy and practice; having a number of capacity-building strategies in place; training followed by support alongside work practice; follow up training sessions; group work in training using examples of current evaluation projects; development of a logic model; using program logic, developing an evaluation plan and developing data collection tools; organizational support of a culture of reflection and evaluation; sufficient resources.
Barriers: lack of time; and lack of confidence to apply new knowledge to current practice.

Fleming & Easton, 2010
Study type/level of evidence: Intervention study: non-randomized; Level IV evidence.
Model or framework used: A web-based evaluation capacity building course.
Intervention/strategy: Through a university partnership a 12-week online course was delivered. It included development of a framework to enable evaluation of a program in the participant's workplace. Course content included: an overview of types and purposes of evaluation and development of evaluation questions; development of measurable program objectives, a logic model and an evaluation plan for their program; development of evaluation instruments; data analysis and reporting. An online forum, a Cyber Café, was added; this is a discussion board for sharing ideas, experiences and tools.
Target audience/setting and country: Environmental education, government and non-government organizations; USA.
How ECB was evaluated: The short and long term outcomes of the course were evaluated in 2 prior studies. Both are unpublished Masters theses. This publication discusses some of the findings. The information presented indicates that qualitative and quantitative methods were used, as it analyzed numbers of participants who started or completed an evaluation within 1 year post course and used results to improve their program.
Success factors: training that gives a basis for participants to evaluate in their workplace (measurable program objectives, a logic model and an evaluation plan and tools); training in data management and analysis; university training partnerships; an online forum to communicate with other evaluators after training; maintaining contact with educators after the course; an organizational culture that values, promotes and uses evaluation and supports professional development.
Challenges: institutional resistance to evaluation; lack of time to conduct evaluation; lack of student understanding of evaluation as a systematic process.

Fourney et al., 2011
Study type/level of evidence: Intervention study: non-randomized; Level IV and V evidence.
Model or framework used: Supporting theories of the ECB effort: utility-focused evaluation and empowerment evaluation.
Intervention/strategy: A multisite effort to build evaluation capacity into local programs. Strategies involved: development of written materials; training; meetings; looking at program involvement. The tools for practice that were developed are: workshops and one-on-one technical assistance; an impact evaluation handbook; a compendium of surveys; a final report template; a scope of work template; an evaluation plan template; data entry and analysis templates.
Target audience/setting and country: Nutrition education, government organizations; USA.
How ECB was evaluated: Mixed qualitative and quantitative methods were employed including: numbers completing and submitting a final report; analysis of final reports between 2004 and 2008 to ascertain success factors reached, including: sustained partner participation; increased rigor in evaluation design; ability to administer interventions and conduct evaluation activities that yielded significant results.
Success factors: making evaluation part of normal sustained activities; utility focused evaluation; linking program activities with outcomes; increased and sustained partner participation; building relationships between Network and local project staff; a multi-strategy approach; increasing rigor in evaluation design; provision of data entry and analysis templates and tools to manage their own data and statistical findings; introduction at a gradual pace (simple evaluation designs, step-by-step introduction of evaluation tools).

García-Iriarte et al., 2011
Study type/level of evidence: Intervention study: non-randomized; Level V evidence.
Model or framework used: A catalyst-for-change approach.
Intervention/strategy: Evaluation capacity is built through 1 key person in a position of leadership who then diffuses information to other staff in the organization. This involves: first assessing the organization's readiness to engage in ECB activities; university partners then support/teach and collaborate with the nominated coordinator throughout the phases of conducting an evaluation through: brainstorming meetings; training; technical assistance (e.g. data analysis, program logic development); coaching and mentoring; developing tools, i.e. a program logic; the key person (program coordinator) then develops a parallel process with staff.
Target audience/setting and country: Disabilities, community-based organizations, non-government university-driven study; USA.
How ECB was evaluated: Qualitative methods were used to describe a case study implementing the ECB catalyst approach, using: direct observation; document review; and entry and exit interviews with the program coordinator.
Success factors: organizational readiness to engage in ECB (conduct an assessment); organizational and coordinator understanding of needs, motivations and expectations for evaluation; a good understanding of programs to help plan and design evaluation; mainstreaming evaluation practices.
Challenges: partner collaboration throughout evaluation phases; a multifaceted approach including training, technical assistance, mentoring; developing a program logic; using and sharing evaluation results; obtaining buy-in through staff participation and awareness of need; identifying a person with leadership responsibilities to lead/champion; only 1 staff member required to engage in the initial stage of ECB.
Higa & Brandon, 2008
Study type and level of evidence: Intervention study: non-randomized; Level V evidence.
Model or framework used: A participatory evaluation capacity building model. A Vygotskian approach was used to judge the participatory ECB effort.
Intervention strategy or elements: A participatory ECB effort was applied to 2 programs which funded a variety of site-managed projects at 20 schools. The participatory ECB effort involved:
- Professional development workshops (2 x 3 hours) to teach selected evaluation activities to school staff
- A written evaluation guide was provided
- Individual consultation whilst evaluation tasks were being executed in their projects (writing project descriptions, evaluation design and methods, data collection) and ongoing support
Target audience, setting and country: School programs, government schools, USA.
How ECB was evaluated: Qualitative methods were employed using:
- A standardized interview with open-ended questions to assess staff opinion of their evaluation learning throughout the year
- These were examined and coded based on the Vygotskian approach (learning is a self-regulatory process of struggling with conflicts between current world views and new insights)
Success factors:
- Workshops positively influenced participants' learning, but small group and individual consultations were more effective for those with minimal prior knowledge
- A common understanding of socio-cultural values and meanings for evaluation tasks
- Pace and length of workshops, readability of materials, monitoring of progress, reminders about next steps and clarification of tasks
- Changes in participants' status or role influenced evaluation value
Kaye-Tzadok & Spiro, 2014
Study type and level of evidence: Intervention study: non-randomized; Level IV evidence.
Model or framework used: A course teaching evaluation knowledge and skills in an academic setting.
Intervention strategy or elements: Elements of the course include:
- It was run over a full year of the Masters of Social Work
- A central component was planning and/or carrying out an evaluation in their agency
- Semester 1: students produced working papers on aspects of their evaluation
- Semester 2: students presented an evaluation plan to the class. Feedback was provided by the class and tutor
Target audience, setting and country: Postgraduate social workers (not specified if government or non-government organizations), Israel.
How ECB was evaluated: Quantitative methods were employed using:
- A survey of participants 1 to 4 years after the course
- The survey included 21 multiple choice questions, with space also included for responders to provide examples
Success factors:
- Training of 1 person with leadership potential had a positive effect on their organization's evaluation capacity
- Organizational and management support for evaluation activities
- Involvement of all relevant stakeholders
Barriers:
- Lack of time to evaluate
Lennie, 2005
Study type and level of evidence: Intervention study: non-randomized; Level IV and V evidence.
Model or framework used: The LEARNERS (Learning, Evaluation, Action and Reflection for New Technologies, Empowerment and Rural Sustainability) capacity building process incorporating an ECB framework.
Intervention strategy or elements: Throughout the project trajectory the following were incorporated:
- Participatory action research (PAR) methodologies; and
- Participatory evaluation (PE) planning, designing and executing, achieved by:
  - Collaboration between researchers and diverse community organizations and groups using ongoing discussions, meetings, teleconferences, focus groups and feedback
  - An evaluation guide was published online and has 4 steps: plan the evaluation; involve people in the evaluation; do the evaluation; and review the results and make changes
  - A variety of other evaluation information and resources were provided in a resource kit
Target audience, setting and country: Communication and technology initiatives, multi-sector including government organizations, rural Australia.
How ECB was evaluated: Qualitative methods were used to evaluate ECB with feedback from participants obtained during:
- Focus groups
- Interviews; and
- Critical reflection workshops
A detailed version of the meta-evaluation of the capacity building project impacts was published separately (Lennie et al., 2004). The information presented indicates that quantitative methods were also used in workshop feedback questionnaires and questionnaires completed by 4 industry partners. A framework used in the analysis stage was based on 4 areas of empowerment: social, technological, psychological and political.
Strengths:
- Participatory evaluation increases empowerment and contributes to long-term sustainability and success of programs
- Developing learning communities
- Including stakeholder groups in the project and evaluation
- Collaborative planning
- Building on existing skills, knowledge and resources
- Utilization of evaluation results
Barriers:
- Time, energy and cost; lack of familiarity with processes; lack of access to technology; conflicting agendas of different stakeholder groups
Levine et al., 2013
Study type and level of evidence: Intervention study: non-randomized; Level IV and V evidence.
Model or framework used: A conceptual logic model was developed to evaluate ECB efforts of organizations.
Intervention strategy or elements: Three variables of success in research capacity building (outcome variables) were used to measure ECB:
- Research success
- Organizational culture change (re. research)
- Community impact
The evaluation capacity elements measured were:
- Funding
- Contracts-grants infrastructure
- Institutional policies and practices (general and project related)
- Technical research support
- Admin/secretarial support
- Accessibility to research facilities, equipment and staff
- Institutional support for grants
- Access to concrete support (staff labor, financial)
Target audience, setting and country: Health services research, academic institutions, federal government funded programs, USA.
How ECB was evaluated: Mixed qualitative and quantitative methods were used to evaluate 2 research capacity building programs using:
- A survey
- Telephone interviews
- Site visits
- Grant applications
- Annual reports and data files
Surveys and interviews provided both qualitative and quantitative data to measure the evaluation capacity elements.
Success factors:
- Partnership development
- Having supportive senior organizational administrators
- Various forms of mentoring or training
- Obtaining biostatistical data analyst support
- Developing database resources for use in multiple projects
- Technical support or access to research facilities
- Securing funding
- Having an experienced principal investigator
Barriers:
- Competing responsibilities
- Funding restrictions
- Technical problems with obtaining or using research data
- Project staff turnover
- Administration and bureaucracy issues
- Organizational culture issues
McDonald et al., 2003
Study type and level of evidence: Intervention study: non-randomized; Level V evidence.
Model or framework used: A 4-phase process including development of principles and a framework for evaluation.
Intervention strategy or elements: Four phases of ECB occurred over a 5-year period:
1) Addressing a specific need (18 months): specific information requirements improved evaluation capacity in 1 project. A set of principles and a framework for evaluation planning were developed. Principles specified: evaluation as a change agent; planning evaluation during project development; and evaluation as utilization-focused.
2) Experimenting with volunteer projects (18 months): ECB information gained was used to guide expansion.
3) Mandatory evaluation for all new projects (2 years): university consultants were engaged to provide a training program which was eventually delivered by program support staff and a professional facilitator.
4) Expansion and consolidation: a small team to maintain the evaluation capability of the Division.
Target audience, setting and country: Public sector organizations, based on Agricultural division case studies, Australia (Victoria).
How ECB was evaluated: Qualitative methods were used during phase 3 reflection and planning, using individual and group training data including daily debriefs and questionnaires at the end of training. Additional qualitative data were collected through internal evaluation including:
- Structured telephone interviews with participants from training programs
- Semi-structured interviews with 10 project managers
- Semi-structured interviews with 5 senior investors and 3 external experts
Feedback from project staff and managers was provided in the second phase.
Success factors:
- Stage, trial and grow evaluation; evaluate each stage
- Address the supply end of evaluation capability, e.g. training, support, developing training materials
- Address the demand end, e.g. policies, systems and practices, beliefs, values and attitudes, management support
- Mandating evaluation later in the process
- Manage ECB from within the organization
- Practical element to training using local examples
- Communicate the impact of projects
- Develop a common evaluation framework (including program logic)
- Evaluation focused on utility
- Encourage methodological diversity
- Use short early evaluation to inform implementation
- Reward through position change; they can become evaluation champions
- Engage consultants (i.e. university) early in the process to help develop curricula
Challenges:
- Increased demand for evaluation and requests for support from other divisions and agencies
McDuff, 2001
Study type and level of evidence: Intervention study: non-randomized; Level IV and V evidence.
Model or framework used: A comprehensive, layered training model incorporating principles of participatory evaluation (PE).
Intervention strategy or elements: The ECB was carried out through the following process:
- A needs assessment was conducted to inform evaluation training requirements
- A core group of trainers was trained (1-week workshop on PE)
- Those trainers facilitated further training to teacher-coordinators of regional activities for the wildlife clubs (1-week workshop)
- Those teachers delivered 1-2 day workshops regionally
- Comprehensive training based on the needs assessment included: demystifying evaluation (role play and examples of evaluation in everyday life); evaluation design, data collection tools and analysis (including designing their own tools for their area of work); field experience evaluating an existing program; conducting their own needs assessment; designing a monitoring and evaluation system for an ecology education program (using an evaluation matrix developed earlier in the workshop); and carrying out an evaluation on an ecology education program that was run (including analysis of data)
Target audience, setting and country: Environmental education programs, conservation organizations (non-profit organizations), Kenya.
How ECB was evaluated: Mixed qualitative and quantitative methods were used including:
- Participant observation, document review and semi-structured interviews with education staff for the needs assessment
- Pre and post testing after completion of training through semi-structured interview to measure changes in attitude and knowledge
- The same 10 questions were delivered via survey for shorter workshops
Success factors:
- Using a participatory evaluation approach
- Conduct needs assessments at the grassroots level to gain important information
- Adapt methods in evaluation to the local context
- Develop common indicators and data collection tools within workshops
- Involve multiple stakeholders
- Good communication between funding agency and local partners
- Comprehensive multilayered training with practical application
- Development of an action plan for program implementation during training
- Enhancing information management
- Incentive program to encourage improvement in aspects of evaluation
- Organizational learning: individuals act as learning agents for the organization
NuMan et al., 2007
Study type and level of evidence: Intervention study: non-randomized; Level IV and V evidence.
Model or framework used: A multilevel approach to ECB.
Intervention strategy or elements: Combines strategies at the individual, organizational and system level, integrating planning and monitoring using evaluation as the anchor. It involves 3 stages:
- Define and prioritize needs (contextual and organizational information involving direct service staff and leaders from the organization)
- Analyze and categorize needs: formal systems (infrastructure); resources (knowledge and skills); informal systems and organizational culture
- Develop and implement strategies (based on the needs assessment). Training focused on the use of evaluation activities to plan, monitor and improve prevention activities using: active participation; learning through practicing; and respect for diversity
Target audience, setting and country: HIV prevention, community-based organization, government funded effort, USA.
How ECB was evaluated: Mixed qualitative and quantitative methods were used to evaluate strategies used within the framework including:
- A survey using a Likert scale to assess perceptions of evaluation training and objective achievement
- Likert scale and open-ended questions used to assess individualized technical assistance and follow-up
Success factors:
- Training to increase participant and organizational knowledge and skills
- Supplement training with other strategies plus follow-up and long-term support for greatest effects
- Develop measurable capacity building indicators
- Fostering an internal demand or motivation for evaluation
- Explore more complex levels of building organizational capacity, i.e. health systems and community levels
Preskill & Boyle, 2008
Study type and level of evidence: Intervention study: non-randomized; Level V evidence.
Model or framework used: Leadership, culture, systems, structures and communication within an organization affect organizational learning capacity.
Intervention strategy or elements: As core elements of understanding successful evaluation capacity building efforts, the study explored the following with organizations:
- ECB commitment reasons. Findings include: accountability; funding; a commitment to learning from evaluation; improving external communication regarding programs; and improving evaluation skills broadly
- Strategies used. Findings include: involvement in the evaluation process; providing training, written materials and technical assistance; developing communities of practice; using appreciative inquiry; technology; internships; mentoring; and meetings
- Perceived outcomes. Findings include a comprehensive list of knowledge, skills and attitudes reached by participants
Target audience, setting and country: Organizations, many from the education sector, USA.
How ECB was evaluated: Qualitative methods were used to evaluate the ECB activities and processes in organizations which had been implemented within the previous year (their motivations, strategies used, influences on learning and advice) using:
- Semi-structured interviews with evaluators and clients involved in ECB efforts
- Thematic analysis based on the ECB literature and the ECB model developed by Preskill et al. (2008)
Success factors:
- Creating interest, motivation and buy-in
- Understanding the organization (evaluation capacity assessment)
- Planning for and designing the ECB effort
- Viewing ECB as a learning process
- Securing sufficient resources
- Building strong relationships
- Understanding that ECB takes time
- Using participatory approaches
- Using an assets and strengths-based approach
- Identifying champions
Satterlund et al., 2013
Study type and level of evidence: Intervention study: non-randomized; Level IV and V evidence.
Model or framework used: A specialist evaluation center to offer tobacco-specific technical assistance to local programs.
Intervention strategy or elements: Incorporates a utilization-focused, blended approach to ECB activities and techniques. They include:
- Individualized technical assistance (TA)
- Print materials (how-to tips, newsletters, sample reports, how-to guides and evaluation tools)
- Interactive training workshops and webinars
- Training and webinars regarding final evaluation report writing and provision of how-to guides
- Capturing data from a number of sources to inform and build on their ECB efforts
- Comparison and assessment of scores from final evaluation reports from local projects
Target audience, setting and country: Tobacco control programs, local government organizations, USA.
How ECB was evaluated: Mixed qualitative and quantitative methods were used for the ongoing assessment of ECB efforts. Data were captured from the following sources:
- Biennial needs assessment and satisfaction surveys of all funded projects
- Technical assistance logs and satisfaction surveys of TA
- Satisfaction surveys from training
- Final report scores, evaluated and compared over 2 funding periods to directly assess introduction of training on organizing and writing a final evaluation report (including how to analyze data)
Success factors:
- TA services were ranked highly and well used, but available services must be well communicated
- Final report outcomes demonstrate learning and achievement of standards
Challenges:
- Low ratios of evaluation practitioners to local project staff
- Local staff recognition and use of services
- A lack of a financial incentive as a reward for high standards in performing and documenting projects
- Staff turnover
Sundar et al., 2011
Study type and level of evidence: Intervention study: non-randomized; Level V evidence.
Model or framework used: Evaluation training through use of interactive technology and computer-based learning.
Intervention strategy or elements: Evaluation training was offered by the Centre of Excellence for Child and Youth Mental Health. It used blended learning including:
- A consultation service for support; and
- Web-based tools which included: online learning modules (OLM) on planning, doing and using evaluation (approximate time to complete is 3 hours); and a series of webinars
Target audience, setting and country: Child and youth mental health services, government organizations, Canada.
How ECB was evaluated: Open and closed-ended questions were used in the following to evaluate the implementation of ECB training:
- A satisfaction survey and a focus-group interview in a pilot of the OLMs
- A feedback survey of webinars using open and closed-ended questions
Success factors:
- Using a blended learning approach including Web-based tools for increased reach
- Online learning is active, sustainable and easily accessible, contributing to a culture of organizational learning
- The webinars added to connectedness with other colleagues
Barriers (related to Web-based technology):
- Discomfort using the platform related to lack of computer literacy; and
- Technical problems when attempting to use the resources
Treiber et al., 2011
Study type and level of evidence: Intervention study: non-randomized; Level IV and V evidence.
Model or framework used: Provision of technical assistance (TA) through an academic center.
Intervention strategy or elements: Provision of evaluation technical assistance through a university-based center that supports local project staff in all aspects of programs, including evaluation planning and collecting and analyzing data. Support includes:
- Individual technical assistance
- Training
- Site visits
- Regular webinars
- How-to guides
- Training modules and publications
- The center scores and rates final evaluation reports with a set criteria
Target audience, setting and country: Tobacco control government-funded programs, local health departments and community-based organizations, USA.
How ECB was evaluated: Mixed qualitative and quantitative methods to conduct a needs assessment and implementation and outcome evaluation of the ECB strategy through:
- TA service needs and satisfaction surveys
- Analysis of TA assistance request logs
- Satisfaction surveys for training in report writing
- Analysis of changes in final evaluation report scores over time (3 years)
Success factors:
- The academic and local program partnership contributed rigor to evaluations while still having the contextual knowledge required
- Evaluation thinking remains current and is incorporated into training
- Training in the area of report writing was needed and important
- Utilization-focused evaluation
- Data analysis skills were needed and important
Volkov, 2008
Study type and level of evidence: Intervention study: non-randomized; Level V evidence.
Model or framework used: A model of organizational features that assist evaluation capacity building.
Intervention strategy or elements: Organizational ability to build evaluation capacity is driven by:
- Leadership commitment to ECB
- Clear value and use of evaluation
- Integration of evaluation into work culture (buy-in from staff, treated as a learning mechanism not a control mechanism)
- Ensuring the ECB strategy is a philosophy of the whole organization
- Ongoing staff training in evaluation
- Resources for evaluation and ECB
- Setting up a functional monitoring system
- Having internal evaluation staff
Target audience, setting and country: Non-profit organizations, USA.
How ECB was evaluated: Qualitative methods were used to evaluate factors that affect the ECB process in an organization with ongoing program evaluation including:
- In-depth interviews with ECB participants; and
- Review of relevant ECB documents
The main findings of this study are the model which was produced.
Success factors:
- Address all of the core areas presented in the model
- Management need to determine priorities for evaluation
- Collaboration between program team and external evaluators
- Enlisting external consultants
- Having an organizational evaluation philosophy
- Having a lead evaluator in the organization
- Learning evaluation by doing
Barriers:
- Staff resistance, e.g. due to increased workload
- Insufficient leadership and staff motivation
- Lack of long-term funding
- Poor rapport between internal evaluation personnel and foundation leaders
- Lack of a purposeful long-term ECB plan
Appendix B.

Table B1
Synthesis of success factors and barriers to evaluation capacity building.

Success factors and lessons learned (bibliographic references):
- Training and professional development as an element of ECB (Dickinson & Adams, 2012; Cooke et al., 2008; Fleming & Easton, 2010; Fourney et al., 2011; García-Iriarte et al., 2011; Higa & Brandon, 2008; Kaye-Tzadok & Spiro, 2014; Levine et al., 2013; McDonald et al., 2003; McDuff, 2001; NuMan et al., 2007; Preskill & Boyle, 2008; Satterlund et al., 2013; Compton et al., 2008; Sundar et al., 2011; Treiber et al., 2011; Volkov, 2008)
- Support and technical assistance (Akintobi et al., 2012; Bamberg et al., 2010; Campbell & Longo, 2002; Compton et al., 2008; Dickinson & Adams, 2012; Fourney et al., 2011; Higa & Brandon, 2008; McDonald et al., 2003; Preskill & Boyle, 2008; Treiber et al., 2011; Fleming & Easton, 2010; Levine et al., 2013; Volkov, 2008)
- Timely use of findings to shift practice (Akintobi et al., 2012; Fleming & Easton, 2010; Lennie, 2005; Sundar et al., 2011; Compton et al., 2008; Fourney et al., 2011; NuMan et al., 2007; Satterlund et al., 2013; Treiber et al., 2011; García-Iriarte et al., 2011; McDonald et al., 2003; Volkov, 2008)
- Participatory approaches to evaluation (Preskill & Boyle, 2008; Bamberg et al., 2010; Fourney et al., 2011; Satterlund et al., 2013; Akintobi et al., 2012; McDuff, 2001; García-Iriarte et al., 2011; Higa & Brandon, 2008; Kaye-Tzadok & Spiro, 2014; Sundar et al., 2011; Volkov, 2008; Lennie, 2005)
- Linking training with practical application (Akintobi et al., 2012; Campbell & Longo, 2002; Preskill & Boyle, 2008; NuMan et al., 2007; Kaye-Tzadok & Spiro, 2014; Dickinson & Adams, 2012; Fleming & Easton, 2010; McDonald et al., 2003; McDuff, 2001; Satterlund et al., 2013; Sundar et al., 2011; Volkov, 2008)
- Systems that enable program monitoring and information management (Treiber et al., 2011; Compton et al., 2008; Fourney et al., 2011; Fleming & Easton, 2010; Levine et al., 2013; McDuff, 2001; McDonald et al., 2003; Volkov, 2008; Satterlund et al., 2013)
- Access to support tools and resources (Akintobi et al., 2012; Bamberg et al., 2010; Fourney et al., 2011; McDonald et al., 2003; Satterlund et al., 2013; Sundar et al., 2011; Treiber et al., 2011; Higa & Brandon, 2008; Compton et al., 2008; García-Iriarte et al., 2011; Volkov, 2008)
- Partnerships between evaluators, commissioners and key stakeholders (Akintobi et al., 2012; NuMan et al., 2007; Treiber et al., 2011; Compton et al., 2008; Lennie, 2005; Levine et al., 2013; Fleming & Easton, 2010; McDonald et al., 2003; McDuff, 2001; Preskill & Boyle, 2008; Volkov, 2008)
- Organizational culture that values, promotes and uses evaluation (Cooke et al., 2008; Compton et al., 2008; Dickinson & Adams, 2012; Fleming & Easton, 2010; McDonald et al., 2003; Preskill & Boyle, 2008; Sundar et al., 2011; Volkov, 2008)
- Embedding evaluation into routine practice (Compton et al., 2008; Fourney et al., 2011; Kaye-Tzadok & Spiro, 2014; Sundar et al., 2011; Dickinson & Adams, 2012; García-Iriarte et al., 2011; Volkov, 2008)
- Connecting, networking and sharing ideas (Campbell & Longo, 2002; Cooke et al., 2008; Lennie, 2005; Fleming & Easton, 2010; Preskill & Boyle, 2008; Sundar et al., 2011)
- Leadership support and recognition (Bamberg et al., 2010; Kaye-Tzadok & Spiro, 2014; Levine et al., 2013; McDonald et al., 2003; NuMan et al., 2007; Volkov, 2008)
- Tailoring the ECB strategy to the contextual background (García-Iriarte et al., 2011; Lennie, 2005; McDuff, 2001; NuMan et al., 2007; Preskill & Boyle, 2008; Satterlund et al., 2013; Treiber et al., 2011)
- Assessing organizational needs and capacities (McDuff, 2001; NuMan et al., 2007; Treiber et al., 2011; García-Iriarte et al., 2011; Satterlund et al., 2013)
- Ensuring training covers key elements of evaluation practice including logic models, planning methods and reporting (Akintobi et al., 2012; Fleming & Easton, 2010; Dickinson & Adams, 2012; McDuff, 2001; NuMan et al., 2007; Sundar et al., 2011)
- Having key people as change agents (García-Iriarte et al., 2011; Kaye-Tzadok & Spiro, 2014; McDuff, 2001; McDonald et al., 2003; Preskill & Boyle, 2008; Volkov, 2008)
- Accessing evaluation expertise (Cooke et al., 2008; Levine et al., 2013; Volkov, 2008)
- Applying an ECB framework (Compton et al., 2008; McDonald et al., 2003; NuMan et al., 2007; Preskill & Boyle, 2008)
- Viewing ECB as an incremental learning process (Fourney et al., 2011; McDonald et al., 2003; NuMan et al., 2007; Preskill & Boyle, 2008)
- Reward and recognition of participation in evaluation (Bamberg et al., 2010; Higa & Brandon, 2008; McDonald et al., 2003; McDuff, 2001)
- Multifaceted approach to evaluation capacity building (García-Iriarte et al., 2011; NuMan et al., 2007; Satterlund et al., 2013; Sundar et al., 2011)
- Securing funds (NuMan et al., 2007; Preskill & Boyle, 2008; Volkov, 2008)
Barriers and challenges (bibliographic references):
- Lack of time to evaluate (Akintobi et al., 2012; Bamberg et al., 2010; Campbell & Longo, 2002; Dickinson & Adams, 2012; Fleming & Easton, 2010; Kaye-Tzadok & Spiro, 2014; Lennie, 2005; Levine et al., 2013; Compton et al., 2008; Volkov, 2008)
- Lack of dedicated financial support or support generally (Campbell & Longo, 2002; Compton et al., 2008; Lennie, 2005; Levine et al., 2013; Volkov, 2008)
- Lack of research and evaluation infrastructure (Akintobi et al., 2012; Campbell & Longo, 2002; Lennie, 2005; Levine et al., 2013)
- Staff turnover (Akintobi et al., 2012; Compton et al., 2008; Levine et al., 2013; Satterlund et al., 2013)
- Conflicting agendas of different stakeholder groups (Lennie, 2005; Compton et al., 2008; Volkov, 2008)
- Inability to apply new knowledge to practice (Bamberg et al., 2010; Dickinson & Adams, 2012)
- Institutional resistance to evaluation (Fleming & Easton, 2010; Levine et al., 2013; Volkov, 2008)
- Varying levels of staff evaluation expertise (Compton et al., 2008; Satterlund et al., 2013)
- Lack of staff involvement or buy-in (Akintobi et al., 2012)
- A lack of incentives and rewards for evaluation (Satterlund et al., 2013)
- Lack of a purposeful long-term ECB plan (Volkov, 2008)
- Recognition and use of services by local staff (Satterlund et al., 2013)
- Increased demand for evaluation and requests for support from other divisions and agencies once evaluation capacity improved (McDonald et al., 2003)
References

Adams, J., & Dickinson, P. (2010). Evaluation training to build capability in the community and public health workforce. American Journal of Evaluation.
Akintobi, T. H., Yancey, E. M., Daniels, P., Mayberry, R. M., Jacobs, D., & Berry, J. (2012). Using evaluability assessment and evaluation capacity-building to strengthen community-based prevention initiatives. Journal of Health Care for the Poor and Underserved, 23, 33.
Bamberg, J., Perlesz, A., McKenzie, P., & Read, S. (2010). Utilising implementation science in building research and evaluation capacity in community health. Australian Journal of Primary Health, 16, 276–283.
Brazil, K. (1999). A framework for developing evaluation capacity in health care settings. Leadership in Health Services, 12, 6–11.
Campbell, J. D., & Longo, D. R. (2002). Building research capacity in family medicine: evaluation of the Grant Generating Project. Journal of Family Practice, 51, 593.
Carman, J. G. (2013). Evaluation in an era of accountability: unexpected opportunities. A reply to Jill Chouinard. American Journal of Evaluation, 34, 261–265.
Centre for Evidence-Based Medicine (2009). Oxford Centre for Evidence-Based Medicine: levels of evidence. Centre for Evidence-Based Medicine. http://www.cebm.net/oxford-centre-evidence-based-medicine-levels-evidence-march-2009/.
Compton, D., Baizerman, M., & Stockdill, S. (2002). The art, craft, and science of evaluation capacity building. New Directions for Evaluation, 93, 1–120.
Compton, D. W., MacDonald, G., Baizerman, M., Schooley, M., & Zhang, L. (2008). Using evaluation capacity building (ECB) to interpret evaluation strategy and practice in the United States National Tobacco Control Program (NTCP): a preliminary study. Canadian Journal of Program Evaluation, 23, 199–224.
Connolly, P., & York, P. (2002). Evaluating capacity-building efforts for nonprofit organizations. Paul Connolly and Peter York. http://smartong.webs.com/documents/10.1.1.197.7376.pdf.
Cooke, J., Nancarrow, S., Dyas, J., & Williams, M. (2008). An evaluation of the Designated Research Team approach to building research capacity in primary care. BMC Family Practice, 9, 1–12.
Cooke, J. (2005). A framework to evaluate research capacity building in health care. BMC Family Practice, 6, 44.
Cousins, B., Goh, S., Clark, S., & Lee, L. (2004). Integrating evaluative inquiry into the organizational culture: a review and synthesis of the knowledge base. Canadian Journal of Program Evaluation, 19(2), 99–141.
Danseco, E. (2013). The 5Cs for innovating in evaluation capacity building: lessons from the field. The Canadian Journal of Program Evaluation, 28, 107–118.
Dickinson, P., & Adams, J. (2012). Building evaluation capability in the public health workforce: are evaluation training workshops effective and what else is needed?
Fleischer, D. N., Christie, C. A., & LaVelle, K. B. (2008). Perceptions of evaluation capacity building in the United States: a descriptive study of American Evaluation Association members. Canadian Journal of Program Evaluation, 23, 37–60.
Fleming, M. L., & Easton, J. (2010). Building environmental educators' evaluation capacity through distance education. Evaluation and Program Planning, 33, 172–177.
Fourney, A., Gregson, J., Sugerman, S., & Bellow, A. (2011). Building evaluation capacity in local programs for multisite nutrition education interventions. Journal of Nutrition Education and Behavior, 43, S130–S136.
García-Iriarte, E., Suarez-Balcazar, Y., Taylor-Ritzler, T., & Luna, M. (2011). A catalyst-for-change approach to evaluation capacity building. American Journal of Evaluation, 32, 168–182.
Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., Kyriakidou, O., & Peacock, R. (2005). Storylines of research in diffusion of innovation: a meta-narrative approach to systematic review. Social Science & Medicine, 61, 417–430.
Heider, C. (2011). Conceptual framework for developing evaluation capacities: building a good practice in evaluation and capacity development. Washington, DC: World Bank. http://siteresources.worldbank.org/INTLACREGTOPPOVANA/Resources/Paper_Caroline_Heider_Evaluation_Capacity_Development_Advance.pdf Accessed 03.11.12.
Higa, T. A. F., & Brandon, P. R. (2008). Participatory evaluation as seen in a Vygotskian framework. The Canadian Journal of Program Evaluation, 23, 103–125.
Kaye-Tzadok, A., & Spiro, S. E. (2014). Evaluation capacity building: can a classroom-based course make a difference? Research on Social Work Practice, 1049731514559521.
Kolb, A. K. D. (2012). Experiential learning theory. In N. Seel (Ed.), Encyclopedia of the sciences of learning (pp. 1215–1219). Springer.
Labin, S. N., Duffy, J. L., Meyers, D. C., Wandersman, A., & Lesesne, C. A. (2012). A research synthesis of the evaluation capacity building literature. American Journal of Evaluation, 33, 307–338.
Lennie, J., Hearn, G. N., Simpson, L. E., Kennedy, D. S., Kimber, M. P., & Hanrahan, M. U. (2004). Building community capacity in evaluating IT projects: outcomes of the LEARNERS project. Queensland University of Technology.
Lennie, J. (2005). An evaluation capacity-building process for sustainable community IT initiatives: empowering and disempowering impacts. Evaluation, 11, 390–414.
Levine, R., Russ-Eft, D., Burling, A., Stephens, J., & Downey, J. (2013). Evaluating health services research capacity building programs: implications for health services and human resource development. Evaluation and Program Planning, 37, 1–11.
McDonald, B., Rogers, P., & Kefford, B. (2003). Teaching people to fish? Building the evaluation capability of public sector organizations. Evaluation, 9, 9–29.
McDuff, M. D. (2001). Building the capacity of grassroots conservation organizations to conduct participatory evaluation. Environmental Management, 27, 715–727.
McGeary, J. (2009). A critique of using the Delphi technique for assessing evaluation capability-building needs. Evaluation Journal of Australasia, 9, 31–39.
NSW Department of Premiers and Cabinet (2013). NSW Government Evaluation Framework August 2013. Sydney: NSW Department of Premiers and Cabinet.
NSW Department of Premiers and Cabinet (2016). NSW Government Evaluation Framework January 2016. Sydney: NSW Department of Premiers and Cabinet.
Naccarella, L., Pirkis, J., Kohn, F., Morley, B., Burgess, P., & Blashki, G. (2007). Building evaluation capacity: definitional and practical implications from an Australian case study. Evaluation and Program Planning, 30, 231–236.
National Health and Medical Research Council (2000). How to use the evidence: assessment and application of scientific evidence. NHMRC, 84.
Nielsen, S. B., Sebastian, L., & Majbritt, S. (2011). Measuring evaluation capacity: results and implications of a Danish study. American Journal of Evaluation, 32(3), 324–344.
NuMan, J., King, W., Bhalakia, A., & Criss, S. (2007). A framework for building organizational capacity integrating planning, monitoring, and evaluation. Journal of Public Health Management and Practice, 13, S24–S32.
Preskill, H., & Boyle, S. (2008). A multidisciplinary model of evaluation capacity building. American Journal of Evaluation, 29(4), 443–459.
Preskill, H., & Boyle, S. (2008a). Insights into evaluation capacity building: motivations, strategies, outcomes, and lessons learned. Canadian Journal of Program Evaluation, 23, 147–174.
Roe, K., & Roe, K. (2004). Dialogue boxes: a tool for collaborative process evaluation. Health Promotion Practice, 5, 138–150.
Satterlund, T. D., Treiber, J., Kipke, R., Kwon, N., & Cassady, D. (2013). Accommodating diverse clients' needs in evaluation capacity building: a case study of the Tobacco Control Evaluation Center. Evaluation and Program Planning, 36, 49–55.
Suárez-Herrera, J. C., Springett, J., & Kagan, C. (2009). Critical connections between participatory evaluation, organizational learning and intentional change in pluralistic organizations. Evaluation, 15, 321–342.
Suarez-Balcazar, Y., & Taylor-Ritzler, T. (2013). Moving from science to practice in evaluation capacity building. American Journal of Evaluation, 1098214013499440.
Sundar, P., Kasprzak, S., Halsall, T., & Woltman, H. (2011). Using web-based technologies to increase evaluation capacity in organizations providing child and youth mental health services. Canadian Journal of Program Evaluation, 25, 91–112.
Treiber, J., Cassady, D., Kipke, R., Kwon, N., & Satterlund, T. (2011). Building the evaluation capacity of California's local tobacco control programs. Health Promotion Practice, 12, 118S–124S.
Volkov, B. (2008). A bumpy journey to evaluation capacity: a case study of evaluation capacity building in a private foundation. Canadian Journal of Program Evaluation, 23, 175–197.
Wong, G., Greenhalgh, T., Westhorp, G., Buckingham, J., & Pawson, R. (2013). RAMESES publication standards: meta-narrative reviews. BMC Medicine, 11.