Value of Training Evaluation
Contents

Foreword
Executive Summary
Introduction
References
Introduction
Figure 1 | Please state the degree to which you agree with the following statements on evaluation practices (percentage that agree or strongly agree)
Our learning evaluation techniques help us meet our organization’s learning goals — 41.3%
Responses to these three items were combined to create an Evaluation Success Index (ESI), which was used to correlate against the specific evaluation practices. Readers will see this index frequently cited in this study, along with the Market Performance Index (MPI). The MPI was determined by asking a series of questions about respondent organizations’ overall market performance over the past five years. The results of these indices were mapped to other survey responses.

Describing Learning Evaluation Methods

While there are various ways to evaluate the effectiveness of learning within an organization, most methods tend to fall somewhere into the range of evaluation techniques making up the Kirkpatrick/Phillips model. That model consists of five levels of evaluation and was developed by Donald L. Kirkpatrick, professor emeritus at the University of Wisconsin and author of Evaluating Training Programs: The Four Levels, who developed Levels 1 through 4, and Jack Phillips, author of The Human Resources Scorecard, who developed Level 5. The first four levels measure (1) the reaction of participants, (2) the level of learning achieved, (3) changes in learner behavior, and (4) business results derived from training. Level 5 measures the return on investment (ROI) that learning programs deliver.

Another evaluation technique that has gained headway in recent years is the Brinkerhoff Success Case Method. That method, developed by Robert O. Brinkerhoff, professor of education at Western Michigan University and author of several human resource development books including The Success Case Method, entails identifying successful learners and interviewing them to find out what made the learning experience work for them and what they achieved as a result. It can also be applied to unsuccessful candidates to determine what went wrong. These case studies can then be used to communicate the value of learning throughout the organization, identify what works and what doesn’t, and help improve learning programs.

When study respondents were asked about other types of learning evaluation techniques, their answers included a variety of techniques, including:

• Action learning projects
• ADDIE (assess, design, develop, implement, evaluate) model
• Alignment measures
• Balanced scorecards
• Benchmarking, often against local metrics, ASTD metrics, or other industry standards
• Bloom’s taxonomy
• Cost avoidance
• Customer/client assessments
• Employee engagement
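The construction of a composite index like the ESI is simple to sketch. The snippet below is illustrative only: the study does not publish its exact scoring formula, and the item names and sample respondent are invented. It assumes the ESI is the mean of the three 1–5 Likert agreement items described above.

```python
# Hypothetical sketch: an Evaluation Success Index (ESI) style score built
# by averaging a respondent's three 1-5 Likert items. Item names and the
# sample respondent are invented for illustration.
LIKERT_ITEMS = ["bang_for_buck", "meets_learning_goals", "meets_business_goals"]

def evaluation_success_index(ratings):
    """Mean of the three Likert items (1 = strongly disagree, 5 = strongly agree)."""
    return sum(ratings[item] for item in LIKERT_ITEMS) / len(LIKERT_ITEMS)

respondent = {"bang_for_buck": 3, "meets_learning_goals": 4, "meets_business_goals": 4}
print(round(evaluation_success_index(respondent), 2))  # 3.67
```

A per-respondent score like this is what makes the correlation and ANOVA analyses reported later in the study possible.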
Section I | What Evaluation Techniques Are Companies Using?
Responses | Average percentage | Correlation with Evaluation Success Index | Correlation with Market Performance Index
Reactions of participants 91.6%
Evaluation of learning 80.8% ** **
Evaluation of behavior 54.6% ** *
Evaluation of results 36.9% ** *
Return on investment 17.9% **
*ANOVA analysis results show that the mean responses are significantly different at the p<.05 level.
**ANOVA analysis results show that the mean responses are significantly different at the p<.01 level. In each case,
the mean for those who answered “yes, we use this level of evaluation” was higher than for those who answered
“no, we don’t.”
Figure 9 | If your organization uses Level 3 learning metrics (that is, the
evaluation of behaviors), when does it regularly take measurements?
Responses | Percentage who use these approaches to a high or very high extent | Correlation with Evaluation Success Index
Follow-up surveys of participants 31.0% 0.21**
Action planning 26.9% 0.21**
Performance records monitoring 24.3% 0.19**
Observation on the job 23.9% 0.17**
Program follow-up session 18.5% 0.25**
Follow-up surveys of participants’ supervisors 16.5% 0.17**
Interviews with participants 15.8% 0.14*
Interviews with participants’ supervisors 15.2% 0.17**
Follow-up focus groups 9.1% 0.23**
* Correlation is significant at the 0.05 level (2-tailed)
** Correlation is significant at the 0.01 level (2-tailed)
Responses | Not at all | Small extent | Moderate extent | High extent | Very high extent | Correlation with Evaluation Success Index | Correlation with Market Performance Index
For setting goals with employees prior to training | 26.7% | 33.7% | 22.3% | 13.3% | 4.1% | .32** |
…evaluation data is not standardized enough to compare well across functions (r=-.23**). In other words, the more a respondent says that his or her evaluation data can’t be compared easily across functions, the less likely that respondent is to report organizational success with overall evaluation efforts.

It may be that the most money is being spent on the lower levels simply because they are done the most. Of course, as we’ve noted elsewhere in this study, Level 4 and Level 5 evaluations are done much less frequently and for a smaller range of programs than lower levels.
…business results, and provide the best bang for the buck. But organizations are having a hard time determining if their programs do these things, and that makes it hard to improve the learning function. There are strategies available to achieve this goal, but the information has to be collected properly and used effectively.

The type of evaluation that is done is greatly influenced by the audience for the data. Instructors are mostly focused on employees’ reactions and the amount of learning that takes place, which allow them to critique and improve their programs. Yet business leaders are not as interested in that information. They want to know how performance, productivity, and other business metrics have improved as a result of the training (Berge, 2008). With so much focus on Levels 1 and 2, it’s not surprising that many leaders do not buy into conventional learning evaluation.

To obtain valuable learning metrics, one main question has to be answered: What is the purpose of the training? The answer to that question may not be as simple as “to improve performance.” Training can be used for any number of additional reasons, whether to change behaviors, to increase retention, or for motivation (Rands, 2007). Whatever the reason, it becomes the guide by which the learning is measured.

There should be clear objectives and goals to be measured from the outset of the training program. Trying to identify things to measure after the fact can be difficult and not entirely effective. If evaluating at Level 3, for example, the behaviors that are to change need to be identified and measured before and after the training. The business results identified in Level 4 need to be measured prior to learning to provide a baseline for comparison after the program is completed.

Recommended Actions

This study has resulted in excellent insights into the current state of learning evaluation. But, having gone through the findings, what are the main takeaways that learning professionals can quickly act on? Every organization’s situation will be different, of course, and not every practice will work equally well in all of them. Given those caveats, below are our suggestions for those intent on improving learning evaluations:

• Collect data that is meaningful to leaders. Otherwise, they will never see the value of evaluations.
• Where possible, standardize evaluation data across different functions within the organization to make it easier to use the data effectively.
• Spend more time and money evaluating behaviors and results and less on participant reactions. At the very least, don’t rely on reactions so completely.
• Give supervisors more responsibility when it comes to learning evaluation training. They should give employees opportunities to use their training, and they should help track performance both prior to and after the training.
• When evaluating changes in behavior, use strategies such as follow-up sessions, focus groups, and participant surveys. Used with action planning and performance monitoring, these strategies are the most highly correlated with evaluation success.
References

Anderson, C. (2009, May). “Overcoming Analysis Paralysis.” Chief Learning Officer, 54-56.

ASTD (American Society for Training & Development). (2005, November). ASTD Benchmarking Forum 2005 Learning Evaluation Practices Report. Alexandria, VA.

Berge, Z. (2008). “Why It Is So Hard to Evaluate Training in the Workplace.” Industrial and Commercial Training, 390-395.

Brinkerhoff, R. (2006). “Increasing Impact of Training Investments: An Evaluation Strategy for Building Organizational Learning Capability.” Industrial and Commercial Training, 302-307.

———. (2003). The Success Case Method. San Francisco: Berrett-Koehler.

———. (2005, February). “The Success Case Method: A Strategic Evaluation Approach to Increasing the Value and Effect of Training.” Advances in Developing Human Resources, 86-101.

Collins, M., and Hill, S. (2008, May). “Trends in European Human Capital Measurement.” workspan, 64-70.

Cunningham, I. (2007). “Sorting Out Evaluation of Learning and Development: Making It Easier for Ourselves.” Development and Learning in Organizations, 4-6.

Naughton, J. (2008, September). “IOL: Determining the Impact of Learning.” Chief Learning Officer, 32-36.

Phillips, J., P. Pulliam Phillips, and R. Stone. (2001). The Human Resources Scorecard. Boston: Butterworth-Heinemann.

Phillips, J., P. Pulliam, and W. Wurtz. (1998, May). “Level 5 Evaluation: Mastering ROI.” Infoline.

Platt, G. (2008, August). “The Hard Facts About Soft Skills Measurement.” Training Journal, 53-56.

Radhakrishnan, M. (2008, October). “Learning Measurements: It’s Time to Align.” Chief Learning Officer, 36-39.

Rands, A. (2007, February). “Extending the Half-Life of Training.” Training Journal, 40-43.

Redford, K. (2007, June). “What’s the Point of ROI?” Training & Coaching Today, 12-13.

Todd, S. (2008, September). “Mission Accomplished? Measuring Success of Corporate Universities.” Chief Learning Officer, 38-40.

Wilhelm, W. (2007, December). “Does Senior Leadership Buy ROI for Learning?” Chief Learning Officer, 86-88.
| Appendix |
The Value of Evaluation Survey Overview
Survey Process
Target Survey Population
The target survey population of the ASTD/i4cp Value of Evaluation Survey consisted of an email list of primarily high-
level business, HR, and learning professional contacts from ASTD and i4cp. In total, 704 people responded to the survey.
Respondents represented a variety of organizational sizes and industries.
Survey Instrument
In this survey, multiple questions used the well-accepted 1–5 Likert-type scale, with a 1 rating generally designated as “not
at all” and a 5 rating as “a very high extent.” Additionally, several questions asked respondents to assign percentages (for
example, the percentage of learning programs evaluated at each level) in their organizations. There were 38 questions in all,
including those geared toward the demographics of respondents.
Some questions were open-text types, and that data does not appear below. Moreover, survey data from multiple questions
was combined in certain tables. In a few cases, low-priority data is not shown at all in this appendix.
Procedure
A link to an online survey was emailed to the target population during May 2009.
When compared to the past five years, how would you rate your company’s performance now?
Responses | Don’t know/Not applicable | All-time low | Worse | Same | Better | All-time high
Revenue growth | 10.3% | 4.2% | 28.6% | 20.5% | 30.8% | 5.6%
Market share | 13.4% | 0.4% | 11.6% | 34.1% | 36.5% | 4.0%
Profitability | 13.5% | 2.9% | 26.5% | 25.0% | 28.7% | 3.4%
Customer satisfaction | 7.3% | 0.4% | 4.5% | 38.8% | 41.5% | 7.5%
Please state the degree to which you agree with the following statements.
Responses | Not applicable | Strongly disagree | Disagree | Neutral | Agree | Strongly agree
We get a solid “bang for our buck” when it comes to using learning metrics. | 8.4% | 6.8% | 25.6% | 33.7% | 22.0% | 3.6%
Our learning evaluation techniques help us meet our organization’s learning goals. | 5.0% | 5.4% | 24.4% | 23.9% | 34.9% | 6.4%
Our learning evaluation techniques help us meet our business goals. | 5.4% | 7.4% | 24.0% | 26.7% | 30.8% | 5.7%
Please select all of the Kirkpatrick/Phillips levels that you use to any extent in your organization.
Percentage who use the corresponding level to any extent
Responses | Small organizations | Midsize organizations | Large organizations | All organizations
Reactions of participants (Level 1) 88.9% 89.7% 97.8% 91.6%
Evaluation of learning (Level 2) 72.6% 79.0% 90.4% 80.8%
Evaluation of behavior (Level 3) 54.7% 50.6% 63.5% 54.6%
Evaluation of results (Level 4) 40.2% 31.8% 46.6% 36.9%
Return on investment (Level 5) 26.5% 13.4% 22.5% 17.9%
None of the above 5.1% 5.1% 1.1% 4.1%
Q10: Do you use other forms of learning program evaluation aside from the five levels described in
the Kirkpatrick/Phillips model?
The Kirkpatrick/Phillips model is clearly the preferred model for learning evaluation, with 82.7 percent of respondents
reporting that they use this model exclusively. However, this data should be interpreted cautiously because responses to
subsequent questions indicate there may be greater diversity of evaluation techniques than is apparent from this question.
In particular, the results from Q12 indicate nearly half of the respondents conduct evaluation studies, so there appears to
be some confusion about what is technically included in the K/P model.
Do you use other forms of learning program evaluation aside from the five levels
described in the Kirkpatrick/Phillips model?
Responses Percentage
Yes 17.3%
No 82.7%
In instances where you use it, how much value do you think each of the following types or levels of
evaluation has for your organization?
Q12: Has your organization ever conducted an evaluation study (for example, interviews)
with successful trainees?
Responses are divided almost equally between organizations that have and have not conducted an evaluation study with
successful trainees. Almost 48 percent of survey participants say they have used this as a learning evaluation method.
Although this does not necessarily contradict responses to Q10, it does show that many companies are doing more than
just checking learning assessments. And it may reveal that there is not a clear understanding of all the techniques in the
K/P model. In a sense, the companies that are conducting evaluation interviews with successful trainees are using a version
of the Brinkerhoff Success Case Method.
Has your organization ever conducted an evaluation study (e.g., interviews) with successful trainees?
Percentage of total
Responses respondents
Yes 47.7%
No 52.3%
While this data tells us that respondents are almost equally divided in their use of the Brinkerhoff method, it does not tell
us about the potential effectiveness of this method. To answer that question, a one-way analysis of variance (ANOVA)
was conducted. The mean Evaluation Success Index scores of respondents who answered “Yes, we have conducted an evaluation
study” were compared with those of respondents who answered that they had not. The group that answered “yes”
had an average of 3.2 compared to the “no” group, whose average score was 2.8. This difference was significant (F=34.9,
p<.01), which indicates that those respondents who have conducted evaluation studies report greater evaluation success.
No significant difference was found between the two groups for the Market Performance Index.
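For readers who want to reproduce this kind of test, a one-way ANOVA with two groups simply compares between-group and within-group variance. The sketch below uses only the standard library and invented miniature samples; the study's actual groups were far larger (means of 3.2 versus 2.8, F=34.9).

```python
# A minimal one-way ANOVA, as described above, comparing mean ESI scores
# for "yes" vs. "no" respondents. The two samples below are invented.
from statistics import mean

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA across the given samples."""
    all_values = [v for g in groups for v in g]
    grand_mean = mean(all_values)
    # Between-group sum of squares and degrees of freedom
    ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups)
    df_between = len(groups) - 1
    # Within-group sum of squares and degrees of freedom
    ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    df_within = len(all_values) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

yes_group = [3.0, 3.3, 3.7, 3.0, 3.3]  # conducted evaluation studies
no_group = [2.7, 3.0, 2.7, 2.3, 3.0]   # did not
print(round(one_way_anova_f(yes_group, no_group), 2))
```

The resulting F statistic would then be compared against the F distribution with the corresponding degrees of freedom to obtain a p-value, as the study does when it reports p<.01.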
In regard to those evaluation studies of successful trainees, please state whether your organization
has taken the following actions (select all that apply):
We have tracked the factors that enhance or impede business impact. 36.5%
If your organization uses Level 3 learning metrics (that is, the evaluation of behaviors),
when does it regularly take measurements? (select all that apply)
To what extent does your organization use learning evaluation results to take each
of the following actions, and to what extent should it?
To what degree does your organization use the following approaches for measuring the behavior and
the application or transfer of information (that is, Level 3 of the Kirkpatrick/Phillips model)?
To what degree does your organization use the following approaches to measure program impact
or results (that is, Level 4 of the Kirkpatrick/Phillips model)?
To what extent are the following seen as barriers to the evaluation of learning in your organization?
Responses | Percentage who see these as barriers to a high or very high extent | Correlation with Evaluation Success Index (ESI) | Correlation with Market Performance Index (MPI)
Too difficult to isolate training’s impact on results versus other factors’ influence | 51.7% | -.18** | -.09*
Our LMS does not have a useful evaluation function | 40.8% | -.21** |
Evaluation data is not standardized enough to compare well across functions | 38.0% | -.23** |
It costs too much to conduct higher-level evaluations | 32.2% | -.10** |
Leaders generally don’t care about evaluation data | 24.1% | -.12** | -.12**
Evaluation data is too difficult to interpret for most people | 18.9% | |
What percentage of the following learning programs are evaluated at each level
of the Kirkpatrick/Phillips five-level evaluation model?
Q21: What percentage of all learning programs that are delivered using the following two learning
delivery methods are evaluated at each level of the K/P evaluation model?
Responses | Not at all | Small extent | Moderate extent | High extent | Very high extent | Correlation with Evaluation Success Index
For tracking pre- and post-training performance | 36.5% | 32.4% | 19.7% | 8.8% | 2.6% | .37**
Q23: Please indicate your organization’s total expenditures for employee learning in 2008.
The average expenditure across all organizations for employee learning in 2008 was $5,570,000. However, the average
amount spent differs depending on the size of the organization. In general, smaller and midsize organizations spent
significantly less than large organizations in total for employee learning in 2008.
Please indicate your organization’s total expenditures for employee learning in 2008.
Organizational Size Average expenditure
Small (fewer than 100 employees) $68,000
Midsize (from 100 to 10,000 employees) $1,942,000
Large (10,000 or more employees) $19,195,000
All organizations $5,570,000
Please indicate what percentage of your organization’s total expenditure for employee learning is
allocated to evaluation.
Q25: Of the total expenditure for learning evaluation, what percentage is spent on:
The largest percentage of the total expenses for learning evaluation is spent on internal resources (68.2 percent)
as compared with outside products/services (31.8 percent).
Of the total expenditure for learning evaluation, what percentage is spent on:
Responses Average percentage
Outside products/services, consultants, workshops,
and external training programs
Small organizations 41.8%
Midsize organizations 30.9%
Large organizations 27.6%
All organizations 31.8%
Internal resources such as learning staff salaries, administrative
costs, development costs, and internal IT costs
Small organizations 58.2%
Midsize organizations 69.1%
Large organizations 72.4%
All organizations 68.2%
Of the total expenditure for learning evaluation, what percentage is spent on the following:
Responses | Average percentage | Correlation with Evaluation Success Index (ESI) | Correlation with Market Performance Index (MPI)
Reactions of participants (Level 1) | 49.8% | -.38** | -.12*
David Wentworth has been a senior research analyst for the Institute for Corporate Productivity since 2005. David has previously worked with digital media development and delivery, and he currently researches a variety of topics for i4cp, including performance management, workforce technology, compensation trends, and the outsourcing of human resources. Contact information: 727.345.2226 or david.wentworth@i4cp.com.

Holly B. Tompson, PhD, is a senior research analyst at i4cp. Holly has taught in the management departments of several universities, including the University of Waikato in Hamilton, New Zealand, and, most recently, the University of Tampa. Her research has focused on work-life balance and leadership development, with an emphasis on training high-potential employees to sustain maximum success without burnout. Holly is also active in the University of Tampa’s Executive Education program, where she is currently a leadership and development coach. Contact information: 813.601.5638 or holly.tompson@i4cp.com.

Mark Vickers is the vice president of research at i4cp and served as editor of this report. He has authored many reports and white papers for the institute and worked as managing editor for the Human Resource Institute. He is also the editor of i4cp’s TrendWatcher and has authored and coauthored numerous periodical articles. Contact information: 727.345.2226 or mark.vickers@i4cp.com.

Mike Czarnowsky is former director of research with the American Society for Training & Development (ASTD). In that capacity, Czarnowsky led the ASTD Research team in the strategy, planning, and execution of all research studies, the Workplace Learning and Performance Scorecard, and the ASTD Benchmarking Forum.

ASTD Research jointly determined the initial concept and design of the project, jointly developed the survey instrumentation, conducted supplemental statistical analyses of the survey data, and jointly interpreted the findings. Contact information: 703.683.8100 or ASTDResearch@astd.org.

OTHER CONTRIBUTORS

Various staff members of the Institute for Corporate Productivity provided background research, writing, and other support for this report. Special thanks to Greg Pernula, who worked on the survey implementation, to Joe Jamrog, who created the graphs, to Mindy Meisterlin, who created tables and worked on data analysis, and to Judy Wall, who proofed the report. Also thanks to Mitch LaQuier and Ellen Serrano.
The American Society for Training & Development

The American Society for Training & Development (ASTD) is the world’s largest association dedicated to workplace learning and performance professionals. ASTD’s members come from more than 100 countries and connect locally in 136 U.S. chapters and 25 global networks. Members work in thousands of organizations of all sizes, in government, as independent consultants, and as suppliers. ASTD started in 1944 when the organization held its first annual conference. ASTD has widened the profession’s focus to link learning and performance to individual and organizational results and is a sought-after voice on critical public policy issues. For more information, visit www.astd.org.

The Institute for Corporate Productivity

The Institute for Corporate Productivity (i4cp) improves corporate productivity through a combination of research, community, tools, and technology focused on the management of human capital. With more than 100 leading organizations as members, including many of the best known companies in the world, i4cp draws upon one of the industry’s largest and most experienced research teams and executives-in-residence to produce more than 10,000 pages of rapid, reliable, and respected research annually, surrounding all facets of the management of people in organizations. Additionally, i4cp identifies and analyzes the upcoming major issues and future trends that are expected to influence workforce productivity and provides member clients with tools and technology to execute leading-edge strategies and “next practices” on these issues and trends. For more information, visit www.i4cp.com.
ISBN 978-1-60728-365-2
$695.00