
THE KODO WAY

SIX STEPS FOR EVALUATING


TRAINING EFFECTIVENESS

INTRO

There are many different proposals for how to evaluate learning, or the effectiveness of, say, a training or development programme. At Kodo Survey we know that theory does not always equal practice, so we've created The Kodo Way based on academic recommendations (such as those of Kirkpatrick, Brinkerhoff, Phillips and Bloom) combined with practical experience.

Combining these models with practice yields an effective method for evaluating the effect of training and development and maximizing ROI.

The Kodo Way is built on six steps:

Prioritize → Align → Design → Translate → Evaluate → Share

STEP 01: Prioritize among your interventions
Identify the learning interventions – courses, e-learnings, programmes and so on – whose effectiveness you really need to evaluate.

STEP 02: Align everything to business objectives
Define the end results your stakeholders are looking to influence with the help of the interventions. Ensure alignment all the way through: the behaviours driving those results and, finally, the learning objectives developing those behaviours. Use Bloom's taxonomies for this.

STEP 03: Design the intervention (optional)
In many cases the intervention already exists, and you simply jump to the next step. But every now and then you have the luxury of measuring effectiveness from the start, because the intervention itself isn't yet developed. If that's the case, it's now time to design it.

STEP 04: Translate the objectives and the learning material into questions
Translate all learning objectives and the material into questions that are free from biases and ensure objective reporting on both learning and behaviour (i.e. Kirkpatrick's levels 2 and 3).

STEP 05: Evaluate the results as they come in
Set everything up in Kodo Survey and follow the results as they come in. Take decisions to continuously improve the short- and long-term effect of your intervention.

STEP 06: Share the results and learnings
First of all, celebrate success within the team. Also share the results with the stakeholders, either continuously as they come in or at the end of a programme. Finally, share both successful and unsuccessful lessons learnt with those who can benefit.

STEP 1

PRIORITIZE

Strategic weight and volume – two obvious prioritization criteria

Of course you'll need to understand the purpose of your programmes. Hopefully you have had conversations with relevant stakeholders about how the programmes fit into the business agenda. If you haven't, now is the time to do so, because one of the prioritization criteria is strategic weight.

Basically, you need to compare your portfolio in terms of which programmes support the achievement of critical business objectives or must-win battles, and which are more nice-to-haves. Sometimes the overarching purpose of a programme is directly connected to strategic goals. On other occasions it's not that obvious and needs some more work. If you can't find the strategic fit of a particular programme, we'd say you have a problem: it risks being a huge waste of your organisation's resources.

The second prioritization criterion we see and recommend is usually, though not always, linked to the first: volume – or, in other words, cost. Assuming there are no programmes in your portfolio that lack a connection to strategic goals, you might then want to look at the volume, or cost, of the programmes when prioritizing.

Programmes with the highest strategic weight and the highest cost (including indirect and alternative costs) should be prioritized first.

STEP-BY-STEP:

• List all of your programmes.

• Evaluate the strategic weight of each programme (involve relevant stakeholders).

• If necessary, calculate the sum of direct and indirect costs, as well as alternative costs.

• Decide which programmes to measure (and perhaps which to close down completely).
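As a rough illustration, the two criteria can be combined into a simple ranking exercise. The programme names, weights and cost figures below are hypothetical, and sorting by weight first and cost second is just one possible sketch of the prioritization rule above:

```python
# Rank programmes by strategic weight first, then by total cost.
# All names and figures are illustrative, not real data.

programmes = [
    # (name, strategic weight on a 1-5 scale, direct + indirect + alternative cost)
    ("Leadership development", 5, 250_000),
    ("Forklift safety",        4, 400_000),
    ("Intro to spreadsheets",  2, 30_000),
]

# Highest strategic weight and highest cost come first.
prioritized = sorted(programmes, key=lambda p: (p[1], p[2]), reverse=True)

for name, weight, cost in prioritized:
    print(f"{name}: weight={weight}, cost={cost}")
```

A real scoring model would of course involve stakeholders in setting the weights, but the ordering logic stays the same.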

STEP 2

ALIGN

Now that you know what to focus on, it's time to outline the results (you should already have a clear understanding of these from step 1) and define the behaviours that will drive those results. When the behavioural drivers are defined, you should define the learning objectives.

What develops the driving behaviours? Ajzen's theory of planned behaviour gives us some inspiration on how to find that out, and it has inspired us to use the KAIB™ model. KAIB™ stands for Knowledge, Attitude, Intention and Behaviour.

[Diagram: Knowledge and Attitude feed into Intention, which leads to Behaviour.]

Related reading: Maximizing business impact from training and development

The idea is that if you want to develop new behaviours, you need to create new intentions. This is the immediate effect you should be looking for with your intervention: an intention to behave in a certain way in a specific, critical situation.

In order to create the intention to behave in a certain way, you convey knowledge about, for example, tools and how to use them. But that isn't enough. You also have to create motivation and a positive attitude towards the tools.

So, knowledge and a positive attitude will drive intentions, and intentions will increase the likelihood of new behaviour being developed. Certainly, there are situations and conditions that can hinder this behaviour, but if you diligently focus on conveying the knowledge, creating the attitude and shaping the intention with your learning interventions, your learners will have the best chance of success after the learning intervention.

When defining the behavioural drivers and the learning objectives, you therefore need to keep the KAIB™ model in mind, but also Bloom's taxonomies.

What well-defined behavioural drivers and learning objectives look like

When it comes to defining behavioural drivers and learning objectives, and then measuring them, you want to keep in mind how learners learn. Bloom's taxonomies are incredibly flexible and can be used in conjunction with most training and development programmes. Their popularity stems from the fact that they are highly adaptable and versatile, making them well suited to evaluating training effectiveness.

The common mistake is to write "know" at the beginning of all first-level learning objectives, "understand" at the beginning of all second-level ones, and "apply" for all third-level ones (i.e. for the behavioural drivers). Instead, you should use Bloom's taxonomies when defining behavioural drivers and learning objectives. This will help you design appropriate learning activities for that particular level of learning, and it will certainly help you when you're about to measure the effectiveness of your learning activity.

The thinking behind Bloom's taxonomies is simple:

• You must remember a concept before you can understand it.

• You must understand a concept before you can apply it.

• You must have analyzed a concept before you can evaluate it.

• To create something new, you must have evaluated other concepts.
BLOOM'S SIX TAXONOMY LEVELS:

01 REMEMBER – Action verbs such as 'recognizing' and 'recalling' tell the learner that the learning is at the lowest level of thinking.

02 UNDERSTAND – Work at this level is likely to require actions such as 'interpreting', 'exemplifying', 'classifying', 'summarizing', 'inferring', 'comparing' and 'explaining'.

03 APPLY – If learners are asked to 'implement' or 'execute' a task or action, they would likely be working at this level of thinking.

04 ANALYZE – At the analytical stage, learners are commonly asked to 'differentiate', 'organize' or 'attribute' facts, data or subject matter.

05 EVALUATE – Learners working at this high level of thinking may be asked to 'critique' or 'check' materials.

06 CREATE – In the revised Bloom's taxonomy, creating something original or substantially new is considered to be the highest level of thinking. Verbs such as 'generate', 'plan' or 'produce' tell learners that they are required to work at this level.

Related reading: Bloom's Taxonomy Levels of Learning: The Complete Post

STEP-BY-STEP:

• Involve the relevant stakeholders to list the results you're looking to achieve with your intervention.

• Use the KAIB™ model and Bloom's taxonomies to define the behaviour and learning objectives of your intervention. Make sure to involve the right subject matter experts when you do this:

  1. Define the driving behaviours (B).

  2. Define the intentions (I) that will drive the behaviour.

  3. Define the knowledge (K) and the attitudes (A) that develop the intentions.
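The three numbered sub-steps can be sketched as a small data structure that keeps each behaviour linked to the intentions, knowledge and attitudes that drive it. The field names and the example strings below are our own hypothetical choices, not Kodo Survey's format:

```python
from dataclasses import dataclass, field

@dataclass
class KAIBObjective:
    """One aligned chain: K and A develop I, and I drives B."""
    behaviour: str                                        # B: defined first
    intentions: list[str] = field(default_factory=list)   # I: drives B
    knowledge: list[str] = field(default_factory=list)    # K: develops I
    attitudes: list[str] = field(default_factory=list)    # A: develops I

# Hypothetical example for a feedback-skills programme.
obj = KAIBObjective(
    behaviour="Gives specific, timely feedback after every sprint review",
    intentions=["Intends to give feedback within a day of the review"],
    knowledge=["Can apply a feedback model to a concrete situation"],
    attitudes=["Sees feedback as help, not criticism"],
)
```

Keeping the chain in one record makes it easy to spot gaps, for example a behaviour with no attitude objective behind it.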

STEP 3

DESIGN (optional)

Now that you've finished the first two steps in our 6-step model for evaluating training effectiveness, you should have enough to start designing the actual programme. If the programme is already designed, the process through the first two steps might still have given you a lot to think about, in particular the second step, Align. Our customers often discover, when using the Kodo Way, that they have stuffed their programmes with too many learnings. Sometimes it also becomes clear that the design targets taxonomy levels that are lower than the learning objectives. Many times they decide to redesign or adjust the programmes.

There's plenty to think about when designing learning interventions. Since this is an optional step that most often isn't relevant, this section simply presents a few fundamental tips from an integrative review of the literature on training transfer:

• Open all learning interventions, no matter what type they are, with the purpose and the goals. This sets the foundation for what to focus on.

• Make all content relevant: the learnings should be both applicable to work and delivered in a way that clarifies their relevance.

• Practice a lot in a safe environment, incorporate feedback and seize opportunities to reinforce learning. A good rule of thumb: if it's not done correctly in a safe environment, it will likely not be done correctly in real life.

• Don't sugar-coat. Be clear and concise. Saying "it's good to" is not as effective as saying "you have to".

• Demonstrate what's good and what's not. Just as it's good to demonstrate positive examples, research points out that detailed examples of how not to act, and what happens as a result, can help cement what's been taught.

Related reading: Training Transfer: An Integrative Literature Review

STEP 4

TRANSLATE

You will now be introduced to how to define questions that are free from biases and ensure objective reporting on Kirkpatrick's levels 2 and 3 (i.e. learning and behaviour).

You might have plenty of interventions going on to drive learning in your organization, but there's no guarantee that the learning quality is high, that learning transfer takes place and that new behaviours are developed. Now that we've covered the first three steps for evaluating training effectiveness, it's time to look at exactly how to measure that.

A web search on how to determine the effectiveness of your training and development will give you plenty of inspiration on the topic. There are qualitative ways of doing this, such as the success case methodology, and there are quantitative ways, such as through measurements.

It's easy enough to measure level 1; there are some generic questions you can use in a training effectiveness evaluation form at that level. Most also know how to look at Kirkpatrick's level 4. Almost everybody agrees that it's harder to evaluate effectiveness on Kirkpatrick's levels 2 and 3.

Kodo Survey is all about making the evaluation of training effectiveness easy. We therefore offer an automated way to evaluate training effectiveness with our SaaS platform. Moreover, we focus on Kirkpatrick's levels 2 and 3, since level 1 has been shown not to correlate with learning transfer and behavioural development.

Related reading: The correlation of participant satisfaction, learning success and learning transfer

In order to evaluate the effectiveness of your training, you need to evaluate knowledge gained, changed attitudes, created intentions and developed behaviour. As you can see, the KAIB™ model comes into play again. Luckily, we've already defined measurable learning objectives with Bloom's taxonomies and KAIB™ in mind.

What you should think of when defining questions:

• Rather than asking people to evaluate their own level of knowledge, a good way to measure it objectively is through single-choice or multiple-choice questions. If the learning is about recognizing different colors, don't ask "How good are you at recognizing different colors, on a scale of 1-5?". Instead, show them a picture of the color blue and ask them what color it is.

• Different types of questions suit different taxonomy levels. A knowledge question therefore needs to be defined depending on the taxonomy level of the learning objective it's supposed to measure.

• When you offer alternatives in your single-choice and multiple-choice questions, make sure they all look equally correct to the untrained person. This makes it harder to guess the right answer. Also make sure there's always a "Don't know" alternative, so that you don't force people to guess.

• We never recommend questions answered on a scale, such as a 5-point Likert scale. But if you can't avoid it, make sure to write the questions in a way that avoids a cap effect.

• The critical incident technique is a good tool for evaluating intention and behaviour:

  – For measuring intentions, you look forward: "Imagine that you were in this situation [describe situation]... How would you act?"

  – For measuring behaviour, you look backwards: "Think back over the last three months when you've been in this situation [describe situation]… How did you act?"

Download our ebook about determining and optimizing the impact of training and development to get more insight and in-depth information on this matter.

Don't worry: if you've decided to use Kodo Survey to automate your measurement of training effectiveness, you will be taken on a smooth learning journey that gives you an in-depth understanding of this. Once you get started, you will get the tools and the right skills to deal with this in an easy way.

Related reading: Determining and optimizing the impact of your training and development
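The question-design rules can be captured in a small check for draft knowledge questions: no self-rating prompts, the correct answer among the alternatives, and a mandatory "Don't know" option. The structure below is our own illustrative sketch, not the Kodo Survey question format:

```python
from dataclasses import dataclass

DONT_KNOW = "Don't know"

@dataclass
class KnowledgeQuestion:
    prompt: str
    alternatives: list[str]   # these should all look equally plausible
    correct: str

    def problems(self) -> list[str]:
        """Flag violations of the question-design rules above."""
        issues = []
        if DONT_KNOW not in self.alternatives:
            issues.append("missing a 'Don't know' alternative")
        if self.correct not in self.alternatives:
            issues.append("correct answer is not among the alternatives")
        if "how good are you" in self.prompt.lower():
            issues.append("self-rating question; ask for the fact instead")
        return issues

q = KnowledgeQuestion(
    prompt="What color is shown in the picture?",  # hypothetical example
    alternatives=["Blue", "Teal", "Navy", DONT_KNOW],
    correct="Blue",
)
print(q.problems())  # [] – passes the checks
```

Whether the distractors really look "equally correct" to an untrained person still needs a human or a pilot test; a script can only catch the mechanical rules.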

STEP 5

EVALUATE

Now that your training programme is designed and you know how to measure it, it's time to monitor the results as they come in. So, what happens now?

Now it's time to measure, monitor and quickly take action on issues so that you correct and improve your intervention. Typically you should measure before a programme, right after a programme, and 3-6 months down the line. This is usually referred to as pre-, post- and job-evaluation (or testing).

Check the metrics regularly to make sure everything is top-notch: that the right people are attending the trainings, that the quality is high, that learning transfer takes place, that new behaviours are developed, and so on.

Remember to review the results with relevant stakeholders to ensure transparency around the metrics and to get their feedback – for improvement, or for celebration.

Your metrics for evaluating training effectiveness

The metrics you choose to measure on Kirkpatrick's levels 2 and 3 should reflect the effect on learning, learning quality, learning application and behavioural development.

It should also be clear what values you're after, so decide in advance what counts as success – which results are acceptable and which are not. These values are not always the same. Our experience is that they differ depending on whether it's technical training or, say, leadership development. They may also change over time or across cultures (if you're looking at a global programme).
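The pre-, post- and job-evaluation pattern can be summarized with two simple metrics: learning gain (pre → post) and retained transfer (post → job). The formulas and example figures below are our own hypothetical sketch, not Kodo Survey's actual metrics:

```python
def learning_gain(pre: float, post: float) -> float:
    """Share of the possible improvement actually achieved (scores in 0-1)."""
    room = 1.0 - pre
    return (post - pre) / room if room > 0 else 0.0

def retention(post: float, job: float) -> float:
    """How much of the post-training level is still applied on the job."""
    return job / post if post > 0 else 0.0

# Hypothetical cohort averages: 40% before, 85% after, 75% on the job.
gain = learning_gain(0.40, 0.85)   # 0.75 of the possible improvement
kept = retention(0.85, 0.75)       # roughly 0.88 of the learning transferred
print(f"gain={gain:.2f}, retention={kept:.2f}")
```

Whatever formulas you use, the point from the text stands: decide the acceptable values for each metric before the data starts coming in.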

Follow the results as they come in

• Identify the pre-learning level as soon as the data comes in, so that you can adjust the content or the design of your programme and ensure that you use the time at hand in an optimal way.

• Evaluate the training effect of the learning intervention straight afterwards, both in order to take action if some learning is still missing and to drive learning transfer, and also to use your data-driven insights to improve your programme.

• Follow the long-term effect of your programme; monitor learning transfer and behavioural development. Take action to enable on-the-job application of learning that took place in the learning intervention but doesn't seem to be transferred to the job.

Don't wait to take action!

Let's say you're about to train 2,000 employees in order to increase productivity in a particular area. You've done the pre-, post- and job-evaluation and are following your dashboard closely.

For the first couple of programmes, the dashboard doesn't show the results you were looking for and expecting, which is an indication that you need to act. You should not wait here. You should immediately remedy the situation based on the data-driven insights you get from Kodo Survey. If needed, you should postpone the programme for the remaining participants in order to investigate the causes of the lack of results, identify actions for improvement, and implement those actions.

This, so that you don't waste another 1,800 (or so) employees' time on inadequate training. Think agile: you should have your MVP (minimum viable [training] programme), and don't hesitate to continuously act on the evaluation data that comes in.

STEP-BY-STEP:

• Monitor the evaluation data frequently using our dashboard.

• Start with a pilot to get data for improvement.

• Identify issues already in the pre-evaluation data, and then continuously as the data comes in.

• Determine whether the learning objectives of the programme were achieved.

• Evaluate the root causes of why you did or didn't achieve your learning objectives.

• Be quick to take action based on the data you get from Kodo Survey – think in agile terms and work with MVPs.

• Be transparent, share the results with stakeholders and agree on actions, since these sometimes require your stakeholders' involvement.

Related reading: Determining and optimizing the impact of your training and development
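The "don't wait" rule from the 2,000-employee example can be sketched as a simple threshold check on incoming evaluation data. The cut-off value and the score format are hypothetical; in practice you would use the acceptable values you agreed with stakeholders:

```python
# Pause the rollout as soon as a cohort falls below the expected level.
EXPECTED_POST_SCORE = 0.80  # hypothetical target for the post-evaluation

def should_pause_rollout(cohort_scores: list[float]) -> bool:
    """True when the average post-evaluation score misses the target."""
    avg = sum(cohort_scores) / len(cohort_scores)
    return avg < EXPECTED_POST_SCORE

first_cohorts = [0.62, 0.70, 0.66]  # illustrative results, not real data
if should_pause_rollout(first_cohorts):
    print("Pause the programme, investigate root causes, then resume.")
```

The check itself is trivial; the discipline is in running it after every cohort instead of waiting until all 2,000 employees have been trained.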

STEP 6

SHARE

You've now been introduced to all of the steps leading up to the Kodo Way for evaluating training effectiveness. Now it's time to share the insights. The sixth step – Share – can be done in parallel with the fifth step – Evaluate. So don't be misled when we call it step 6.

Tell a story with data

As you may have understood from the fifth step of our way to evaluate training effectiveness, our dashboard makes it easy for you to follow the results as they come in and to monitor progress, improvement areas and success. Our dashboards give you a clear overview of the metrics that are key when evaluating training effectiveness and learning impact. You will quickly be able to take action that drives quality and ensures long-term impact. You will, however, want to dive deeper into the data for a more thorough analysis, meaning that you need reporting with more detailed data, perhaps at the individual level or at the learning-objective level. Kodo Survey of course provides this, so that you can tell the whole story and identify the reasons your programme was or wasn't successful.

With the data, you can easily present your story in a way that suits the relevant stakeholders, internal or external.

Don't forget to include all things successful, in order to reinforce the value of training and development and how it enables organizational success. Take the opportunity our dashboard provides to make the business case for continued investment in L&D programmes.

You should also use the data to investigate areas where the programme hasn't delivered as expected, and use your learnings when developing programmes going forward.

STEP-BY-STEP:

• Celebrate and share successes.

• Use the dashboard to tell a good story about L&D, but also about what you have learned and improved during the journey.

• Think back and uncover any challenges or problematic areas.

• Perform a "lessons learned" session, document your lessons, share them, and make sure to take the learnings into account in the design of future programmes.

ABOUT KODO SURVEY

Kodo Survey enables organisations to determine the ROI of their training and development activities, maximize the effect of their investment in training and development, and cut unnecessary costs related to wasted training and development.

Our SaaS platform provides easy-to-use learning impact measurement, automates the sourcing of the learning data and does the analytics for the user.

Our vision is to make it possible for organisations of all sizes and types to evaluate their training and development initiatives, determine the ROI and increase their efficiency. We believe that learning is a must in order to progress and succeed with business and growth, and by offering Kodo Survey we contribute to the development of a sustainable future.

With more than 50 years of combined experience, from large international enterprises as well as local companies and municipalities, we can say with confidence that measuring learning impact is a challenge that traditional systems do not solve for our customers. But we are making it possible – in an easy way – for large as well as small companies and consultancies.

contact@kodosurvey.com www.kodosurvey.com
