
Customer satisfaction with training programs

Martin Mulder
Head of the Chair Group of Educational Studies, Department of Social Sciences,
Wageningen University and Research Centre, Wageningen, The Netherlands

Journal of European Industrial Training, 25/6 [2001], pp. 321-331

Keywords: Training, Evaluation, Training industry, Effectiveness, Training projects, Customer satisfaction

Abstract
In this contribution, a model of evaluation of customer satisfaction about training programs is described. The model is developed and implemented for an association of training companies. The evaluation has been conducted by an independent organisation to enhance the trustworthiness of the evaluation results. The model is aimed at determining the quality of training programs as perceived by project managers from the organisations that purchased in-company training programs from the training companies. Reliability research showed satisfying results. The model is based on the methodology of effectiveness research, and the data were used to test a model of training effectiveness. The results show that this model is confirmed for two categories of projects: projects that were aimed at achieving learning results and at changed job performance respectively. The model does not fit for projects aimed at supporting organisational change. Various questions as to the development of the evaluation model are discussed.

1. Introduction

Training organisations want to know the answers to questions about the quality of their training programs as perceived by their clients. If the results of such evaluations are disappointing, the training organisation can modify its policy in this respect, and if the results are promising, the training organisation can try to communicate this to prospects. Trustworthiness of the evaluation results is extremely important in this respect. In the market of knowledge-intensive services, clients are becoming more and more professional and critical, and no longer buy biased evaluations. Many clients mistrust evaluation data that are collected and presented by the training organisations themselves.

Under certain conditions, professional development can be better realised by a collective of training organisations than by an individual organisation. Examples of joint activities of associations of training organisations are: collective marketing, defending interests towards the tax system, public administration and certifying organisations, and evaluation research. For a group of training organisations in The Netherlands this was the reason for creating an association that conducts these tasks. This association, the VETRON, also wants to be perceived as the group of training organisations of excellent quality. The association has about 40 members. They are all autonomous. Some of them employ only a few trainers/consultants, others over 100. Their total joint turnover per annum is some hundreds of millions of Euros.

In this contribution a system for independent project evaluation is described that has been implemented for the VETRON. The member organisations face market- and impact-oriented questions like: "Do I deliver a contribution to the development of my clients?", "Did the recipients of my training projects become more competent for the performance of their increasingly complex tasks?", "Did their employability increase?", and "Do my training projects have the necessary impact for the client organisation?". They collectively supported independent training quality research to monitor their average quality and to improve their individual quality. Quality improvement of individual training organisations did not depend only on the system of project evaluation; other quality improvement instruments were used as well. The unique characteristic of the independent project evaluation was that the group of training organisations received comparative and historical trend data about their position within the group of training organisations, and about the development of the quality of their training services as perceived by their client organisations.

The convincing power of the project evaluation system that is described in this contribution is mainly a function of its independent character. An individual training organisation would also be able to have an independent evaluation study conducted, but towards the market this inspires less confidence than an independent evaluation study within the framework of an association. An individual training organisation could claim confidentiality from the research organisation when certain results are disappointing. They could demand concealment of such findings. At the level of an association, this is practically excluded because in that case there is no direct personal relationship between the interested directors of the training organisations and the researchers. For the VETRON this was the reason to contract a university to conduct independent project evaluation.
In the next part of this contribution, the evaluation system will be described. The third part presents the research program that was related to the development of the evaluation system. In the fourth part a model of training effectiveness is tested. This test is part of the research program, and it was an important step in the considerations about the further use of the system. Finally, in the fifth part, some implications and questions about the evaluation system are discussed.

2. The evaluation system

The evaluation system comprises a standardised evaluation procedure and instrument and a standard format for research reports for the training organisations.

As to the evaluation procedure, the research unit comprises a training project that is carried out by a training organisation that is a member of the VETRON. These are all training projects in which a question of a client organisation is the focal point. In many instances these are special projects that are an investment risk, of great importance from the perspective of the organisation's strategy, costly and innovative. But there are also customised projects that include a clear development component, which is clearly limited in scope and objective oriented. In a number of cases there are customised ready-made projects, which is the case when existing programs are adapted to a specific question of the client organisation. Standard training programs with open registration are excluded from this evaluation.

The population of training projects is divided into three sub-populations: in-company training projects; individual training projects; and training program development. The last group, however, is small, and is excluded from this contribution. This contribution is based on three rounds of data collection. The total population of training projects in the three evaluation periods comprises over 10,000 projects. The number of evaluated projects exceeds 1,200.

For each evaluation a representative sample has been drawn. For that purpose the training organisations have prepared client lists of the training projects for which, during a specified half-year, they sent invoices. To guarantee that these lists were complete, the director of the training organisation needed to sign a declaration in which it is specified that the list is complete. Furthermore, the research organisation had the right to conduct audits at training organisations. During such an audit, the invoice and project administration could be inspected and compared with the client list submitted.

The training organisations needed to complete forms in which several items were asked for each training project: the name of the project, the name and address of the client, the name and address of the training organisation and the name of the trainer/consultant. For each evaluation round this resulted in a file of over 3,000 projects. From this file a random sample was drawn. At the start, the clients were informed by the training organisations about the evaluation. The independent training evaluation is part of the official delivery specifications of the VETRON organisations. Clients are requested to participate in the study. Clients get the VETRON delivery regulations when they contract one of the member training organisations. In addition, it was confirmed to all clients that a data collection phase would start again, just to remind the clients that they could expect a questionnaire. The questionnaire was sent out subsequently, and after two weeks all respondents were called by an organisation for market research. The data were processed, and a benchmark report for each training organisation was produced.
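The sampling step can be pictured with a short sketch. The following Python fragment is only a minimal illustration of the procedure, not the one actually used in the study: the file name, column names and sample size are assumptions made for the example.

```python
import pandas as pd

# Assumed layout of the submitted client lists: one row per invoiced
# training project with the items the organisations had to report
# (project name, client, training organisation, trainer/consultant).
client_list = pd.read_csv("client_lists_half_year.csv")  # hypothetical file

# Draw a random sample of projects to receive the questionnaire.
# The real sample size was set per evaluation round; 400 is illustrative
# (over 3,000 projects were listed per round, over 1,200 evaluated in total).
sample = client_list.sample(n=400, random_state=42)

# The sampled clients are first announced by the training organisations,
# then mailed the questionnaire and called by phone after two weeks.
sample.to_csv("questionnaire_mailing.csv", index=False)
```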
The evaluation instrument consists of a closed questionnaire that was developed in a pilot project. The list is based on the literature on evaluation methodology for corporate training and development (Hamblin, 1974; Brinkerhoff, 1987; May et al., 1987; Bramley, 1991; Phillips, 1991; Basarab and Root, 1992; Kirkpatrick, 1994). Furthermore, it is based on insights in the field of integrated and result-oriented training and development (Camp et al., 1986; Gaines-Robinson and Robinson, 1989), and the training marketing literature (Gilley and Eggland, 1989; Swanson, 1994).

The questionnaire consists of eight blocks of questions:
1 general questions;
2 questions about the training project;
3 questions about the objectives of the training project;
4 questions about the agreements about the responsibilities for attaining the objectives of the training project;
5 questions about the results of the training project;
6 questions about the success attribution of the attained objectives;
7 questions about the agreements on aspects of the training project, meeting these agreements, as well as about the satisfaction regarding the aspects of the training projects; and
8 questions about the contacts with representatives of the training organisation.

Within the group of general questions a question is asked about the general satisfaction with the training project as a whole. The respondent can give a score of 1 (very bad) to 10 (excellent) for this question. All intermediate values on this scale are labelled, and the respondents are used to this rating scale. This parameter is indicated as the total satisfaction indicator (TSI).

The questions about the objectives of the training project include a question about the level to which certain objectives are of importance within this project. Three categories of objectives are distinguished here. These are objectives that are, respectively, aimed at:
1 attaining learning results (knowledge, skills, attitudes);
2 achieving changed work behaviour in the work situation; and
3 supporting change of the organisation.

It is possible that in certain training projects different categories of objectives are combined. The categories of objectives are based on the categories of training results as developed by Kirkpatrick (1994).

The questions about the division of responsibilities for attaining the objectives of the projects are related to the fact that for attaining training effects in practice, there are always two partners who put in their effort: the training organisation and the client organisation. For attaining learning results this probably is less the case, but for attaining changed work behaviour, and certainly for supporting organisational change, the client organisation itself has a reasonably large responsibility. And the training organisation can only be held responsible for that part of the effectiveness of training projects for which it is contracted by the client organisation.

As to the results of the training projects, two variables are distinguished: the correspondence between the results of the project and the expectations (this is indicated as "expectation realisation"), and the level to which the objectives of the training project are attained. As for the attainment of objectives, three categories of objectives are distinguished, corresponding with three levels of attainable results: learning results, work behaviour, and organisational change.

Regarding the questions about the attribution of the results of the training projects, two categories of projects are distinguished: projects in which the objectives are attained or not. For the first category of projects the question is to what level the attainment of objectives can be attributed to the efforts of the training organisation. For the second category of projects the question is to what level the training organisation is accountable for the disappointing results.

In the seventh block of the questionnaire, a distinction is made between phases in the training project and the administrative handling of the project. Three phases are distinguished: before, during, and after the training. Within these phases, a number of topics are distinguished. Three questions are asked about these topics: whether or not the client and training organisation made agreements about them, the level to which the training organisation met the agreements, and the satisfaction about all these aspects (irrespective of agreements which were made on the topics). The topics about which these questions are asked are as follows:
1 Before the training:
. determining the target group;
. determining the learning needs of the participants;
. the training design.
2 During the training:
. the materials and technical equipment;
. the role of the trainer(s)/consultant(s);
. the duration of the training project.
3 After the training:
. the way in which the training project is being evaluated;
. coaching of the application of the newly acquired knowledge and skills in the workplace;
. a final report prepared by the training organisation.
4 Administrative handling:
. a cancellation regulation;
. the invoicing procedures;
. property and authors' rights.
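To make the structure of the instrument concrete, the sketch below shows how one completed questionnaire could be represented as a record. It is only an illustration of the blocks and topics described above; the field names and the Python representation are assumptions, not the format used in the actual study.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

# The 12 delivery topics of block 7, grouped by phase and administrative handling,
# as listed above (names invented for the example).
TOPICS = [
    "target_group", "learning_needs", "training_design",        # before the training
    "materials_equipment", "trainer_role", "duration",          # during the training
    "evaluation_method", "transfer_coaching", "final_report",   # after the training
    "cancellation_regulation", "invoicing", "property_rights",  # administrative handling
]

@dataclass
class EvaluationRecord:
    """One client response; field names are illustrative only."""
    tsi: int                                  # total satisfaction indicator, 1 (very bad) .. 10 (excellent)
    objective_importance: Dict[str, int]      # importance per category: learning / behaviour / organisation
    expectation_realisation: int              # 1..5, not at all .. completely met expectations
    objective_realisation: Dict[str, int]     # attainment per objective category
    success_attribution: Dict[str, int]       # share of success attributed to the training organisation
    agreement_made: Dict[str, bool] = field(default_factory=dict)       # block 7: agreement made per topic
    agreement_kept: Dict[str, Optional[int]] = field(default_factory=dict)  # 1..5, not .. completely kept
    topic_satisfaction: Dict[str, int] = field(default_factory=dict)    # 1..5, very dissatisfied .. very satisfied
```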
Following the above topic questions, there is also a question about the reason why the client may be dissatisfied with certain aspects. Furthermore, the question is asked whether dissatisfaction about (any of) these aspects has led to problems with the training organisation. If that was the case, the respondent is asked whether these problems have already been solved, and if not, whether the respondent would appreciate it if the research organisation contacted the training organisation about this to solve the problem. If the respondent agreed with this, the training organisation was notified about the complaint of the client organisation, and it (the training organisation) was requested to contact the client about this. Furthermore, it was agreed with both parties that the research organisation would contact the client organisation again after two weeks, to learn whether the complaints were handled satisfactorily. If there had been no contact between the training and client organisations in this case, or the complaint was not resolved, then the research organisation would communicate the complaint to the board of the association.

Finally, in the last block of the questionnaire, questions are asked about the contacts the respondent has had with representatives of the training organisation and how satisfied he/she is about these contacts. The respondent is asked to mention the names of the persons with whom he/she has had contact. Three categories of persons are distinguished in these questions: the executives, the trainers/consultants and the administrative/supporting personnel.

In the benchmark reports, the means of the organisations and the means of the association are contrasted. The reports mainly consist of a graphical representation of the results; for each question from the questionnaire a figure is made. For reasons of clear communication of the graphics, the respective question is formulated in the title of the graphic. Furthermore, executive summaries are added, in which the design of the study and the most important results are given. Furthermore, ranking lists of training organisations were made. For that list the total satisfaction indicator was used. As of the third research period two ranking lists were developed. The first is based on the results of the third research period, the second on the historical average of the total satisfaction indicator. The ranking based on the average total satisfaction leads to a decrease in the dynamics in the ranking list and indicates an average level of quality of the training organisations.

3. The research program

Owing to the longitudinal character of the project evaluation, it is possible to conduct additional research into the quality of the research instrumentation and into the relationships within the database. The analysis of the database allows us to scrutinise the relationships between the factors that are included in the questionnaire.

During the research project, the following research phases can be distinguished:
1 Instrument development and pilot-tests; during this period the main interest was aimed at the value of the total satisfaction indicator.
2 Development of the ranking lists, the service reliability of the training organisations (did they actually do what they promised?), the relationships with the total satisfaction indicator and the reliability of the data.
3 Insight into the dynamics of the total satisfaction, and an interest in the historical average of the training organisations, the variation in their effectiveness, and the relationships between the preparation of training projects and their implementation on the one hand and the effectiveness on the other hand.
4 Emphasis on increasing the absolute response rate per training organisation. This was done also to increase the usability of the evaluation results by the training organisations.
5 Presentation of the performance of training organisations by performance profiles instead of separate indicators like the total satisfaction indicator. Furthermore, interest grew in a model test on the data set and in validity research.

At the same time there was much interest in the reliability of the data. To test that, a first look was taken at the representativity of the response group. A comparison was made between the response group and a non-response group (one that was willing to participate in a non-response study). It was found that ratings of training projects by the non-response group did not systematically deviate from the ratings of training projects by the response group. Subsequently, a test-retest reliability analysis was conducted. A number of respondents were asked to give a second rating of the training project. It appeared that there was a high level of correspondence between the first and second ratings by the respondents. The respondents who participated in the test-retest reliability analysis were also asked to nominate a candidate for an independent rating as second rater (as a second opinion about the same training project). These candidates were approached for an inter-rater reliability analysis. The results indicated that second raters do not evaluate the training projects differently than the first raters (Mulder, 1996).
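The test-retest and inter-rater checks amount to comparing two sets of ratings of the same projects. The sketch below is one common way to operationalise "high correspondence" and "no systematic deviation" (a Pearson correlation and a paired t-test); it is an assumed illustration, not necessarily the analysis reported in Mulder (1996), and the ratings are invented.

```python
import numpy as np
from scipy import stats

# Hypothetical TSI ratings (1-10) of the same training projects:
# the first rating, a repeated rating by the same respondent (test-retest),
# and a rating by an independently nominated second rater (inter-rater).
first_rating  = np.array([8, 7, 9, 6, 8, 7, 8, 9, 5, 7])
second_rating = np.array([8, 7, 8, 6, 8, 8, 8, 9, 6, 7])
second_rater  = np.array([7, 7, 9, 6, 7, 7, 8, 8, 6, 7])

# Test-retest reliability: correlation between first and repeated ratings.
r_retest, _ = stats.pearsonr(first_rating, second_rating)

# Systematic deviation between first and second raters: paired t-test.
t_stat, p_value = stats.ttest_rel(first_rating, second_rater)

print(f"test-retest correlation: {r_retest:.2f}")
print(f"rater difference: t = {t_stat:.2f}, p = {p_value:.2f}")
```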
After the reliability analysis, the interest went to the considerable dynamics in the ranking lists of the training organisations. There were suggestions that the small response numbers of mainly small training organisations and the non-response caused the dynamics. This hypothesis, however, was incorrect as the reliability of the data was high. The answer to the question of the cause of the dynamics can also be sought in the statistical characteristics of the sample. Regression to the mean seems a more plausible explanation for the dynamics. But the small differences between the means of the organisations may also be an explanation. These differences are so small that it may actually be unjustified to use them for ranking the training organisations. This needs to be analysed more thoroughly, since it would imply that the dynamics in the ranking have little to do with substantial differences in quality of the training organisations. And this would lead to the conclusion that developing the ranking list, based on the total satisfaction indicator, would not be justified.

Next, interest grew in the variation in effectiveness of the training projects. It appeared that there was considerable variation on the questions about the results of the training projects. The effect parameters thereby received more attention. It was demonstrated that there was a significant relationship between the total satisfaction and the attained learning results, also with the changed work behaviour and, to a lesser degree, also between the total satisfaction and the organisational change, but the absolute magnitude of the relationships appeared to be weak (see Mulder and Van Ginkel, 1995).

4. Model testing: training effectiveness

Based on these preliminary insights, a test is performed with a model for training effectiveness, in which three latent variables are distinguished: first of all, two latent predictor variables, "Project definition" and "Project implementation", and second, the latent criterion variable "Project effects".

The variable "Project definition" comprises three items:
1 Objective operationalisation: the level to which the objectives of the training project are specified; this item is of ordinal level, has five values and the extremes are "general" and "specific".
2 Distribution of responsibility: the level to which the training organisation carries responsibility for attaining the results; this item is of ratio level, has ten values and indicates the percentage of responsibility the training organisation has in attaining learning results, changed work behaviour and organisational change. This item consists of three partial items, and accordingly three scores.
3 Condition registration: the delivery conditions about which agreements have been made between the client organisation and the training organisation; this is a dichotomous item (about the respective conditions on which agreements are made or not), consisting of the 12 partial items within the phases before, during and after the training and the administrative handling.

The variable "Project implementation" also comprises three items:
1 Total satisfaction: this is the satisfaction about the total project handling, the preparation of the project, the implementation of it and the follow-up activities; it is an ordinal item with ten values, varying from very bad to excellent satisfaction about the total handling of the training project.
2 Condition-realisation consistency: the delivery reliability of the training organisation; like the item "condition registration", this item consists of 12 partial items that correspond with the aspects about which client organisations have made agreements with the training organisations; these are ordinal items with five values, varying from not at all to completely sticking to agreements.
3 Condition-realisation satisfaction: the level of satisfaction about the performance of the training organisations with respect to the possible delivery conditions; just like the item "condition-realisation consistency", this item consists of 12 partial items which in this case correspond with the aspects (or conditions) about which client organisations may have made agreements with the training organisations; these are also ordinal partial items with five values, varying from being very dissatisfied to being very satisfied with the performance of the training organisation on the different aspects.

And the variable "Project effectiveness" also consists of three items:
1 Expectation realisation: the level to which the project results meet the expectations of the client organisation; this is an item of ordinal level with five values, varying from not at all meeting to completely meeting the expectations regarding the total results of the training project.
2 Objective realisation: the level to which the intended objectives of the training project are achieved; three categories of objectives are distinguished here: those aimed at reaching learning results, changed work behaviour, and organisational change respectively; these are three partial items at interval level, varying from having not at all to completely attained the project objectives.
3 Success attribution: the level to which the training organisation has been responsible for attaining the intended objectives; a distinction is made here again between the objectives at the level of learning results, work behaviour, and organisational change; these are also three partial items at interval level, varying from being unable to attribute the success of the training project at all to completely attributing it to the training organisation.
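In conventional structural equation modelling terms, the nine items serve as indicators of the three latent variables, with the two predictor variables explaining project effectiveness. The sketch below writes such a model in the lavaan-style syntax of the Python package semopy. It is a stand-in illustration only: the original analysis was done in LISREL with unweighted least squares estimation, and the observed variable names and data file are invented here.

```python
# Sketch of the latent variable model described above, using semopy.
# Column names of `df` are invented; each row would hold one evaluated project.
import pandas as pd
from semopy import Model

MODEL_DESC = """
# measurement part: three indicators per latent variable
definition     =~ objective_operationalisation + responsibility_distribution + condition_registration
implementation =~ total_satisfaction + condition_realisation_consistency + condition_realisation_satisfaction
effectiveness  =~ expectation_realisation + objective_realisation + success_attribution

# structural part: definition works through implementation on effectiveness
implementation ~ definition
effectiveness  ~ implementation + definition
"""

df = pd.read_csv("learning_results_projects.csv")  # hypothetical item scores

model = Model(MODEL_DESC)
model.fit(df)            # semopy's default objective is used here purely for illustration;
                         # the original study used unweighted least squares in LISREL
print(model.inspect())   # estimated loadings and structural coefficients
```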
For the model testing, three groups of training projects are distinguished: a group of 569 projects which were aimed at achieving learning results, a group of 433 projects which were aimed at realising changes in work behaviour, and a group of 206 projects which were aimed at reaching organisational change. A LISREL analysis is performed, in which unweighted least squares estimation is used; this is a procedure that is suited for data that are not normally distributed. Furthermore, error terms are correlated where the results of the analyses indicated this.

The results of the analyses are depicted in Figures 1-3. In Figure 1 the results are presented of the projects that were aimed at achieving learning results. Since the value of χ² has a p-value of 0.10, it can be concluded that the model fits. As appears from the coefficients, there is a rather strong relationship between the latent variables "Project implementation" and "Project effectiveness", but a very weak relationship between the latent variables "Project definition" and "Project effectiveness". Since there exists a moderate relationship between "Project definition" and "Project implementation", it can be concluded that the "Project definition" works through to the project effects via the implementation of the training project.

The same results are found for the projects that were aimed at reaching changed work behaviour. These results are depicted in Figure 2. Although the coefficients are slightly different, the total picture is the same: the model fits, and there is a relatively strong relationship between the latent variables "Project implementation" and "Project effectiveness", and a weak relationship between "Project definition" and "Project effectiveness".

The results in Figure 3 refer to training projects that were aimed at realising organisational change. The results indicate that this model does not fit. That means that the coefficients in Figure 3 do not have substantial meaning. The fact that this model does not fit is probably caused by the kind of results at which these training projects are aimed. Organisational change in most cases is not noticeable until the long term. Furthermore, training organisations have less influence on achieving changes in their client organisations than on reaching learning results. Projects that are aimed at achieving changed work behaviour take the middle position in this respect. Obviously, the client organisation itself is responsible for the transfer of training results into job performance. But because of performance improvement strategies used by many client organisations, and the emphasis on results-oriented training and transfer (Broad and Newstrom, 1992; Swanson and Holton, 1999; Phillips, 1994, 1997), training organisations are asked to design training projects in such a way that they facilitate transfer of learning results.

For training organisations it is, furthermore, interesting to scrutinise the relationships between items and partial items, to check which variables explain most of the variance in the training effectiveness. These are the factors that are of importance for managers of training organisations. These factors are the most relevant for quality management within the training organisation.

The total amount of variance in the effectiveness of training programs that is explained by the complete model is reasonable, but limited. From an integrated conceptual perspective, this is not remarkable, since there are many other factors that influence training effectiveness that are not included in the evaluation system. Other factors are personal factors, training program factors, organisational factors, and transfer conditions (Baldwin and Ford, 1988). The approach that has been used during the development of the training program (Kessels, 1993) is also important, but not included in the evaluation system. Including these other factors would raise the explanatory power of the model, but decrease the practical applicability of the system, as it would lead to a more complex evaluation instrument, certainly when the instrument would be adapted to contextual variations in training purposes and content.
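One way to scrutinise which items and partial items explain most of the variance in effectiveness, as suggested above, is an ordinary regression of an effectiveness indicator on the item scores. The sketch below is an assumed illustration of that idea, with invented column and file names; it is not the analysis performed in the study.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical item scores per evaluated project (names invented).
df = pd.read_csv("work_behaviour_projects.csv")

predictors = [
    "objective_operationalisation", "responsibility_distribution", "condition_registration",
    "total_satisfaction", "condition_realisation_consistency", "condition_realisation_satisfaction",
]
X = sm.add_constant(df[predictors])
y = df["objective_realisation"]          # one of the effectiveness items

ols = sm.OLS(y, X).fit()
print(ols.rsquared)                      # share of variance explained by the items together
print(ols.params.sort_values())          # which items carry the most weight
```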
Figure 1. LISREL model for projects aimed at achieving learning results

Figure 2. LISREL model for projects aimed at achieving change in job performance

Figure 3. LISREL model for projects aimed at achieving organisational change

5. Discussion

To conclude, a number of discussion issues will be raised. These issues relate to the present evaluation system, and the present state of affairs within the research program.
First, it can be questioned in which way training organisations can make better use of the research results. Although the evaluation system is meant as an instrument for quality management at association level, there is a strong need within the individual training organisations to use the research data at organisational level. That is possible to a certain level, since the training organisations receive research reports in which their own results are contrasted with the average results of the association. The average of the association serves as a reference criterion for the quality policy of the individual training organisations. The most direct feedback, however, can be obtained by providing the training organisations with the primary data for the evaluated training projects. That would imply, however, that the anonymous character of the present evaluation procedure has to be broken. That necessitates at least that respondents are asked to allow that the data are passed on to the training organisations. To anticipate this change, a question in this sense was already inserted in the second, slightly changed, questionnaire. The result was that only 5 per cent of the respondents would refuse this. But this group cannot be neglected, and from an integrity perspective it would be unethical to still send the data of these respondents to the training organisations. In any case, the respondents who reject the proposition to pass their data on to the training organisations have to be respected. If, based on the results of the respective question, it is decided to pass the data set on to the training organisations, it should be noted that respondents may start to evaluate the training project differently in the future. The implications of the changed perspective for the respondent should be reviewed before deciding definitely to make the data public to the training organisations.

Second, the question is whether a standard for a required level of quality should be introduced within the association. Up to now, only the magnitude of the total satisfaction, the attainment of the intended objectives, and the client satisfaction about various aspects of the training programs are observed. No standards are set for the required level of quality. It is conceivable to develop a score profile which consists of the values on the most crucial variables. For this profile standards can be determined. Examples of variables which are relevant in this respect are the total satisfaction indicator, expectation realisation, objective realisation, success attribution, condition-realisation consistency, condition-realisation satisfaction, and satisfaction about the contacts with employees of the training organisation. In view of the present research results an average of 7.0 (on a ten-point scale, with ten being the maximum degree of quality) on the profile seems to be a reasonable standard for the minimal degree of required quality. The values on the five-point scales need to be multiplied by a factor of two to create comparable scales, when this standard is agreed. Furthermore, it could be agreed that a training organisation should not have a 5.0 or less on any variable in the profile, and, for instance, two sixes only if they are compensated by higher scores, so that the total average remains 7.0 or higher. At the level of association management, meetings should be held with those organisations that do not meet the standard, to analyse possible causes of the low ratings. Such meetings could eventually lead to further agreements with the training organisation about measures that could lead to better performance during the next evaluation.
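A small sketch of how such a profile standard could be checked is given below, assuming the rule sketched above: five-point variables are rescaled by a factor of two, the profile average must be at least 7.0, and no single variable may be at or below 5.0. The variable names and the exact decision rule are illustrative assumptions, not an agreed VETRON standard.

```python
from typing import Dict

TEN_POINT = {"total_satisfaction_indicator"}          # already on a 1-10 scale
FIVE_POINT = {                                        # rescaled by a factor of two
    "expectation_realisation", "objective_realisation", "success_attribution",
    "condition_realisation_consistency", "condition_realisation_satisfaction",
    "contact_satisfaction",
}

def meets_profile_standard(profile: Dict[str, float]) -> bool:
    """Illustrative check of the proposed quality standard."""
    rescaled = {name: value * 2 if name in FIVE_POINT else value
                for name, value in profile.items()}
    average_ok = sum(rescaled.values()) / len(rescaled) >= 7.0
    no_low_scores = all(v > 5.0 for v in rescaled.values())   # no variable at 5.0 or less
    return average_ok and no_low_scores

# Example: one organisation's averaged scores on the profile variables.
example = {
    "total_satisfaction_indicator": 7.6,
    "expectation_realisation": 3.8,
    "objective_realisation": 3.6,
    "success_attribution": 3.4,
    "condition_realisation_consistency": 4.0,
    "condition_realisation_satisfaction": 3.9,
    "contact_satisfaction": 4.1,
}
print(meets_profile_standard(example))   # True: rescaled average 7.6, lowest rescaled value 6.8
```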
Furthermore, there are various research-technical issues that need to be reflected on in optimising the evaluation system. For instance, it may be necessary to make a distinction between large and small training organisations. As has been noted already, the number of clients of training organisations varies significantly. Some of the large training organisations have objections against the fact that their evaluation results are taken together with those of the small organisations. Consultants of small organisations have much more intensive and personal relationships with their clients, and they expect that these clients evaluate these consultants much more positively. The personal contact would have a strong influence on the evaluation of the quality of the training project. It is not certain, however, whether this is an undesired rating effect. If clients who evaluate training projects of small training organisations are more positive than those who evaluate projects of large organisations, the question can also be whether the quality of the small training organisations is simply better than that of the large organisations (according to the perception of the client, of course). A conclusive answer to this question cannot be given at this stage of the research program, but it can be found when data of clients who are involved with both large and small training organisations are compared. Regardless of whether the relationships between large and small training organisations and their clients influence the rating of the training programs, it may be interesting to compare the results for different categories of training organisation. Size of organisation is one of the criteria on which they can be compared. But their program portfolio could also be an interesting criterion.

Another research-technical issue relates to the question as to whether the research unit is covered well enough by the client list. As has been stated before, the client list is composed of projects for which the training organisation has invoiced the client organisation in a specified half year. But it happens that training organisations sometimes conduct big training projects that take longer than half a year, and about which clients receive invoices more than once. An invoice is then sent about a part of the training project. The intention, however, is to evaluate the complete training project at once. The question then is whether the financial administration is the best starting point for composing the client lists. Possibly, the present client lists contain a number of training activities that are part of a larger training project. The project administration would perhaps be a better starting point. The disadvantage of using project administrations, however, is that these vary considerably by training organisation, so that the verification of the completeness of the client lists is impeded, if not made impossible. That is different for the financial administration. This needs to comply with legal requirements, which make it possible to verify the completeness of the client lists conclusively. It will be necessary to analyse to what degree the present client lists contain training activities that actually are part of larger projects. This analysis will reveal the level to which the evaluations that have been carried out already contain training activities that should have been evaluated at a higher aggregation level: the complete training project level.

In the future, the training organisations can be asked to indicate whether the projects that are invoiced, and included in the client lists, are complete projects, or whether they are in fact part of a larger project that should be evaluated as a whole. For long-term projects, the evaluation procedure should perhaps be different, and training organisations could be asked to indicate such projects in the client lists.

A next research-technical issue concerns the evaluation term. The question is whether the evaluation system sufficiently accommodates the variation in the training objectives, and the related effects of the training projects. Training projects that are aimed at reaching learning results can, in principle, result in effects in the short term, but the contrary is the case for projects that are aimed at organisational change. Those projects, in most cases, will result in effects in the long term. Nevertheless, all training projects are evaluated over a fixed period. The rating of the training projects by the respondents takes place about five months after the completion of the half year that is being evaluated. The ratings are about the projects for which, during that half year, clients were invoiced.
That means that between the moment of invoicing and the rating minimally five, and maximally 11, months elapsed. More important, however, is the course of time between the actual training project and the rating. Yet, since the training organisations employ different terms of settlement, little can be said about that. It seems worthwhile, though, to have a closer look at the evaluation term, and to analyse whether it sufficiently takes into account the variation in the categories of training projects, the moment at which they are rated, and the term in which effects can be expected.

Another issue relates to the judgement of the client. On what are the judgements based? At this point of the research program, there is no information about this. The first additional study will be aimed at this question. First of all, it needs to be analysed whether an evaluation of the training project by client organisations took place at all, and if so, at what level. Second, some criterion-related training project evaluations will be conducted. These evaluations will be used to test the concurrent validity between the standardised training evaluation and the criterion-related training evaluation.

Third, it should be analysed whether different categories of organisations, with different categories of clients, need to be distinguished. From a professional perspective, certain organisations may have more and others may have less critical clients. If the level of expectation is not constant, a fixed reference point is missing to judge the quality of training projects. And then the judgement about the training project is no solid indicator of the real quality of the project. A training project of average quality, for instance, will be rated as insufficient by critical clients, and as good by less critical clients. It is necessary to determine whether this problem really exists, and if so, what consequences this has for the evaluation system.

Finally, new advancements in human resource development should be included in evaluating the effectiveness of training programs. Training is an expensive intervention that is being integrated in performance improvement strategies. Since Nadler's (1980) definition of HRD many changes have taken place. McLagan (1983; 1989) used the concept for developing competency profiles for HRD specialists. Pace et al. (1991, pp. 6-7) conceptualised HRD as a set of integrated roles. DeSimone and Harris (1994) saw HRD as a set of activities. These authors defined HRD in terms of learning experiences, professional fields, roles and activities. Rothwell and Kazanas (1994, p. 2) applied insights from strategic planning within HRD and focused on strategic HRD. Gilley and Maycunich (1998) went even a step further and stressed the importance of integrated strategic HRD. Walton (1999) moved the field from a strict performance orientation (Swanson, 1994; Rummler and Brache, 1995) to a transferable skills orientation. One step further is to perceive HRD as a field that contributes to competence development (Mulder, 2000). This perspective builds on a line of research that started in the 1980s, in which basic skills were seen as performance requirements (Nijhof and Mulder, 1986, 1989). Training is still an important part of HRD but, as the developments in the field of HRD reflect, it is just one part, next to other ways in which transferable skills or competencies can be developed. Therefore, the training evaluation system described in this contribution needs to be adapted to these new developments. The basic structure, however, remains applicable.

References

Baldwin, T.T. and Ford, J.K. (1988), "Transfer of training: a review and directions for future research", Personnel Psychology, Vol. 41, pp. 63-105.
Basarab, D.J. and Root, D.K. (1992), The Training Evaluation Process: A Practical Approach to Evaluating Corporate Training Programs, Kluwer Academic Publishers, Boston, MA.
Bramley, P. (1991), Evaluating Training Effectiveness: Translating Theory into Practice, McGraw-Hill, London.
Brinkerhoff, R.O. (1987), Achieving Results from Training: How to Evaluate Human Resource Development to Strengthen Programs and Increase Impact, Jossey-Bass, San Francisco, CA.
Broad, M.L. and Newstrom, J.W. (1992), Transfer of Training: Action-Packed Strategies to Ensure High Payoff from Training Investments, Addison-Wesley, Reading, MA.
Camp, R.R., Blanchard, P.N. and Huszczo, G.E. (1986), Toward a More Organizationally Effective Training Strategy and Practice, Prentice-Hall, Englewood Cliffs, NJ.
DeSimone, R.L. and Harris, D.M. (1994), Human Resource Development, 2nd ed., The Dryden Press, Fort Worth, TX.
Gaines-Robinson, D.G. and Robinson, J.C. (1989), Training for Impact: How to Link Training to Business Needs and Measure the Results, Jossey-Bass, San Francisco, CA.
Gilley, J.W. and Eggland, S.A. (1989), Principles of Human Resource Development, Addison-Wesley, Reading, MA.
Gilley, J.W. and Maycunich, A. (1998), Strategically Integrated HRD: Partnering to Maximize Organizational Performance, Addison-Wesley, Reading, MA.
Hamblin, A.C. (1974), Evaluation and Control of Training, McGraw-Hill, New York, NY.
Kessels, J.W.M. (1993), Towards Design Standards for Curriculum Consistency in Corporate Education, Faculty of Educational Science and Technology, University of Twente, Enschede.
Kirkpatrick, D.L. (1994), Evaluating Training Programs: The Four Levels, Berrett-Koehler, San Francisco, CA.
McLagan, P.A. (1983), Models for Excellence: The Conclusions and Recommendations of the ASTD Training and Development Competency Study, American Society for Training and Development, Washington, DC.
McLagan, P.A. (1989), Models for HRD Practice: The Models, American Society for Training and Development, Alexandria, VA.
May, L.S., Moore, C.A. and Zammit, S.J. (Eds) (1987), Evaluating Business and Industry Training, Kluwer Academic Publishers, Boston, MA.
Mulder, M. (1996), "Customer satisfaction and training program quality", in Holton III, E.F. (Ed.), Academy of Human Resource Development 1996 Conference Proceedings, AHRD, Austin, TX, pp. 688-95.
Mulder, M. (2000), Competentieontwikkeling in bedrijf en onderwijs [Competence development in business and education] (inaugural speech), Wageningen University and Research Centre, Wageningen.
Mulder, M. and van Ginkel, K. (1995), "Quality evaluation of training programs", ECER 95, European Conference on Educational Research, University of Bath, Bath, p. 79.
Nadler, L. (1980), Corporate Human Resources Development: A Management Tool, ASTD/Van Nostrand Reinhold Company, Madison, WI/New York, NY.
Nijhof, W.J. and Mulder, M. (Eds) (1986), Basisvaardigheden in het beroepsonderwijs [Basic skills in vocational education], Stichting voor Onderzoek van het Onderwijs, 's-Gravenhage.
Nijhof, W.J. and Mulder, M. (1989), "Performance requirements analysis and determination", in Bainbridge, L. and Ruiz Quintanilla, S.A. (Eds), Developing Skills with Information Technology, John Wiley and Sons, Chichester, pp. 131-52.
Pace, R.W., Smith, P.C. and Mills, G.E. (1991), Human Resource Development: The Field, Prentice-Hall, Englewood Cliffs, NJ.
Phillips, J.J. (1991), Handbook of Training Evaluation and Measurement Methods, 2nd ed., Gulf Publishing Company, Houston, TX.
Phillips, J.J. (1994), Measuring Return on Investment: 18 Case Studies from the Real World of Training, Volume I, American Society for Training and Development, Alexandria, VA.
Phillips, J.J. (1997), Measuring Return on Investment: 17 Case Studies from the Real World of Learning, Volume II, American Society for Training and Development, Alexandria, VA.
Rothwell, W.J. and Kazanas, H.C. (1994), Human Resource Development: A Strategic Approach, HRD Press, Amherst, MA.
Rummler, G.A. and Brache, A.P. (1995), Improving Performance: How to Manage the White Space on the Organization Chart, 2nd ed., Jossey-Bass Publishers, San Francisco, CA.
Swanson, R.A. (1994), Analysis for Improving Performance: Tools for Diagnosing Organisations and Documenting Workplace Expertise, Berrett-Koehler, San Francisco, CA.
Swanson, R.A. and Holton III, E.F. (1999), Results: How to Assess Performance, Learning, and Perceptions in Organisations, Berrett-Koehler, San Francisco, CA.
Walton, J. (1999), Strategic Human Resource Development, Pearson Education Limited, Harlow.
