
HANOI UNIVERSITY

FACULTY OF MANAGEMENT AND TOURISM


----------o0o----------

CAPSTONE PROJECT

on

TRAINING DEPARTMENT OF HAWEE GROUP

Measurement Process of English program for the focused group in Hawee

Major: Business Administration – BA17

Supervisor: Tutor Nguyen Thi Minh Hieu

Student group: Dao Minh Anh - 1704000003

Nguyen Khanh Huyen – 1704000055

Bui Thi Dieu Linh - 1704000066

Pham Ha Phuong - 1704000095

Tran Dang Cao Sang - 1704000098

Hanoi, 07th June, 2021


Table of Contents
Abstract .......................................................................................................................................... 4
1. Introduction ............................................................................................................................... 5
1.1. Background information on the Company ........................................................................... 5
1.2. Target community ................................................................................................................ 5
1.3. Rationale for project ............................................................................................................. 7
1.4. Problem statement ................................................................................................................ 8
2. Literature Review ..................................................................................................................... 9
2.1. Trainee reaction .................................................................................................................. 12
2.2. Learning outcomes ............................................................................................................. 12
2.3. Transfer of training............................................................................................................. 14
2.4. Performance Constructs ..................................................................................................... 16
2.5. Organizational Impact ........................................................................................................ 18
3. Method of inquiry ................................................................................................................... 21
4. Result of the project ................................................................................................................ 23
4.1. Inexplicit and inadequate course information provided ..................................................... 24
4.2. The change in the Training format - Confused demand..................................................... 25
Accountability ....................................................................................................................... 26
Enhance Credibility ............................................................................................................... 26
Feedback Transparency ......................................................................................................... 27
Boosts Training Effectiveness ............................................................................................... 27
4.3. Unpremeditated in controlling courses ............................................................................. 28
5. Conclusion and Action plan ................................................................................................... 29
5.1. Conclusion.......................................................................................................................... 29
5.2. Action plan ......................................................................................................................... 30
Stage 1: Plan Assessment and Evaluation methods............................................................... 30
Stage 2: Assessment Organizing ........................................................................................... 31
Stage 3: Courses evaluation Handing out .............................................................................. 31
Stage 4: Data Collection ........................................................................................................ 32
Stage 5: Analyzing and Reporting ......................................................................................... 33
Stage 6: Capacity Record and Data Storing .......................................................................... 34

6. Deliverable as appendix .......................................................................................................... 35
APPENDIX A: COURSE EVALUATION PROCESS ........................................................... 35
APPENDIX B: POST-COURSE EVALUATION FORM (built by 5 points evaluation
model) ........................................................................................................................................... 36
1. Post- course evaluation form for student: .......................................................................... 36
2. Post-course Evaluation form for Student after 6 months: .................................................. 41
3. Post-course Evaluation form for teachers: ......................................................................... 46
APPENDIX C: INTERVIEW TRANSCRIPTS ....................................................................... 51
1. Interview with Current student of English center .............................................................. 51
2. Interview with English Teacher ......................................................................................... 53
3. Interview with Training Staff............................................................................................. 55
APPENDIX D: CAPSTONE DELIVERABLE EVALUATION ............................................ 57
REFERENCES ............................................................................................................................ 60

Abstract

In this report, an evaluation procedure was developed to improve the quality of the English course designed for the Focused group in Hawee Group. This procedure, named the English Post-Course Evaluation Procedure and built around Kirkpatrick's Training Evaluation model (1959), was based on the systems approach. The deliverables were completed after gathering information through interviews with the outsourced teacher, the head of the Procurement Department, and a junior training staff member in Hawee Group; they consist of three separate evaluation forms for different kinds of respondents, in addition to the standard procedure. The evaluation makes use of a multiple-choice questionnaire and open-ended questions, accompanied by a formal English proficiency test, in order to provide an overall view of the course. The questionnaire was tested on different aspects and shown to be a suitable tool. The methods of evaluation in this report can also be adapted for other courses in Hawee. The process of implementing the model is presented, and an extensive review could be conducted afterwards for further improvement.

1. Introduction

1.1. Background information on the Company

Hawee Group was established in 2004 as a Vietnamese company operating in the construction industry. The company's vision is to become an international group focusing on general contracting and on manufacturing materials and equipment for the construction industry. Hawee Group includes four members: Hawee ME (Mechanical and Electrical Joint Stock Company), Hawee EN (Energy), Hawee PT (Production and Trading Joint Stock Company), and Hawee Parkland. Hawee has continuously developed all of its resources, including human, technical, and quality resources, and places a high priority on applying information technology to manage the whole organization. Hawee has affirmed its position and growth through its large projects and constructions and through its cooperation with many prestigious investors. Its core values are Resilience, Speed, and Renovation.

Hawee Group currently has more than 2,000 employees nationwide, with about 1,500 workers and engineers and about 500 office staff. From 2015 to 2018, revenue increased continuously from 1,300 billion VND to more than 4,000 billion VND. In terms of credit limit, Hawee received up to 5,600 billion VND from Vietnamese banks. At that time, Hawee proudly carried out 31 hotel and resort projects simultaneously, including The Melia Ho Tram, Flamingo Dai Lai Resort, Kem Beach Phu Quoc Resort, and Vung Tau Hotel.

With the vision of being a regionally leading group and the motto of "Truly your Partner", Hawee has invested in research, application, and development with the aim of creating social value. Until now, Hawee has proved itself a professional and reliable partner to many investors.

1.2. Target community

To realize the vision of becoming a prestigious corporation of international quality in the field of M&E and construction, the first step taken by Hawee Group was to develop specific English programs. There are three main training target groups: the Board of Managers/Directors, the Middle Management and planned employees, and the Focused class. The learning materials are designed to suit each specific group as well as the job requirements in Hawee. In particular, the English programs for the Focused groups are based on the needs of each department whenever it requests English training to improve its staff's work. However, this is an internal company project intended to affirm the growth and vision of Hawee in the business environment. Accordingly, through our observations, our team identified an issue related to the measurement process for the Focused class when running the project.

The sole target community that this report focuses on is the English program for the Focused groups. To be more specific, the Focused groups are the departments of Hawee that are scheduled to learn English. Three departments are currently implementing the English programs in Hawee: the Purchase Department, IDC Tender, and Engineering & Technology.

1.3. Rationale for project

Table 1.1. English training process

The general procedure of training English for a department includes ten steps. The initial step is that the Training Center receives the demand for learning English from other departments or the Board of Management. After an initial test to evaluate each individual's English capacity, the Training Department plans an English course suitable for the demands of the department. The next step is to approve the training plan, which is the Board of Managers' task. If the plan is approved, the training course is implemented for the requesting department. After several actual training periods, the Training Center observes the class to evaluate the learning process. The evaluation then becomes the foundation for the subsequent reports and records. All the stages have formal procedures and formats except for the Evaluation.

Hawee Group is applying the PDSA model (Plan, Do, Study, Act) in the English learning program. In this model, the Study stage, or Evaluation, plays an important role in improving the project's final results. However, based on our observations while working as interns in the Training Department, we found that the evaluation stage does not have a clarified procedure. Evaluation of training effectiveness is pivotal to improving the overall organizing process of the Hawee Training Center; however, this step follows only a general process instead of a detailed and specific one.

1.4. Problem statement

This proposal discusses one of the issues in evaluating the English course for the Focused groups in Hawee corporation. The outstanding problem is that there is no process to assess the trainees' capacity and, especially, to evaluate the multiple dimensions of the English learning program in the Focused class.

Specifically, it is difficult to evaluate based on timekeeping (the time or number of training sessions recorded) because this does not make it possible to evaluate the capabilities of every individual accurately. For the individual classes designed for the Board of Directors as well as the middle managers and planned employees, the measurement process can be conducted through monthly class observations. However, for the Focused class, observation does not work well because of the large number of learners. Hence, it is difficult to evaluate the final results individually.

Regarding assessment, each department has a department head who comprehensively knows the expertise and job requirements in general and the English requirements for the department's personnel in particular. It is an advantage for the English Center to have such a readily available source of reference and training-content consultancy. Therefore, the development of assessment content with such experts should also be standardized into a process to achieve more effective coordination.

The evaluation form is also a very important source of data for the Training Center. Based on it, the English Center can find out the satisfaction of the parties with the course organization, the training content, and the application of knowledge in the trainees' jobs, gathered from the opinions of the target subjects involved in the course. It is also a source of ideas for building better training courses later.

For that reason, it is necessary to have a process for more effective assessment and evaluation.

2. Literature Review

There have been at least four major movements that have affected jobs over the last 30 to 40 years (Ilgen and Pulakos 1999). First, there has been a rush toward more highly technical and sophisticated systems in the workplace. For example, advanced manufacturing techniques now permit the tailoring of products to the needs of the customer (Wall and Jackson 1995). This shift has placed a great emphasis on increased knowledge and skill requirements for most jobs. Second, there has been a shift from manufacturing to service jobs (Howard 1995). These jobs are characterized by an increase in the importance of people skills, as in working with customers and clients rather than interacting primarily with co-workers and things (Goldstein and Ford 2002). Third, organizations have become more lean with the dual objectives of cutting costs while at the same time improving productivity in order to meet competitive pressures. The move to lean has resulted in broader responsibilities for workers, an emphasis on teamwork, and an enhanced role for effective leadership (Cutcher-Gershenfeld and Ford 2005). Finally, there is clearly a movement to a more global market and society where jobs and organizations need to be increasingly fluid. The globalization of industry has led to the increased importance of project teams that may span different organizations and workers from different countries to produce a single final product (Friedman 2006).

Our modern world has undergone massive changes that have helped shape the workforce over the past 40 years (Ilgen and Pulakos 1999; Thayer 1997). The first drastic change was a shift toward greater requirements for knowledge and skills in every position, resulting from the transition to more technically advanced systems (Wall and Jackson 1995). The second wave of drastic change came with the trend of pursuing dual objectives: reducing costs and increasing efficiency. As a result, workers were required to take on broader responsibilities, with an emphasis on teamwork and an enhanced role for effective leadership (Cutcher-Gershenfeld and Ford 2005). At last, the globalization of industry has increased the importance of partnership and collaboration, bringing together the most efficient manufacturers around the world in a collective effort to produce a single product (Friedman 2005).

Besides the soft skills and technical skills that need to be developed to follow the prevailing trends of globalization, English takes on a crucial role as a communication medium and is one of the most widely learned languages in the world. According to research, the English language has had a significant impact on the economic sector's growth and progress (Johnson, 2009). For example, it provides people with fundamental skills that they need to adapt to the new technological era. Thus, learners of English as a foreign language are encouraged to learn English in order to advance their abilities in the fields of science and technology, IT, engineering, medicine, law, business, tourism, and so on. Furthermore, most companies deal with international companies, and English serves the purposes of multinational companies as the tool of communication between one business organization and another. Therefore, the company places great emphasis on providing English courses for its employees in order to cope with this rise in demand. Along with the English course offering, there is also a need for course evaluation to keep things moving in the right direction.

Our project aims to design an evaluation process for the English training course based on five different aspects: trainees' satisfaction with the program, knowledge or abilities gained, use of new skills and behavior on the job (transfer of training), and improvements in individual and organizational performance. These aspects were introduced in Chapter 7, "Training Employees", of Fundamentals of Human Resource Management (2016) by Noe et al. It is suggested that the evaluation should be planned at the time the course, including its objectives and content, is being developed. The purpose of evaluating training is to help with future decisions about the organization's training program. Using the evaluation, the organization may identify a need to modify the training and gain information about the kinds of changes needed.

It is hard to address training criteria without first reviewing Kirkpatrick's (1959) four levels in the history of training evaluation: reaction, learning, behavior, and results. Reaction is the degree to which participants find the training favorable, engaging, and relevant to their jobs. Learning concerns the degree to which participants acquire the intended knowledge, skills (e.g., principles, facts, and techniques), attitude, confidence, and commitment based on their participation in the training. Behavior is the degree to which participants apply what they learned during training when they are back on the job. Results are defined as the targeted outcomes that occur as a result of training, or the organizational impact of the training, measured through indicators such as increased productivity, increased sales, and a reduced employee turnover rate.
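
To illustrate how these four levels could be operationalized in a post-course evaluation record, the minimal Python sketch below groups illustrative items by level. The field names, scales, and example scores are assumptions for demonstration only, not the Hawee Training Center's actual form.

```python
from dataclasses import dataclass, field
from statistics import mean

# Hypothetical record of one trainee's post-course evaluation,
# grouped by Kirkpatrick's four levels (illustrative field names).
@dataclass
class KirkpatrickRecord:
    trainee_id: str
    reaction: dict = field(default_factory=dict)   # e.g. 1-5 Likert items on satisfaction/relevance
    learning: dict = field(default_factory=dict)   # e.g. pre/post proficiency test scores
    behavior: dict = field(default_factory=dict)   # e.g. supervisor ratings of on-the-job English use
    results: dict = field(default_factory=dict)    # e.g. department-level indicators

    def level_summary(self) -> dict:
        """Average the numeric items within each level for quick reporting."""
        return {
            level: round(mean(items.values()), 2) if items else None
            for level, items in {
                "reaction": self.reaction,
                "learning": self.learning,
                "behavior": self.behavior,
                "results": self.results,
            }.items()
        }

# Example usage with made-up scores
record = KirkpatrickRecord(
    trainee_id="ME-Purchase-01",
    reaction={"satisfaction": 4, "relevance": 5},
    learning={"pre_test": 55, "post_test": 72},
    behavior={"supervisor_rating": 4},
)
print(record.level_summary())
```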

Although the model's popularity is explained by a number of advantages, critiques of Kirkpatrick's four levels point out some critical issues related to the feasibility of the model. One concern is that there is not enough persuasive evidence for the consequential relationship between consecutive levels that the model assumes. For instance, the knowledge that has been acquired is not always associated with a positive change in behavior, nor is it necessarily reflected in the measurable effectiveness of the organization. Further research into the other levels of the Kirkpatrick model has led to similar results. The most frequently obtained measurement is still the participants' reaction to the program (Sugrue and Kim 2004). Alliger and Janak (1989) identified only 12 research articles that recorded 26 correlations between different criteria, such as reaction and learning measures, in training programs. They discovered that there were few connections between reaction measurements and the other three levels (learning, behavior, and results). In line with the growing body of research in the area of training, a more recent meta-analysis by Alliger et al. (1997) found 34 studies and 115 correlations across various training criteria. They discovered that even when trainees expressed satisfaction with the trainer, there was almost no correlation with learning outcomes.

The Kirkpatrick model has laid the foundation for the study of training evaluation and research on training criteria. The review of the literature concerning each element of the model proposed by Noe et al. is closely tied to this foundation. While applying the research to our solution, we tried to avoid the limitations of the model and to focus on the holistic perspective that the model suggests.

2.1. Trainee reaction

Traditionally, training professionals have looked at trainee responses to see if they enjoyed the curriculum or the instructor. Positive responses are regarded as creating a conducive environment for studying the curriculum. Recent studies have broadened the definition of trainee perceptions to include a variety of factors. Warr and Bunce (1995) discussed three types of trainee perceptions: course satisfaction, the relevance of the course for the job, and the difficulty of the course. Alliger et al. (1997) also distinguished traditional affective reactions to training from perceptions of utility. Utility perceptions concern how relevant trainees view the training material to be in terms of learning necessary skills and enhancing work performance. Morgan and Casper (2000), using data from over 400 training classes, found support for six key factors in trainee perceptions: the materials used, the testing process, satisfaction with the training, the instructor and the management process, course structure, and the utility of the program for the trainee. Brown (2005) argues that these training-related perceptions can be thought of as hierarchical and identified evidence for three key facets (enjoyment, perceived relevance, and satisfaction with the technology). Each of these facets contributes to overall satisfaction with training.

2.2. Learning outcomes

Preliminary studies prioritized understanding and quantifying learning outcomes such as concepts, knowledge, and skills. For instance, McGehee and Thayer (1961) emphasized that multiple-choice questions and retention tests should be used as ways to quantify the measurement of learning outcomes.

There has been progress in our interpretation of what we understand about learning and of which learning outcomes could be the target of training evaluation. Kraiger, Ford, and Salas (1993) focused on the specification and measurement of learning. They moved toward a conceptually based classification scheme of learning built on a multidimensional perspective, which differed from prior research that envisioned learning outcomes solely as changes in verbal knowledge and behavioral capacities. They grouped learning outcomes into three different categories: (1) cognitive outcomes, which include notions like verbal knowledge, knowledge organization, and cognitive strategies; (2) skill-based outcomes, consisting of compilation and automaticity; and (3) affective outcomes, including attitudinal and motivational outcomes. For each category (e.g., verbal knowledge), they provided a list of relevant learning constructs (e.g., declarative knowledge) and the appropriate focus for measurement (e.g., amount of knowledge, accuracy of recall, speed and accessibility of knowledge). Based on these assessments, they presented a set of suggested training evaluation methods (e.g., recognition and recall tests).

Ford, Kraiger, and Merritt (in press) have updated this learning outcome approach. Ford et al. note that training studies have begun to move well beyond the measurement of declarative knowledge to other types of learning outcomes and evaluation methods. For example, Kozlowski and Bell (2006) have focused on distinguishing between what they call basic (declarative) knowledge and strategic (procedural) knowledge. These differences in knowledge capabilities are then linked to basic and strategic task performance.

In other research, the organization of knowledge has been the focus of attention. Mental model quality has also been found to predict outcomes over and above more traditional measures of knowledge acquisition. For example, Davis, Curtis, and Tschetter (2003) found that the quality of a trainee's structural knowledge (closeness to an expert model) provided an incremental prediction of post-training self-efficacy beyond measures of declarative knowledge.

Anderson et al. (2001) revised the famous Bloom (1956) taxonomy of educational objectives. The original taxonomy focused on cognitive, psychomotor, and affective objectives. It was arranged in ascending order from simple to more complex knowledge, with the assumption that mastery of the simpler category was required for mastery of the next, more complex category. The original taxonomy's framework begins with knowledge (knowledge of specifics, knowledge of ways and means of dealing with specifics, knowledge of principles), followed by comprehension, application, analysis, synthesis, and evaluation. The revised taxonomy has two aspects. One aspect is the structure of knowledge, skill, or affect. The second aspect is the process that underpins learning. The new knowledge structure consists of four rather than three main categories. Three of the categories are the same as the original subcategories of knowledge: factual knowledge, conceptual knowledge, and procedural knowledge. The fourth subcategory is coined "metacognitive knowledge", which the researchers define as knowledge about cognition and an awareness of one's own cognition.

The revised taxonomy by Anderson et al. (2001) and the updated review by Ford et al. (in press) demonstrate that our understanding of "learning" continues to expand. The frameworks of procedural information, knowledge organization, and metacognition are beginning to be discussed in training evaluation contexts, highlighting the need for a deeper understanding of what "learning" entails. As the diversity of skills increases over time, this expanded understanding of learning has resulted in more systematic approaches to evaluating progress in learning as a component of training.

2.3. Transfer of training

Regardless of the amount of learning during training, a key outcome is whether individuals transfer their increased knowledge and skills to the job (Baldwin and Ford 1988). "Transfer of learning" consists of trainees' commitment to use the training, perceived ability to apply it, and opportunity to use the new knowledge and skills back at the workplace. Transfer of training (or the lack of it) is a complex process and depends upon the intent or motivation of the learner (trainee characteristics), the workplace environment including supervisory support (organizational environment and culture), and the instructional design as well as the delivery features (job relevance) of the training program (Subedi, 2004).

In the learning and development literature, "transfer of learning" and "transfer of training" are usually used interchangeably. However, transfer of learning relates to generating knowledge and information through education, which refers to the capacity to generalize and learn by analogy. Active learning is an important criterion for transfer to occur; it requires the learner to be involved in the learning process by making a conscious effort to learn. The psychological processes of logical thinking and reasoning facilitate recognizing and solving problems in new contexts by applying solutions or analogies from previously acquired knowledge and skill (Misko, 1999). This process is also called 'case-based reasoning' in transfer of learning.

Existing definitions and conceptual frameworks illustrated by the literature on transfer of learning or transfer of training do not differ fundamentally. Transfer of learning derives more from a knowledge base and generic competencies, whereas transfer of training is focused on specific competencies (perhaps with some generic extensions) in terms of the explicit or implicit use of that knowledge, those skills, and attitudes in the world of work.

Transfer of training has also been classified in terms of ‘near transfer’ and ‘far transfer’. Near
transfer of skills and knowledge refers to the replication of the previously acquired knowledge and
skills in all identical situations based on Thorndike’s theory of ‘identical elements’. Thorndike
published the results of his studies in 1901 and maintained that “training in one task was not likely
to lead to improvement in the performance of another task unless there was a clear similarity
between them”. This theory of transfer is based on the belief that previous learning facilitates new
learning only to the extent that the new learning task contains elements identical to those in the
previous task (Perkins and Salomon, 1996).

According to Misko (1995), near transfer of training often involves tasks that are procedural in nature. These tasks include steps of operation in sequence, and the sequence of steps is repeated every time the task is performed. This type of procedural training is relatively easy to learn, and the transfer rate is usually high, but the learner is unlikely to adapt such skills and knowledge when confronted with a new environment and changed conditions.

Far transfer of training refers to learning new skills or performing new tasks in situations that differ significantly from the situations of original learning. Training conditions that focus on far transfer require learners to adapt the acquired knowledge and skills as guidelines to perform or learn in changed situations or new environments (Misko, 1995). Thus, far transfer goes beyond the repetitive application of learned behavior and involves cognition and analogy to adapt to new challenges. Transfer of learning from this type of training is difficult, but it is more important than near transfer from the perspective of higher-order learning and retention.

The dilemma is that acquiring a near-transfer skill seems to come at the expense of the far-transfer generalizability of that skill. Indeed, most training in industrial settings focuses more on procedural, near transfer than on declarative, far transfer, though the importance of far transfer is acknowledged by almost all those responsible for training (Perkins and Salomon, Tuijnman, 1996).

Putting together the themes of the definitions and types of transfer, it becomes obvious that individuals tend to change their behavior as a result of their perceptions and subsequently as guided by extrinsic or intrinsic motivation. It also illustrates the fact that the evolution of research on transfer of training draws on theories of motivation, cognition, educational psychology, and learning to learn. The limited studies on transfer of training have focused on conditions, characteristics, the nature of transfer, and related contextual phenomena. Transfer of training can serve as a powerful measure of training effectiveness. However, the process of maximizing transfer of training by means of integrated strategies characterized by those conditions and mechanisms, including the influence of organizational climate and supervisory behaviors, has not received the attention it deserves in the training literature.

2.4. Performance Constructs

Kirkpatrick focused attention on changes in performance and suggested the need for a systematic appraisal (from multiple sources) of on-the-job performance both before and after training. McGehee and Thayer (1961) raised the issue of determining whether or not the training procedures actually resulted in a change in the trainees' behavior. This early work linking behavior change and work performance has been supported by subsequent research on job performance. As highlighted by Motowidlo and Schmit (1999), it is necessary to understand performance as a behavioral construct. They also stress that the performance domain is behaviorally multidimensional: different kinds of behavior can affect one's job performance and hence advance or hinder organizational goals.

Recent work has expanded our understanding of the performance domain. For example, Campbell (1990) defines performance as behavior that is relevant to the organization's goals and that can be measured in terms of the level of proficiency or contribution to goals represented by a particular set of actions. The performance model developed by Campbell and his colleagues includes eight primary factors: job-specific task proficiency, non-job-specific task proficiency, written and oral communication, demonstration of effort, maintenance of personal discipline, facilitation of peer and team performance, supervision/leadership, and management/administration. These factors are intended to be as distinct as possible in terms of the work behaviors that are included in each factor or dimension (Campbell 1999). Campbell makes a clear distinction between performance and the determinants of performance. The learning outcomes discussed above are consistent with Campbell's notion of the direct determinants of job performance (declarative knowledge, procedural knowledge and skill, and motivation).

Motowidlo and Schmit also provide a number of different ways of conceptualizing performance, including (1) persisting with enthusiasm and extra effort as necessary to complete tasks; (2) volunteering to carry out tasks that are not formally part of one's own job; (3) helping and cooperating with others; (4) following organizational rules and procedures; and (5) endorsing, supporting, and defending organizational objectives. They provide a variety of sources that have identified these key dimensions and provide ways of measuring behaviors relevant to these performance dimensions.

Pulakos et al. (2000) developed a taxonomy of adaptive job performance and examined its implications for understanding and training for adaptive behavior. They identified and found support for an eight-dimension taxonomy including (1) handling emergencies or crisis situations; (2) handling work stress; (3) solving problems creatively; (4) dealing with uncertain and unpredictable work situations; (5) learning work tasks, technologies, and procedures; (6) demonstrating interpersonal adaptability; (7) demonstrating cultural adaptability; and (8) demonstrating physically oriented adaptability.

Finally, there has been an explosion of interest in multirater or 360-degree feedback systems (Bracken, Timmreck, and Church 2001). This interest has led to a number of leadership taxonomies that focus on key leader behaviors. For example, Bracken et al. describe the development of a leadership survey in which thirty-three behavioral statements were identified, covering dimensions such as coaching/support, commitment to quality and customer satisfaction, communication, creating a team environment, and providing feedback. Traditionally, multirater systems have been used for leader development, with feedback being the catalyst for motivating behavioral change. The measurement tools can also be used by training researchers and practitioners to measure or assess changes in behavior on the job as a function of leadership training (Jellema, Visscher, and Scheerens 2006).

In 1961 McGehee and Thayer stressed the need to consider using specific criterion measures of
performance that examine changes in the specific behaviors that the training program is targeting
versus measuring summary indicators of changes in overall job performance. Since that time, there
has been much effort devoted to developing conceptually rigorous models of job performance.
These models have led to the identification of job performance dimensions that can help guide
choices made by training evaluation researchers and practitioners about the relevant performance
criteria. In addition, job performance researchers have created reliable and valid measures that can
be applied to measuring specific changes in job performance as a function of training.

2.5. Organizational Impact

As noted by McGehee and Thayer (1961), a careful and critical evaluation should answer the
question of whether the dollars being spent on training are producing the results needed by the
organization. Organizational impact is characterized as the effects on the business and the work
environment that result from the improved performance of trainees. This level of evaluation helps
to relate the training program to organizational objectives. In today’s lean organizations, it is
becoming increasingly important that training specialists provide evidence for the return on
investment for training programs.

Training objectives for organizational training programs can be stated in the form of outcomes such as increased production, reduced accidents, and lower rates of employee turnover and absenteeism. The difficult aspect is determining whether the training itself influenced these results or whether other factors unrelated to training, such as increased pay, seasonal variations, and marketplace changes, were responsible (Wexley and Latham 2002). Nevertheless, the field has a number of more powerful methods, such as return on investment analysis and utility analysis, to assess organizational impact.

Return on investment (ROI) analysis involves estimating the ratio of total training benefits (expressed in dollars) to total costs (also expressed in dollars). As the aim of an ROI analysis is to determine the financial impact of training, it is important to systematically gather information before making any inferences. One must compute the costs of a training program in terms of program development, hours of training, trainer fees, etc. The next step is to measure the benefits of training and then convert those outcomes into dollar terms. This is especially difficult when training focuses on open skills.

Phillips and Phillips (2005) outlined steps to convert outcomes into monetary benefits. The first step is to identify a unit of improvement (number of grievances, number of promotions, etc.). The second step is to determine the average value of each unit (for instance, the average cost of a single grievance). The third step is to determine the change in performance data in terms of the units of focus. Once we have the total change, the fourth step is to multiply it by the value of the unit and calculate the total value of the improvement. Kraiger (2002) shows that an ROI strategy can not only examine short-term gains but also estimate the long-term impact of training, as one can calculate and adjust the ROI ratios for every year after the training.
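
As a minimal sketch of the arithmetic described above, the Python snippet below applies the Phillips-style conversion (units changed multiplied by the average value per unit) and a simple ROI ratio of net benefit to cost. All figures are invented for illustration and are not Hawee data.

```python
def monetary_benefit(units_changed: float, value_per_unit: float) -> float:
    """Phillips-style conversion: total improvement times the average value of one unit."""
    return units_changed * value_per_unit

def roi(total_benefit: float, total_cost: float) -> float:
    """Simple ROI ratio: net benefit divided by cost."""
    return (total_benefit - total_cost) / total_cost

# Illustrative numbers only: 40 fewer grievances, each assumed to be worth $500,
# against an assumed total training cost of $12,000.
benefit = monetary_benefit(units_changed=40, value_per_unit=500.0)
print(f"Benefit: ${benefit:,.0f}, ROI: {roi(benefit, 12_000):.0%}")
```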

ROI analysis of training can be considered part of a larger methodology for assessing organizational impact known as utility analysis. Schneider and Schmitt (1986) pointed out that utility analysis was originally developed in the selection field in order to translate the validity evidence for a test into "dollar criteria" that might be more meaningful for top management. It is a financial analysis of costs versus benefits that provides bottom-line estimates. Various formulae for conducting utility analysis have been recommended (Boudreau 1983, 1988; Schmidt, Hunter, and Pearlman 1982). Boudreau (1991) provides an extensive discussion of utility analysis for human resource decisions such as recruitment, selection, and retention. He also notes that utility analysis is a special case of a multi-attribute framework that can be applied to any functional area of personnel management.

Cascio (1982) describes the various procedures and issues related to conducting utility analysis in a training context. He also makes a distinction between cost-benefit analysis and cost-effectiveness analysis. Cost-benefit analysis is the comparison between training costs and non-monetary behavioral outcomes such as employee behavior, health, and safety. Cost-effectiveness analysis is the comparison between training costs and benefits in monetary units, such as the cost of production waste and increases in sales. Cascio (1989) also developed a step-by-step procedure for applying utility analysis to evaluating training outcomes. In the first step, he recommends using a capital budgeting methodology to estimate the minimum annual benefit in dollars required from any training program. The next step is to use break-even analysis (introduced by Boudreau 1984, 1991) to estimate the minimum effect size that needs to be produced by a program to obtain the desired benefits. Finally, he recommends using data from multiple studies to determine the actual cost-benefit ratio for a training program.
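
The break-even idea can be sketched with the commonly cited utility formulation from the selection literature referenced above (utility = duration x number trained x effect size x SDy, minus cost). The sketch below is an illustrative reading of that formulation, not Cascio's exact procedure, and the parameter values are invented.

```python
def utility_gain(years: float, n_trained: int, effect_size: float,
                 sd_y: float, total_cost: float) -> float:
    """Dollar utility of training: duration x headcount x effect size x SDy, minus cost."""
    return years * n_trained * effect_size * sd_y - total_cost

def break_even_effect_size(years: float, n_trained: int, sd_y: float,
                           total_cost: float) -> float:
    """Smallest effect size at which the program's benefits just cover its cost."""
    return total_cost / (years * n_trained * sd_y)

# Invented parameters: 30 trainees, effects assumed to last 2 years,
# SDy (dollar value of one SD of performance) of $4,000, program cost of $50,000.
print(break_even_effect_size(years=2, n_trained=30, sd_y=4_000, total_cost=50_000))   # ~0.21
print(utility_gain(years=2, n_trained=30, effect_size=0.4, sd_y=4_000, total_cost=50_000))  # 46000.0
```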

Utility analysis is always deficient in some way, as not all factors can be built into the formula for estimating costs and benefits. As noted by Wexley and Latham (2002), there is a tendency to overemphasize quantitative outcomes and under-represent less tangible outcomes that are not easily quantifiable. There is also the issue of how cost-related results can be influenced by factors outside the control of the training function. In addition, certain training programs, such as outdoor management training (open skills), are not designed to necessarily produce tangible bottom-line outcomes.

Despite these difficult issues, there are now a number of examples of innovative approaches to
examining organizational impact that minimize these disadvantages and thus provide hard
evidence for the usefulness of a variety of organizational impact analyses. These studies balance
the difficult task of developing expanded and detailed utility analyses with the need to minimize
the complexity that makes it difficult to communicate the models and their findings to
organizational decision makers.

Mathieu and Leonard (1987) provided one of the first empirical studies that showed how utility analysis can be applied to training, in this case a supervisory training program. Their utility formula to calculate dollar value expanded upon previous selection utility formulae (Schmidt, Hunter, and Pearlman 1982) by taking into account the potential for diminished effects of training on performance over time and the impact of turnover. Their formula included the number of years over which the utility estimates were calculated, the marginal utility gained in a particular year, the total number of groups trained, the number of trainees per group adjusted for turnover, the standard deviation of job performance in dollar units, the effect size estimate for the training group, and the cost of training per year of training. They employed a quasi-experimental design (with a matched, post hoc control group) to examine training effects on job performance (performance appraisal ratings). They also conducted a break-even analysis and an employee flow analysis. The results indicated a substantial dollar benefit to the organization from training the supervisors. Despite the cost of training (over $50,000), the net utility of training sixty-five supervisors was found to be over $34,000 for the first year, with increased benefits for a number of years beyond the initial training.

Morrow, Jarrett, and Rupinski (1997) describe a four-year study that investigated the effect and utility of managerial and sales training. The study expanded our understanding of the organizational impact of training by focusing on multiple training programs within one organization. They used a quasi-experimental design, incorporated meta-analytic information to summarize the impact of different types of training, and used a particular type of utility analysis (Raju, Burke, and Normand 1990) to estimate the economic impact of the training. They found great variation in the effectiveness of eighteen training programs within an organization. They also found that a more specific sales training program had a greater impact on job performance and a greater return on investment than the broader managerial training program.

Another area that has received attention and also stems from the cost-saving perspective is the idea of retrospective pretests. Hill and Betz (2005) emphasize how both retrospective pretests and traditional pretests have certain biases and that there are tradeoffs in using one over the other. They stress that traditional pretests are better at examining program effects and that retrospective pretests are better at evaluating subjective experiences of program-related changes.

3. Method of inquiry
To carry out the capstone project, we used primary research: we worked directly as interns at the Hawee Training Center to observe and collect information as input for the project.

The method of inquiry we used to conduct the project is the hypothetico-deductive method. At first, we relied on observations to identify and speculate on the problem in general terms. After several consultation sessions with our teachers, we realized that our assumptions about the assessment and post-course evaluation problems the English Center is facing were based on our personal opinions only and needed more objective, multi-dimensional input from the different people participating in the English course: trainees, trainers, and training staff. As a result, we decided to do more research and design a reference experiment to gather more evidence that our assumption is correct. We then designed a list of open-ended interview questions for these three different target subjects, allowing them to answer freely in their own way while also prompting them on some points so that we could fully obtain the data we wanted to collect. These subjects were asked whether or not they were facing any problems when participating in the English post-course assessment and evaluation process, in order to test the hypothesis that our group initially proposed.

The information was gathered mainly by the two team members working as interns at the Training Center. We first made contact and then divided tasks to conduct direct interviews, take notes, and make recordings, combined with observation of the post-training assessment and evaluation processes the English Center is currently implementing. Documents and records of existing data in the accessible data warehouse of the Training Center were also used for our research and analysis.

The main information collected is:

Quantitative: students' test results from previous years.

Qualitative: the exam organization plan, the English capability test, post-course evaluation forms for students from previous years, and personal opinions on the current general English course organization process from trainees, trainers, and training staff.

Table 2.1. Method of inquiry

To analyze the collected data, we used two of the following four analysis methods:

Table 2.2. Four types of Analytics

Descriptive analysis: the two members of the capstone project team working as interns in the Training Center summarized the key information above with their teammates in order to analyze and thoroughly understand the ongoing post-course assessment and evaluation process, and to identify the good points that need to be promoted along with the gaps, through the multi-dimensional feedback obtained.

Finally, our team used prescriptive analysis to draw the conclusion that there is a need for an effective, detailed post-course evaluation process to standardize the process of the English Center, facilitate the organization of courses, obtain cooperation from stakeholders, and provide data to improve future courses. Our initial assumption has also been verified to be true.

4. Result of the project

As mentioned in the proposal report, we can now confirm that Hawee does not have any process to evaluate the effectiveness of learning English in the Focused class. Interview sessions with three different groups of people, namely the English teachers, the training staff, and the current students of the English Center, helped us to clarify the issue raised in the proposal report. From each distinct group of interviewees, we can draw different problems that the English Center needs to improve in its after-training evaluation. In this rationale report, this part clarifies the issue in depth based on the evidence gathered through the interviews.

4.1. Inexplicit and inadequate course information provided

When we interviewed students who are currently taking an English course provided by the Training Center, Mr. Nghia claimed that he had no idea about the plan or the necessary information about the course he was invited to join. He said, "As a student, I currently have not been informed about the course framework, the initial goals, or the standard English language proficiency framework set by the English Center for Hawee personnel at different job positions."

The current English student we interviewed belongs to the 2021 ME Purchase class, one of the Focused classes, with a total training duration of 30 sessions (about 45 training hours). From his experience, he does not expect or demand too much from learning English at the company; his main purpose of studying is to maintain his English ability. He has not set the goal of increasing his English level; if he had, he thinks he would study outside the company. From this, we could say that the English training course at Hawee is not attractive enough for the student to put more effort into learning academic English. It also shows that the lack of after-training evaluation leads to inefficient English learning. The issue arising during English courses is that students who do not know what the framework is or how the course will run will have a hard time trying to achieve the goal-setting criteria.

The English course provided by Hawee has its own advantages, such as designing assessments depending on the relevant expertise of each department. Additionally, each class has its own topics related to the trainees' jobs. In order to ensure the relevance between the knowledge learned and the trainees' professional work, the English Center is aiming to develop the course content and exam questions together with a qualified person such as the head of the department. The course is made specifically for the department, with knowledge relevant to the job. For example, the 2021 ME Purchase class is for the Purchasing Department; hence, the Training Center did its best to provide related topics in English, such as a Business English course, negotiation, etc.

When learning English outside the company, students pay fees according to their needs, so all students' issues are addressed immediately: if they think the course does not fit their ability, they can give feedback and get an immediate adjustment that helps them achieve their learning target. The current company course is not heavy on processes; all issues are communicated directly between the parties, which is also comfortable. However, after oral comments, no text or document is stored to keep track of and record the problem, so there is no assurance that the problem will not be repeated. Direct oral feedback has noticeable advantages: when action is required immediately, it is best to transmit a message orally, and if the executive's workload is high, oral instructions let them deliver the message without writing, which reduces their workload and saves time. Moreover, depending on the situation, oral instructions can be changed easily, and in these cases maintaining formalities is not necessary, so it is very flexible and effective. Despite the advantages, direct oral communication has drawbacks: oral messages are difficult to record, so it is impossible to preserve them for the future, and important or confidential information may be disclosed. For instance, if the problem lies with the teaching materials, the training staff can have a conversation with the teacher to adjust the lesson to fit the learners' ability and knowledge. However, if the problem comes from the teacher's side, the teacher may not be happy to receive negative feedback about the teaching issues.

Moreover, one of the biggest issues is that students have not been informed about the course framework, the initial goals, or the standard English language proficiency framework set by the English Center for Hawee personnel at different job positions. This leads to misunderstanding of the Center's general goals, and it may be difficult and challenging for the students to set specific goals after the training courses. Clearly written course-level and module-level outcomes are the foundation upon which effective courses are designed. Outcomes inform both the way trainees are evaluated in a course and the way a course is organized. Effective learning outcomes are student-centered, measurable, concise, meaningful, achievable, and outcome-based (rather than task-based). Outcomes are phrased from the perspective of the student and are written in language that students can easily understand. If there is no connection between the Center's goals and the trainees' learning purposes, trainees will not make an attempt to achieve them.

4.2. The change in the Training format - Confused demand

The second group we chose to interview was the English teachers. The interview session lasted for 20 minutes and raised several issues that the teacher faced when teaching in Hawee as an external collaborator. Ms. Mai, a teacher who has taught in Hawee from the first course until now, is not sure whether a multi-dimensional feedback form is necessary, because the Center's model is changing compared to before, and from the training staff's point of view oral handling can solve problems immediately. However, in the long term there is no data from which to draw lessons for later courses.

First and foremost, regarding the interviews with the English teachers, who work directly with the students and interact with the training staff: the teacher interviewed taught the 1-1 class and the Focused class belonging to the Purchase Department and holds an IELTS 7.0, which is sufficient qualification to deliver English teaching in the Center. According to her, the tests are usually administered and evaluated by the Training Center; she only supports the Speaking test, because the speaking test must be one-on-one, so someone is needed to judge it. Such exams are counted as regular teaching sessions. In the past, the company had a clear evaluation process, but now it seems to have been reduced. Currently, she does not have any suggestions, because it seems that the Training Center is changing the English learning model so that it is no longer the same as before. Because of the change in the internal process, they skipped the evaluation test in order to shorten the steps, which does not bring good results at the end of the English courses.

Improving the significance of after-training evaluation is a way to collect raw information and feedback: how the learners access learning, how they progress, where they drop out, and much more important information like this. It helps in understanding what needs to be replicated, what needs to be modified, and what needs to be improved in a particular training program. There are several benefits we recorded that prove the evaluation process after training is necessary:
• Accountability

A successful training evaluation works as a checkpoint to ensure and measure the effectiveness of
training. A well-laid-out training evaluation process helps to bring greater accountability by
ensuring the end objectives of training are met and there are no compromises on deliverables from
either side.

• Enhance Credibility

The historical data from training evaluations can help build the credibility of the training program
with future participants, since the data can explain how effective a particular program was and how
it helped participants enhance their productivity.

 Feedback Transparency

Feedback is one of the most important steps in the overall training evaluation. Done correctly, it
helps to understand the gaps to be filled in the training and can also act as a guide to the
amendments required when implementing the training.

 Boosts Training Effectiveness

The right training evaluation process enhances training effectiveness by bridging the gaps in the
program and ensuring that the overall quality of the training is maintained at all levels. The
evaluation process also helps to bring standardization to the learning strategy.

Moreover, she said that she currently does not have any problems during the training. If a problem
arises, she can talk directly to Trang, the staff member in charge of the English program in the
Training Center, and discuss it without filling out a form. All problems are solved immediately,
but as a result the quality and improvement of the next course cannot be compared with the
previous ones. Although her problems are solved on the spot, the training staff do not record any
complaints or issues, so the root causes are never fully addressed. The lack of assessment sheets
and of recorded teacher feedback on training-related issues can lead to a lack of improvement in
the following training courses.

Additionally, we asked whether she received a review of the course process or a course evaluation
from the Training center at the end of the course. She said that previously there was a procedure
and a form to evaluate the course after teaching. Every time a course closed, the Training center
sent a general evaluation form asking about the teachers' satisfaction with the quality of the
organization as well as with the students. At the end of the form, there was space for suggestions
to improve the quality of the following courses. Now, however, it feels as though this step has been
cut. Such an evaluation form would give the Training center the most general overview of the
courses that have been implemented, and would make it easy to store information and to make
reviews, comparisons, improvements and decisions. Whether a quality assessment form for
teachers is necessary, however, depends on the course orientation, scale and objectives set by the
Training center. Last but not least, she said that at present the after-training evaluation process is
not as clear as before, due to the change in the needs and orientation that the Training center has
built for the English courses.

4.3. Unpremeditated in controlling courses

Last but not least, the interview with the third group, the training staff, gave us adequate
information to conclude on the necessity of the measurement process. The main problem raised by
the training staff is that the current assessment process is very spontaneous; it is difficult to pass
on to others, such as training coordinators, because it is not standardized in any form.

In particular, from the interview with the training staff member who is responsible for organizing
and coordinating English classes and for building the English courses (materials, curriculum
framework, exam content and assessment), we concluded that the center lacks a training evaluation
process. The assessment process is unclear and improvised, driven by the subjective opinion of the
English training staff, and has not received cooperation from the related parties in collecting
information. There is only the monthly and periodic assessment of students' performance.
Additionally, a process for collecting and evaluating the training courses with stakeholders is
completely unavailable and has not been applied formally to any class since the restart of the
English center in July 2020. The main reason is the lack of staff to undertake the job.

When we asked about the necessity of standardizing and clarifying each detail of the evaluation
process, she said that the center definitely needs a formal process because the English center and
the number of classes are growing. The more diverse the organizing format and content become,
the more standardized the general organizing and post-course evaluation process needs to be. It
will help to avoid spontaneous judgement and the tendency to let issues drift. However, it is
necessary to consider the diversity of the classes in terms of content, English requirements and the
ability of students, so the post-course assessment process needs to be flexibly adjusted in terms of
assessment timing, length of assessment and coordination with the other related parties, so that it
can be applied to the focused classes. This is also a challenge that the English Language Center is
facing in building the procedure.

Furthermore, when asked for a solution, she suggested a form that is suitable and provides sufficient
information for business reporting. "The center can design this form to be flexible while still
providing a common model guide on how to organize each class," Ms. Thao said.

When we asked whether the 5-pointed star model we applied in the literature review to build the
evaluation form is reasonable, she found it completely reasonable. However, building and running
the evaluation process under this model takes a lot of time and effort, and it requires flexible and
sensible application by the organizers, namely the training staff and the coordinating collaborators.
In return, it brings many advantages. For instance, it provides a lot of input data, so the English
center can understand the wishes and feedback of the students in order to gradually improve its
course organization, and the organizers can realistically and comprehensively evaluate the effect
of the course, not only on individual learners but also on their contribution to their work and to the
company. Having a clear evaluation process also avoids the situation of the old English center
(2019), where each person had specialized tasks and, as a result, the staff did not coordinate
effectively with each other and worked without any procedure.

5. Conclusion and Action plan

5.1. Conclusion

As mentioned above, our interviews and research show that Hawee's English courses currently
lack an information system for both teachers and trainees. Learners taking the training are not clear
about the course framework or the criteria used to evaluate them before and after the courses, which
causes confusion among teachers and learners in training and goal-setting. Most comments,
feedback and information are normally delivered through oral communication without any record
or document storage, which increases the confusion and the information gap even further and leads
both trainers and trainees to misunderstand the course goals. Oral handling is also applied to lesson
issues, including changes to the method of delivering a lesson or to the course outline. In the short
term this may work well and immediately; in the long term, however, it leaves teachers unclear and
confused about the training format. Also, as there is no multi-dimensional evaluation, teachers and
students do not get adequate information and benefit when choosing the courses. According to the
training staff responsible for organizing and coordinating the English courses, the assessment
process is unclear and improvised, driven by the subjective opinion of the English training staff,
and has not received cooperation from the related parties in collecting information. As a result,
they cannot draw adequate conclusions to enhance and control the training courses properly.

All the issues described above are caused by Hawee's lack of a process and records for evaluating
the effectiveness of the company's English courses. In the long term, this leads to confusion and to
a lack of improvement in course quality and in satisfaction for all parties.

Hence, these issues led us to the idea of putting in place a proper evaluation process for trainees
and trainers after every course, which allows them to give feedback, comments or any relevant
suggestions on the course they have just completed. The evaluation process works on five different
aspects: trainees' satisfaction with the program, knowledge or abilities gained, use of new skills
and behavior on the job (transfer of training), and improvements in individual and organizational
performance. These aspects will be assessed right after the courses and again a few months later to
ensure the accuracy and efficiency of English training in the long term. Based on the evaluation
forms collected in the process, the training and teaching departments will together develop and
enhance the current training to meet learners' needs and demands. This process will efficiently
provide sources and recommendations for all the related departments to organize effective and
well-qualified training. Detailed information on each phase of the evaluation process is provided
in the next part, the Action Plan, and in the Appendix.

5.2. Action plan

Following all our research and analysis, we arrive at the final design of an evaluation program for
the courses, which goes through six main phases: Plan Assessment and Evaluation Methods,
Assessment Organizing, Course Evaluation Handing-out, Data Collection, Analyzing and
Reporting, and Capacity Record and Data Storing. These phases complement the phases of the
program action plan. In this section, each of the six phases is discussed.

Stage 1: Plan Assessment and Evaluation Methods

In the early stage of this procedure, after adequate research and development of the plan, we will
report and present the project to the Board of Directors to obtain approval and funding. This phase
will take the board about two weeks to consider the proposal and weigh the pros and cons of the
project before coming to a final decision. If the project is confirmed by the board, we can carry out
all further actions with company funding; otherwise we need to revise and improve the project to
fit the company situation as well as the Board of Directors' requirements.

Stage 2: Assessment Organizing

After getting approval from the Board of Directors, the training and teaching teams will carry out
a final test for all trainees. Trainees take this test so that the teams can examine their current
knowledge and skills and determine what they have learned and gained from the course.

This test provides information on whether or not participants have learned from the training. In
addition, a well-designed test can help trainers understand which concepts or competencies were
well taught during the training and which ones need additional time, or need to be covered using
alternative methods.

Tests are instruments or tools used to measure change. If the instrument itself is faulty, it cannot
accurately measure changes in knowledge. Thus, the training department, along with the academic
teaching team, needs to develop a proper test for all students, with well-written and clear questions.
The questions will mainly focus on the course objectives, which are to provide new skills and
knowledge to trainees. The final test will examine all four English skills: Listening, Speaking,
Reading and Writing, to ensure participants have developed their skills comprehensively. In
addition, there will be some situational questions that check how they transfer and apply the
knowledge gained during the training to their real work. Case-based questions require participants
to transfer their training knowledge to a situation they will likely encounter in their workplace.

To operate the final test, the training and teaching departments will need to cooperate to build an
optimal test that checks and summarizes all the information delivered throughout the training,
analysing the course outline and generating realistic situations in which the learned skills and
knowledge can be applied.

Stage 3: Course Evaluation Handing-out

As mentioned before, there is no proper evaluation process in any of Hawee's English courses:
after the final test, the training team merely summarizes the results, reports them and stores the
related files. Thus, the training team hardly knows what the trainees really gained from the training,
what they expect from the next courses and what they were not satisfied with in the course. As a
result, the English training classes keep running without any feedback or recommendations for
better quality of teaching and mentoring in the future.

Therefore, with the information collected through our interviews and research, we decided to add
an after-course evaluation phase that requires all the individuals involved in the training to evaluate
it and give feedback. By building two question forms, one for trainers and one for trainees, to be
completed after they finish the test, we can check their satisfaction, expectations and feedback for
further courses. The questions focus on how trainees and trainers rate the course quality, what they
have learned and what they want the programs to focus on more. With the question forms, we can
directly check trainees' satisfaction, including their enjoyment, perceived relevance and satisfaction
with the course organization, using one or two open-ended questions. We can also measure their
knowledge, skills and performance improvement based on the Campbell behavior model and
KSAO (knowledge, skills, abilities and other characteristics). One last criterion we can evaluate
through the question forms is transfer of training: how well they can apply their knowledge in real
work. For more detail, the two separate forms for trainers and trainees are attached in the appendix.

In addition, to obtain an overview and a proper summary of their comments and feedback, we will
compile the data and make sure it is clear and accurate enough to be put into correlation, which
makes it easier and more logical for the training team to give recommendations and suggestions
for the following courses.
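
To make this tallying step concrete, the short sketch below (written in Python purely for illustration; the answer-to-score mapping, item names and construct grouping are our own assumptions rather than an existing Hawee tool) shows how Likert answers exported from the forms could be converted into an average score per evaluation aspect for each respondent.

# Illustrative sketch (not the center's actual tooling): convert Likert answers
# from the post-course form into average scores per evaluation aspect.
from statistics import mean

# Assumed mapping from the form's answer labels to numeric scores (1-5).
SCALE = {"Very much": 5, "Somewhat": 4, "Undecided": 3, "Not really": 2, "Not at all": 1}

# Hypothetical grouping of form items under the aspects used in this report.
CONSTRUCTS = {
    "satisfaction": ["support_for_course", "teaching_aids", "location", "timing"],
    "transfer": ["degree_of_commitment"],
}

def score_response(response):
    """Average the numeric value of each construct's items for one trainee."""
    scores = {}
    for construct, items in CONSTRUCTS.items():
        values = [SCALE[response[item]] for item in items if item in response]
        scores[construct] = round(mean(values), 2) if values else None
    return scores

# Example: one trainee's answers exported from the Google Form.
print(score_response({
    "support_for_course": "Very much",
    "teaching_aids": "Somewhat",
    "location": "Undecided",
    "timing": "Very much",
    "degree_of_commitment": "Somewhat",
}))
# prints the per-aspect averages, e.g. {'satisfaction': 4.25, 'transfer': 4}

Averages like these can then be placed directly into the evaluation matrix described in Stage 4.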

Stage 4: Data Collection

In data collection, the training department will first generate an evaluation matrix, a document that
the training team uses throughout the data-collection process to structure and record all collected
information. The evaluation matrix puts the collected data into a correlation of variables, arranging
a wide range of information in an easy-to-follow and logical way. As this is a crucial phase that
shows the whole team the current strengths and weaknesses, as well as the issues seen from the
trainees' and trainers' points of view, the accuracy and integrity of the information must be taken
seriously, with strict observation and checking during the data-collection process.

Throughout the field phase, each member of the training team will gradually complete her or his
individual copy of the evaluation matrix with the data and information collected from the research,
the interviews and the evaluation form results. In this phase, the team leader needs to observe
strictly and assign appropriate work to each member.
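
As a sketch of what the evaluation matrix could look like in practice, the example below (Python, for illustration only; the question text, indicator, source names and field names are hypothetical, not an existing Hawee document) pairs each evaluation question and its indicator with the evidence collected from each data source and the team member responsible for that row.

# Illustrative sketch of an evaluation matrix as a simple data structure.
from dataclasses import dataclass, field

@dataclass
class MatrixEntry:
    evaluation_question: str                      # what the team wants to answer
    indicator: str                                # what evidence would answer it
    sources: dict = field(default_factory=dict)   # data source -> collected evidence
    collected_by: str = ""                        # member responsible for this row

matrix = [
    MatrixEntry(
        evaluation_question="Did trainees gain usable English skills?",
        indicator="Final test results and self-reported skill growth",
        sources={
            "final_test": "summary of the four-skill test results",
            "trainee_form": "ratings and open-ended comments from the post-course form",
        },
        collected_by="training staff",
    ),
]

# Each member fills her or his own copy; the copies are merged before Stage 5.
for entry in matrix:
    print(entry.evaluation_question, "->", list(entry.sources))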

Stage 5: Analyzing and Reporting

The training department will review the information they have collected, and filter, categorize and
interpret it so that it can be used to develop and improve course quality. These findings will be the
building blocks in formulating evidence-based answers to the evaluation questions. The answers to
the evaluation questions will, in turn, form the basis for conclusions and recommendations.

In analysing and reporting, the training team will first consolidate the evaluation matrix produced
above through cross-review within the team and double-checking by the leader, to make sure the
information entered in the evaluation matrix is accurate and follows a consistent structure. The final
document needs to contain all of the data and information the team collected for each evaluation
question.

Next, they will sit together, review all of the data in the consolidated evaluation matrix and decide
which information is (a) necessary and (b) of sufficient quality to inform each evaluation question
indicator. Data found to be unnecessary or unreliable should be removed from the evaluation
matrix. The findings should flow logically from the information related to the indicators. Finally,
the team needs to construct complete chains of reasoning: from evidence to findings to answers to
the evaluation questions.
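
A minimal sketch of this consolidation step, assuming each matrix entry carries simple "necessary" and "reliable" flags agreed by the team (the flag names and record layout are our own illustrative choices, not a prescribed format):

# Illustrative sketch: keep only evidence the team has flagged as both necessary
# and of sufficient quality for its indicator before drawing findings.

def consolidate(entries):
    """Drop evidence judged unnecessary or unreliable."""
    kept = []
    for entry in entries:
        if entry.get("necessary") and entry.get("reliable"):
            kept.append(entry)
    return kept

raw_matrix = [
    {"indicator": "trainee satisfaction", "evidence": "form ratings", "necessary": True, "reliable": True},
    {"indicator": "trainee satisfaction", "evidence": "unrecorded oral comment", "necessary": True, "reliable": False},
]
print(consolidate(raw_matrix))  # only the reliable form ratings remain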

In addition, with evidence-based statistics and valid answers from the evaluation forms, the training
department will draw conclusions on the performance of the previous courses: their strengths,
weaknesses and issues. These conclusions typically cut across the individual themes or topics of
the evaluation questions. The conclusions are the basis for practical and concrete recommendations,
which are grounded in the results of the evaluation, the research and the interviews.

The final evaluation report is the main deliverable of the reporting and analysis phase and of the
evaluation overall. At the core of the report are: (i) the presentation of the findings, formulated as
answers to the evaluation questions; (ii) the conclusions deriving from the findings; and (iii) the
recommendations. The supporting evidence is presented within the evaluation matrix, which must
be attached to the final report. The report also explains the purpose, objective, scope and
methodology of the evaluation, and provides an overview of the company context and situation.

Stage 6: Capacity Record and Data Storing

Document storage is the process by which all of the static records above are kept until they need
to be accessed in the future. As the outstanding obstacle behind most of the issues described above
is access to the data, document archiving consequently becomes seriously important and requires
the most efficient and convenient method, so that all parties can easily reach the information.

Hence, in this phase, we will use digital storage to archive the files and keep them as references for
further course improvement. With online archiving, the documents are continually tracked, so they
cannot be misplaced and cannot be retrieved or changed without this being recorded. In addition,
the training team can easily find and access the documents they need, make comparisons and see
the statistics presented visually. All documents will be preserved in a secure and convenient way
so that the team can oversee, compare and retrieve the information immediately.
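
A minimal sketch of such tracked storage, using only the Python standard library (the folder name, file naming and log format are illustrative assumptions; in practice the center could equally rely on a shared drive or document-management system that records access automatically):

# Illustrative sketch of digital archiving with a simple access log.
import json
from datetime import datetime
from pathlib import Path

ARCHIVE = Path("english_course_archive")
ARCHIVE.mkdir(exist_ok=True)

def archive_record(course_code, record, user):
    """Save an evaluation record and log who stored it and when."""
    path = ARCHIVE / f"{course_code}_evaluation.json"
    path.write_text(json.dumps(record, indent=2, ensure_ascii=False), encoding="utf-8")
    log_line = f"{datetime.now().isoformat()}\t{user}\tstored\t{path.name}\n"
    with open(ARCHIVE / "access_log.tsv", "a", encoding="utf-8") as log:
        log.write(log_line)
    return path

archive_record("2021-ME-Purchase", {"summary": "consolidated evaluation matrix"}, user="training staff")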

6. Deliverable as appendix

APPENDIX A: COURSE EVALUATION PROCESS

APPENDIX B: POST-COURSE EVALUATION FORM (built on the 5-pointed star evaluation
model)

1. Post-course evaluation form for students:

Link to Google form: https://forms.gle/KBadwbqBS8JukCF37

Questionnaire:


POST ENGLISH COURSE EVALUATION FORM
The Training Center would like to thank you for taking the time to participate in the English
training program for the focused group of ABCXYZ Department.

In order to contribute to improving the quality of the course in the future, the Training Center
would love to hear your comments on the course in the form below.
* Required

PART 1: TRAINEE GENERAL INFORMATION

1. Class Code *

2. Full name *

3. ID *

PART 2: TRAINEE SATISFACTION

Enjoyment

4. How do you feel about the English training course? Why? *


Perceived Relevance

5. Which modules did you find to be the most relevant to your job? *

6. Which modules did you find to be the least relevant to your job? *

7. What additional information do you suggest be added to the program? *

Satisfaction with organization

8. Rate the impact of the following factors: *

Mark only one oval per row. (Scale: Very much / Somewhat / Undecided / Not really / Not at all)

- Support for the course (sending notices, reminding class schedules, logistics)
- Fully equipped teaching aids/tools, operating smoothly
- Convenient training location, classroom layout suitable for the number of learners
- Timing of the course is reasonable, does not affect the workflow

PART 3: TRANSFER OF TRAINING

9. How committed are you to applying what you learned back on the job? *

Mark only one oval per row.

Very much Somewhat Undecided Not really Not at all

Degree of commitment

10. What barriers to applying what you learned do you anticipate? What could be
done to remove them?

11. What specific skills do you plan to apply when you get back to your job? *


12. What additional support will you need to implement what you learned? *

PART 4: PERFORMANCE CONSTRUCT

13. The degree to which *

Mark only one oval per row. (Scale: Very much / Somewhat / Neutral / Not much / Not at all)

- You understand the course knowledge
- You are proficient in Reading, Listening, Speaking, Writing
- You believe this course's content is important to succeeding on the job
- You believe it will be worthwhile to apply what you learned on the job

2. Post-course evaluation form for students after 6 months:

Link to Google form: https://forms.gle/WqaJyKWKrrBkDvGw9

Questionnaire:


ENGLISH COURSE EVALUATION FORM FOR EXTENDED USE
In order to measure effectively the impact the course has made on trainees in all aspects, so that
we have data to build more valuable courses in the future, the Training Center would love to hear
your feedback in the form below describing realistically what you have gained 6 months after the
end of the course.
* Required

PART 1: TRAINEE GENERAL INFORMATION

1. Class Code *

2. Full name *

3. ID *

PART 2: EVALUATION


4. To what degree have you applied the knowledge from the course to your work? *

Mark only one oval.

Little or no application

Mild degree of application

Moderate degree of application

Strong degree of application

Very strong degree of application, and desire to help others do the same

5. Rate the contribution of each of the following factors to your effective performance *

Mark only one oval per row. (Scale: Very much / Somewhat / Undecided / Not really / Not at all)

- Support and/or encouragement
- Effective system of accountability or monitoring
- Belief that it would help me to be more effective in my work
- Ongoing training I have received after the initial class
- Payment of bonus for applying the knowledge

6. Have you struggled with application? If so, to what do you attribute your difficulty?
*

7. What steps do you plan to take in the future to continue your progress? *

8. I have seen an impact in the following areas as a result of applying what I learned
(check all that apply): *

Check all that apply.

Increased productivity

Improved quality

Increased personal confidence

Increased customer satisfaction

Stronger relationships with my colleagues

More respect from my peers

Better organization in my work


Other:


9. Please give an example of the success you have achieved since attending this
training. *

10. To what degree have the results you expected actually occurred? *

Mark only one oval: 1 (Satisfied) - 2 - 3 - 4 - 5 (Unsatisfied)

11. What additional outcomes are you hoping to achieve from your efforts?

3. Post-course evaluation form for teachers:

Link to Google form: https://forms.gle/d5QQUs55r1QwTRYh7

Questionnaire:


Post-course Teacher's Feedback


The Training Center would like to thank you for cooperating with us in our English program for
the focused group. As the course ABCXYZ comes to an end, the Training Center would love to
hear your comments on the course in order to contribute to improving its quality in the future.
* Required

1. Teaching class code *

2. Teacher's name: *

3. ID *

Course evaluation


4. How do you feel about the training course? *

Mark only one oval per row. (Scale: Agree / Somewhat / Undecided / Not really / Disagree)

- Convenient training location, classroom layout suitable for the class
- Adequate training equipment (board, pen, speakers, laptop, projector, etc.)
- Timing of the course is reasonable
- Staff are willing to support the course (sending notices, reminding class schedules, logistics)
- Appropriate policies (related to compensation and commission)

5. Was there anything about your experience that interfered with your teaching? If so, please
specify.


6. How do you feel about the training course materials? *

Mark only one oval per row. (Scale: Agree / Somewhat / Undecided / Not really / Disagree)

- Training materials are suitable for most of the students
- It is easy to get the resources I need to teach effectively
- Teaching documents are updated frequently
- I am given regular and useful feedback from the students to improve my teaching skills
- The test content is able to test the students' capability

7. What specific outcomes are you hoping to achieve as a result of your efforts? *


8. Do you have any additional feedback you would like to share regarding the policies,
evaluation system, or administration? *

9. What additional information do you suggest be added to the program? *

APPENDIX C: INTERVIEW TRANSCRIPTS

1. Interview with Current student of English center

Basic information about the class:

Class: 2021 ME Purchase

Students: Staff of Purchase Department (3 students)

Training content: Business English

Opening schedule: December 15, 2020

Frequency: 2 times/week

Teacher: Nguyen Phuong Mai (IELTS 7.0)

Training Duration: Total duration of 30 sessions (~ 45 training hours)

So far, have you taken any English courses at the English Center?

I haven't.

Your experience of the English course?

Overall, I feel satisfied because I did not initially expect or demand too much from learning English
at the company. My main purpose in studying is to maintain my English ability. I have not set the
goal of increasing my English level; if I had, I think I would study outside.

Compare external English courses with courses in Hawee?

When learning English outside, students have to pay fees according to their needs, so all of the
students' issues are addressed immediately. The current company course is not heavy on processes;
all issues are communicated directly between the parties, so it is also comfortable.

Do you have a grasp of the output objectives, course framework that the English Center
builds for your class?

As a student, I have not yet been informed about the course framework, the initial goals, or the
standard English language proficiency framework set by the English Center for Hawee personnel
at different job positions.

If you have a problem, do you know about the contact with the English Center?

The main contacts between my English class and the English center are Ms. Binh (Head of the
Purchase Department) and Ms. Ha Trang (Training staff), so when there is any question or problem,
they talk directly with each other.

Your suggestions in organizing examinations for students' outcome assessment?

You should develop tests based on the student's job position, not one test that applies to all.

Do you have any suggestions for a training program?

You should arrange classrooms at the company office for the convenience of the students, instead
of requiring us to travel to the Training Center, which is far from our workplace and takes time to
reach.

2. Interview with English Teacher

Teaching class: 1-1, Focused class - Purchase Department (3 students)

Qualification: IELTS 7.0

1. Do you participate in the student evaluation process after the teaching? Do you have any
suggestions?

Usually, the test is set and evaluated by the Training center. I only support the Speaking test,
because the speaking test must be one-on-one, so someone is needed to judge it. Such exam sessions
are counted as regular teaching sessions.

In the past, our company had a clear evaluation process, but now it seems to have been reduced.
Currently, I don't have any suggestions, because it seems that the Training center is changing the
English learning model, so it is not the same as before.

2. Are there any problems during the training? If yes, with whom? And after the discussion,
has the problem been improved?

No, I'm fine. If there is any problem, you can talk directly to Trang, the staff member in charge of
the English program in the Training Center.

I discuss things directly without filling out a form, and all problems are solved immediately, so the
quality and improvement of the next course cannot be compared with the previous ones.

3. So at the end of the course, did you receive a review of the course process, or course
evaluation from the Training center?

Previously, yes, there was a procedure and a form to evaluate the course after teaching. Every time
a course closed, the Training center sent a general evaluation form which asked about the teachers'
satisfaction with the quality of the course organization as well as with the students. At the end of
the form, there was space for suggestions for improving the quality of the following courses.

But now it feels like it's already cut off.

4. So, in your opinion, should the Training center do it again or should it cut off the
evaluation process as it is doing now?

Well, yes. This would help the Training center to have the most general overview of the courses
that have been implemented. Such a review form would make it easy to store information as well
as to make reviews, comparisons, improvements and decisions.

However, this is only my personal opinion; whether a quality assessment form for teachers is
necessary depends on the course orientation, scale and objectives set by the Training center.

5. How do you feel about the post-training assessment process of the center?

As I have said, at present the post-course evaluation process is not as clear as before, due to the
change in the needs and orientation that the Training center has set for the English courses. That is
my personal opinion.

3. Interview with Training Staff

Basic information about the staff:

Position: Training staff at English center

Experience: 3 years

Main job duties:

 Organize and coordinate English classes

 Building English courses: Materials, curriculum framework, Exam content / assessment, etc.

How do you comment on the Post-course activities of the English Center now?

The assessment process is not clear and very improvised; it is driven by the subjective opinion of
the English training staff and has not received cooperation from the related parties in collecting
information.

There is only the monthly and periodic assessment of students' performance. Additionally, a
process for collecting and evaluating the training courses with stakeholders is completely
unavailable and has not been applied formally to any class since the restart of the English center in
July 2020.

The main reason is the lack of staff to undertake the job.

If you can adjust, what will you adjust in the post-training assessment process?

I want all related parties to understand each stage of the evaluation process for effective
coordination.

Do you think it is necessary to standardize and clarify each detail of the evaluation process?

Yes, sure. The English center and the number of classes are growing. The more diverse the
organizing format and content become, the more standardized the general organizing and post-
course evaluation process needs to be. It will help to avoid spontaneous judgement and letting the
grass grow under our feet.

However, it is necessary to consider the diversity of the classes in terms of content, English
requirements and the ability of the students, etc., so the post-course assessment process needs to
be flexibly adjusted in terms of assessment time, length of assessment and coordination with the
other related parties, so that the process can be applied flexibly to those focused classes. This is
also a challenge that the English Language Center is facing in building the procedure.

Do you think the 5-pointed star model to build the evaluation form is reasonable?

It is completely reasonable; however, building and running the evaluation process under this model
takes a lot of time and effort and requires flexible, sensible application by the organizers, that is,
the training staff and the coordinating collaborators, but it brings many advantages. For instance, it
brings a lot of input data, so the English center can understand the wishes and feedback of the
students in order to gradually improve its course organization activities, and the organizers can
also evaluate realistically and comprehensively the effect the course brings, not only to individual
learners but also to their contribution to the work and our company.

The situation of the old English center (2019): each person had specialized tasks and, as a result,
the staff did not coordinate effectively with each other and worked without any procedure.

APPENDIX D: CAPSTONE DELIVERABLE EVALUATION

REFERENCES

Alliger, G. M., and Janak, E. A. 1989. Kirkpatrick's levels of training criteria: thirty years later.
Personnel Psychology, 42: 331–42.
Alliger, G. M., Tannenbaum, S. I., Bennett, W., Traver, H., and Shotland,A.1997. A meta-analysis
of the relations among training criteria. Personnel Psychology,50:341–58
Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, D. A., Mayer, R. E., Pintrich, P.
R., Raths, J., and Wittrock, M. C. 2001. A Taxonomy for Learning, Teaching, and Assessing:
A Revision of Bloom's Taxonomy of Educational Objectives. New York: Longman
Baldwin, T. T., and Ford,J.K.1988. Transfer of training: a review and directions for future research.
Personnel Psychology,41:63–105.
Bloom, B. 1956. Taxonomy of Educational Objectives: The Cognitive Domain. New York:
David McKay
Boudreau,J.W 1984. Decision theory contributions to HRM research and practice. Industrial
Relations,23:198–217
Boudreau,J.W 1988. Utility analysis: a new perspective on human resource management decision
making. Pp. 125–86 in Human Resource Management: Evolving Roles and
Responsibilities,ed. L. Dyer. Washington, DC: Bureau of National Affairs
Boudreau,J.W 1991. Utility analysis for decisions in human resource management. Pp. 621–745
in Handbook of Industrial and Organizational Psychology, vol. iii, ed. M. D. Dunnette
andL. M. Hough. Palo Alto, CA: Consulting Psychologists Press
Boudreau,J.W.1983. Economic considerations in estimating the utility of human resource
productivity improvement programs. Personnel Psychology,36:551–76.
Bracken,D.W.,Timmreck, C. W., and Church, A. H., eds. 2001.The Handbook of Multisource
Feedback. San Francisco: Jossey-Bass
Brown,K.2005. An examination of the structure and nomological network of trainee reactions: a
closer look at the ‘‘smile sheets.’’ Journal of Applied Psychology,90:991–1001
Campbell,J.P. 1999. The definition and measurement of performance in the new age. Pp. 399–429
in The Changing Nature of Performance, ed. D. R. Ilgen and E. D. Pulakos. San
Francisco:Jossey-Bass
Campbell,J.P.1990. An overview of the army selection and classification project (ProjectA).
Personnel Psychology,43:231–9
Cascio,W.F 1989. Utility analysis as an evaluation tool. Pp. 63–88 in Training and Development
in Organizations, ed. I. L. Goldstein. San Francisco: Jossey-Bass.
Cascio,W.F.1982. Costing Human Resources: The Financial Impact of Behavior in Organizations.
Boston: Kent Publishing Co.

Cutcher-Gershenfeld, J., and Ford, J. K. 2005. Valuable Disconnects in Organizational Learning
Systems: Integrating Bold Vision and Harsh Realities. New York: Oxford University
Press.
Davis, M. A., Curtis, M. B., and Tschetter, J.D.2003. Evaluating cognitive training outcomes:
validity and utility of structural knowledge assessment. Journal of Business and
Psychology,18:191–206
Ford, J. K., Kraiger, K., and Merritt, S. (in press). An updated review of the multi-dimensionality
of training outcomes: new directions for training evaluation research. In Learning,
Training, and Development in Organizations, ed. S. W. J. Kozlowski and E. Salas.
Mahwah, NJ: LEA
Friedman, T. L. 2005. The World Is Flat. New York: Farrar, Straus and Giroux
Goldstein, I. L., and Ford,J.K.2002.Training in Organizations,4th edn. Belmont, CA:Wadsworth
Ilgen, D. R., and Pulakos,E.D.1999.The Changing Nature of Performance. San Francisco:Jossey-
Bass.
Jellma, F., Visscher, A., and Scheerens,J.2006. Measuring change in work behavior by means of
multisource feedback. International Journal of Training and Development,10:121–39
Johnson, A. (2009). The rise of English: The language of globalization in China and the European
Union. Macalester International, 22(1), 12. Retrieved from
http://digitalcommons.macalester.edu/cgi/viewcontent.cgi?article=1447&context=macint
Kirkpatrick,D.L.1959. Techniques for evaluating training programs. Journal of the American
Society of Training Directors,13:3–32.
Kozlowski, S. W. J., and Bell,B.S.2006. Disentangling achievement orientation and goalsetting:
effects on self-regulatory processes. Journal of Applied Psychology,91:900–16.
Kraiger K. 2002. Decision-based evaluation. Pp. 331–75 in Creating, Implementing and
Maintaining Effective Training and Development: State-of-the-art Lessons for Practice,
ed.K. Kraiger. San Francisco: Jossey-Bass.
Kraiger K., Ford, J. K., and Salas,E.1993. Application of cognitive, skill-based, and affective
theories of learning outcomes to new methods of training evaluation. Journal of Applied
Psychology,78:311–28.
Mathieu,J.E.and Leonard R. L., Jr. 1987. Applying utility concepts to a training programin
supervisory skills: a time based approach. Academy of Management Journal,30:316–35
McGehee, W., and Thayer,P.1961.Training in Business and Industry. New York: McGrawHill
Misko, J. (1995). Transfer: Using Learning in New Contexts. Leabrook, Australia: NCVER

Misko, J. (1999). The transfer of knowledge and skill to different contexts: An empirical
perspective: NCVER, Leabrook SA, Australia
Morgan, R. B., and Casper,W.2000. Examining the factor structure of participant reactions to
training: a multidimensional approach. Human Resource Development Quarterly,11:301–
17.
Morrow, C. C., Jarrett, M. Q., and Rupinski,M.T.1997. An investigation of the effect and economic
utility of corporate-wide training. Personnel Psychology,50:91–119.
Motowidlo, S. J., and Schmit,M.J.1999. Performance assessment in unique jobs. Pp.56–86 in The
Changing Nature of Performance, ed. D. R. Ilgen and E. D. Pulakos. SanFrancisco: Jossey-
Bass
Noe, R.A., Hollenbeck J.R., Gerhart B. and Wright P. M. 2016. Fundamentals of Human Resource
Management. New York: McGrawHill
Perkins, J.N. and Salomon, G. (1996). Learning transfer. In A. Tuijman, (Ed.) Industrial
psychology (2nd edition). London: Routledge.
Phillips, J. J., and Phillips, P. P., eds. 2005.ROI at Work: Best Practice Case Studies from the Real
World. Alexandria, VA: American Society for Training and Development
Pulakos, E. D., Arad, S., Donovan, M. A., and Plamondon,K.E.2000. Adaptability in the
workplace: development of a taxonomy of adaptive performance. Journal of Applied
Psychology,85:612–24
Raju, N. S., Burke, M. J., and Normand,J.1990. A new approach for utility analysis. Journal of
Applied Psychology,75:3–12
Schmidt, F. L., Hunter, J. E., and Pearlman,K.1982. Assessing the economic impact of personnel
programs on workforce productivity. Personnel Psychology,35:333–47.
Schneider, B., and Schmitt,N.1986.Staffing Organizations,2nd edn., Glenview, IL:
Scott,Foresman
Subedi, B. S. 2004. Emerging Trends of Research on Transfer of Learning. International
Education Journal, 5(4): 591-99
Sugrue, B., and Kim,K.2004.State of the Industry: ASTD’s Annual Review of Trends in Workplace
Learning and Performance. Alexandria, VA: American Society of Training and
Development
Thayer,P.W.1997. A rapidly changing world: some implications for training systems inthe Year
2001 and beyond. Pp. 15–30 in Training for a Rapidly Changing Workplace, ed.M. A.
Quinones and A. Ehrenstein. Washington, DC: APA.
Tuijnman, A.C. (ed.) (1996). International Encyclopedia of Adult Education and Training (2nd
edition). London: Pergamon and Elsevier Science.

Wall, T. D., and Jackson, P. R. 1995. New manufacturing initiatives and shop floor job design. Pp.
139–74 in The Changing Nature of Work, ed. A. Howard. San Francisco: Jossey-Bass
Warr, P., and Bunce,D.1995. Trainee characteristics and the outcomes of open learning. Personnel
Psychology,48:347–75.
Wexley, K., and Latham,G.P.2002.Developing and Training Human Resources in
Organizations,3rd edn. Upper Saddle River, NJ: Prentice Hall.

