Abstract
Training problem solving teams presents a unique set of challenges to instructional designers. Given the criticality of teams to the performance of many organizations, it is important to develop training systems to support coordination and problem solving. While recent technological advancements, such as computer-based performance assessment, hold considerable promise for training, the introduction of technology alone does not guarantee that the training will be effective. This article focuses on three important questions that must be addressed when developing coordination and problem solving training: (1) How can technology best be used to provide an environment in which learning can take place? (2) What knowledge, skills, and attitudes need to be learned and trained to facilitate expert problem solving teams? and (3) How can the development of problem solving expertise be fostered through a systematic instructional strategy? A naturalistic decision making paradigm will be used to provide a theoretical framework for describing the problem solving task domain. An event-based approach to training will then be offered as a practical application for training teams to perform in naturalistic environments. Finally, conclusions are drawn regarding the provided framework for team learning. Published by Elsevier Science Ltd.
Keywords: Teams; Team training; Problem solving; Instructional strategies; Scenario design
1. Introduction
The necessity for training problem solving skills has been recognized for some
time (Bettenhausen & Murnighan, 1985; Cannon-Bowers, Salas, & Converse, 1993).
These skills are crucial for both individuals and teams to be able to cope with the
* Corresponding author.
The views herein are those of the authors and may not reflect the opinion, policy or views of the Department of Defense.
dynamic environments of today's world (Baxter & Glaser, 1997; O'Neil, Allred, &
Baker, 1997; Orasanu & Connolly, 1995). For example, the trend in modern warfare
is toward an increasing reliance on distributed systems. Concepts such as `network-
centric' warfare suggest that future missions will be accomplished by cross-domain
teams who are physically dispersed. Given the criticality of these teams' missions, it is
important to develop training systems and programs to support coordination and
problem solving in these environments. To respond to this training need, it is
necessary to answer three questions: (1) To what extent can technology provide an
environment in which learning can take place? (2) What knowledge, skills, and atti-
tudes need to be learned and trained to facilitate expert problem solving teams? and
(3) How can the development of problem solving expertise be fostered through a
systematic instructional strategy?
Prior to a discussion of how and what to train to develop problem solving expertise in teams, it is helpful to first define what is meant by the construct. While decision making may simply be defined as the selection of an alternative from a larger set (Kerr, 1981; Stasser & Davis, 1981), a more appropriate definition would include the processes of problem identification and evaluation as well as choice (Huber, 1980). Problem solving moves beyond even this expanded definition of decision making to include the execution and monitoring of the selected alternative (Cannon-Bowers & Bell, 1997; Hirokawa & Johnston, 1989; Means, Salas, Crandell, & Jacobs, 1995).
The above discussion provides a brief overview of the importance of training problem solving teams. The tenets of naturalistic decision making (NDM) (Orasanu & Connolly, 1995) will be used to develop a theoretical framework for examining what needs to be trained to establish effective problem solving teams. An
event-based approach to training (EBAT) (Oser, Cannon-Bowers, Dwyer, &
Salas, 1997), which can be described in terms of the naturalistic decision making
paradigm, will serve as a practical application for training teams. The following
three sections will address these issues in greater detail. First, the use of simulation
for training will be examined. Next, the naturalistic decision making paradigm will
be used to argue for the necessity of simulations to train team problem solving.
Third, the utility of an EBAT for training problem solving skills will be explored
(Oser, Cannon-Bowers, Dwyer, & Miller, 1997). Finally, conclusions based on the
literature will be drawn.
learning will result (Salas & Cannon-Bowers, 1997). In fact, few efforts have focused on how to best use these systems to establish effective environments to support learning (Dwyer, Fowlkes, Oser, Salas, & Lane, 1997). In the absence of a sound learning methodology, it is possible that a training system may not fully achieve its intended objectives.
Effective training environments are systems that facilitate the ability of the participants to develop and maintain the competencies (i.e. knowledge, skills, attitudes)
necessary to perform required tasks. Learning environments must enable the
employment of systematic, deliberate approaches to achieve critical task require-
ments of the training participants. These approaches need to support: (1) all phases
of training development (e.g. planning, preparation, execution, analysis); (2) per-
formance measurement; and (3) feedback.
One training technology that has the potential for establishing effective learning environments for the enhancement of team problem solving expertise is simulation
(Cannon-Bowers & Bell, 1997; Means et al., 1995). Simulations are ideally suited
for training problem solving through exposure of teams to the types of environ-
ments and problems that may be confronted in a naturalistic setting. In addition,
the use of computers to develop and present simulations allows for a high level of
stimulus control and structured repetition (Oser, Cannon-Bowers, Dwyer, & Miller,
1997). Simulations can allow the instructional developer to manipulate and
pre-specify the presentation of, and relationship between, critical environmental
features. Within a simulated environment, it also is possible for the team to conduct
problem solving in a realistic manner. For example, a problem solving team
could implement alternatives as well as monitor their execution (Gualtieri, Parker,
& Zaccaro, 1995). However, for this training environment to be effective, it requires not only a high fidelity representation of the problem solving domain but also a systematic instructional methodology (Oser, Cannon-Bowers, Dwyer, & Salas, 1997).
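The stimulus control described above, in which the instructional developer pre-specifies the presentation of, and relationships between, critical environmental features, can be pictured as a scripted event list. The sketch below is illustrative only; the event names, timings, and the `ScenarioEvent` structure are hypothetical, not part of the cited work.

```python
from dataclasses import dataclass, field

@dataclass
class ScenarioEvent:
    """A pre-specified stimulus presented to the team at a known time."""
    time_s: float  # simulation time at which the cue is presented
    cue: str       # the critical environmental feature shown
    depends_on: list = field(default_factory=list)  # cues this event builds on

# Hypothetical script: three linked events whose order and timing the
# instructional developer controls exactly on every repetition.
script = [
    ScenarioEvent(60.0, "unidentified radar contact"),
    ScenarioEvent(180.0, "contact changes course", ["unidentified radar contact"]),
    ScenarioEvent(300.0, "radio challenge unanswered", ["contact changes course"]),
]

def events_due(script, now_s):
    """Return the scripted events scheduled at or before the current time."""
    return [e for e in script if e.time_s <= now_s]
```

Because the script is fixed data rather than improvised play, every team can be exposed to the same structured repetition of critical features.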
While the above discussion focused on the use of simulations as a tool for establishing learning environments, the current section will focus on what competencies (i.e. knowledge, skills, abilities) to train. The characteristics of effective problem solvers delineated by Cannon-Bowers and Bell (1997) (i.e. flexibility, quickness, resilience, adaptability, risk taking, accuracy) provide an end state for training. In essence, these characteristics describe the capabilities that problem solving teams are expected to possess to be effective in real world situations (Cannon-Bowers & Bell, 1997). While these characteristics are useful at a global level, they need to be more explicitly defined to be directly trainable. Cannon-Bowers and Bell (1997) identified six cognitive process and skill areas considered critical for problem solving; four of these (i.e. information processing, situation assessment, reasoning, monitoring) will be examined in detail in the following sections, as they are most applicable to a simulation setting.
R.L. Oser et al. / Computers in Human Behavior 15 (1999) 441–462
One of the key features that differentiates experts from novices is their ability to accurately encode and categorize information (Glaser, 1986). This better categorization of information leads to a better organization of that knowledge for storage (Glaser, 1989). Similarly, Druckman and Bjork (1991) noted that experts not only possess more knowledge than novices but also organize it more efficiently. Typically,
novices' knowledge organization is incomplete or inadequate for the task domain
(Glaser, 1986). An expert's knowledge organization is characterized by complex
knowledge structures, process and goal states, and solution templates (Glaser &
Chi, 1988).
The presence of these highly organized knowledge structures enables experts to
easily encode new information into their internal framework (Glaser, 1986), rapidly
retrieve knowledge when required (Druckman & Bjork, 1991), and implement a specific solution to the problem (Klein, 1989). The development of an organized
knowledge base that enables decisions to be made quickly by the team is important
for teams that must perform in naturalistic decision making environments (Cannon-
Bowers & Bell, 1997).
Of particular relevance to performing in naturalistic decision making environ-
ments is the use of knowledge templates (Noble, Grosz, & Boehm-Davis, 1987).
These templates are initiated in response to the recognition of situational cues by
individuals. According to Noble et al. (1987), people recognize cues in a new situation and match them back to previously encountered situations; the assumption is that actions that were successful in previous situations will also be successful in the current one. This process will be discussed further in the next section (i.e. situation assessment). Noble (1995) contends that these templates contain an objective feature (i.e. a specific goal state to be achieved), action features (i.e. a particular process to achieve that goal), and environmental features (i.e. criteria that are used to bound the problem).
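Noble's three template components lend themselves to a simple illustration. The sketch below is a hypothetical rendering of a knowledge template and a naive cue-overlap match; the field names, example cues, and scoring rule are assumptions made for illustration, not Noble's model.

```python
from dataclasses import dataclass

@dataclass
class KnowledgeTemplate:
    objective: str    # goal state to be achieved
    actions: list     # process used to achieve that goal
    environment: set  # cues that bound the problems the template applies to

def match(template, observed_cues):
    """Naive score: fraction of the template's bounding cues observed."""
    return len(template.environment & observed_cues) / len(template.environment)

# Hypothetical templates a damage-control team might hold.
templates = [
    KnowledgeTemplate("extinguish fire", ["isolate space", "apply agent"],
                      {"smoke", "heat"}),
    KnowledgeTemplate("stop flooding", ["close valve", "patch hull"],
                      {"water", "pressure drop"}),
]

# A new situation is matched back to previously encountered ones.
observed = {"smoke", "heat", "alarm"}
best = max(templates, key=lambda t: match(t, observed))
```

Selecting the best-matching template retrieves both the goal state and the action sequence in one step, which is consistent with the rapid recognition-based responding attributed to experts in the text.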
The development of problem templates is particularly important in teams. If team members possess different templates, then there are likely to be coordination problems during task execution (Cannon-Bowers, Salas, & Converse, 1990). How-
ever, if an EBAT framework is adopted, then the individuals are able to consider
not only their own tasks but also the tasks of the other team members in a sys-
tematic way (Oser, Cannon-Bowers, Dwyer, & Salas, 1997). The ability of indivi-
dual team members to call upon the same template is to some extent dependent on
the plan, which was developed prior to task execution. Therefore, planning is a
crucial skill for team problem solving that will be discussed in the reasoning skills
subsection.
The aim of situation assessment is to understand the causes of the cues observed
(Rasmussen, 1983). The ability to recognize meaningful patterns within the problem
space is another one of the key abilities that experts possess (Glaser, 1986). These
patterns help to reduce the expert's workload, process information, encode knowl-
edge, and facilitate rapid responses (Means et al., 1995).
In a review of naturalistic decision making literature, Lipshitz (1995) found that
all of the theories that he examined included some element of situation assessment.
In addition, several of these theories demonstrated that experts perform situation assessment more quickly and accurately than novices (Beach & Mitchell, 1987; Klein, 1989; Noble, 1989; Rasmussen, 1983). Lipshitz (1995) proposed that the improved situation assessment capabilities of experts account for much of their ability to solve problems more quickly and accurately than novices.
Based on this research, Cannon-Bowers and Bell (1997) identified two skills necessary for situation assessment: cue recognition and pattern recognition. Cues are those stimuli within the environment that provide information to the problem solver regarding the current or future state of the system (Noble et al., 1987). Researchers have suggested that experts recognize problem solving cues differently than novices (Cannon-Bowers & Bell, 1997). For example, novices attend to the surface features of a problem, while experts attend to the underlying deep structure (Chi, Feltovich, & Glaser, 1981).
Other researchers have discovered that experts can detect important features in
the environment more readily than novices (e.g. Druckman & Bjork, 1991; Means et
al., 1995). Experts seek additional information to validate their problem repre-
sentation, while novices rely only on the information that is readily available (Cel-
lier, Eyrolle, & Marine, 1997). Therefore, experts tend to take more time to build
their problem representation than do novices (Dorner & Schaub, 1994). Finally,
Orasanu and Connolly (1995) stated that experts seem to be better able to frame problems, which enables them to better detect the problem's underlying cues and structure. These findings suggest that cue recognition significantly contributes to the skill of situation awareness.
While the identi®cation of individual cues is important, research involving per-
formance in naturalistic decision making environments has suggested that it is not
just single isolated cues that are important, but rather patterns of cues (Klein, 1995).
Earlier Wickens (1984) noted that the underlying patterns that are learned through
training or experience lead to an internal representation of the environment. This
network allows experts to develop causal structures for predicting the future state of
the system (Wickens, 1984).
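The idea that patterns of cues, rather than isolated cues, drive expert assessment can be sketched minimally. The pattern names, cue sets, and subset-matching rule below are hypothetical illustrations, not a model drawn from the cited literature.

```python
# Individual cues aggregated ("chunked") into named patterns that can be
# recognized as a unit. Pattern names and cues are hypothetical.
PATTERNS = {
    "engine overheating": frozenset({"temp high", "coolant low", "fan noise"}),
    "sensor fault": frozenset({"temp high", "coolant normal"}),
}

def recognize(observed_cues):
    """Return patterns whose entire cue set is present in the observation."""
    return [name for name, cues in PATTERNS.items() if cues <= observed_cues]
```

Note that the single cue "temp high" is ambiguous on its own; only the pattern of co-occurring cues discriminates an overheating engine from a faulty sensor, which is the point the surrounding paragraphs make about aggregating cues.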
More recently, Cellier et al. (1997) have proposed that because of interactions among the cues, understanding requires the construction of relationship patterns that are sufficiently detailed to construct causal relationships. While patterns
may be nothing more than the `chunking' of individual cues (Chase & Simon,
1973), this ability to aggregate cues greatly increases the speed and accuracy of
expert decisions (Means et al., 1995). Druckman and Bjork (1991) also reported
that experts were able to recognize patterns of cues more rapidly than novices.
Klein (1995) has stated that the ability to separate relevant cue patterns from the
background noise is what allows experienced problem solvers to assess complex
situations quickly. The ability to perform the initial steps of the problem solving
3.4. Monitoring
Monitoring deals with the control of a system and the detection of system faults
(Cellier et al., 1997). To effectively monitor a system, the team must be familiar with the normal state of the system as well as potential faulty system states. As was noted earlier, Cannon-Bowers and Bell (1997) emphasized the importance of flexibility and
adaptability for successful problem solving. Lipshitz (1995) also addressed this issue
in his comparisons of emerging naturalistic decision making theories and classical
decision theories. According to Lipshitz (1995), no single set of processes or proce-
dures could accurately represent the decision making process in naturalistic envir-
onments. Similarly, Mintzberg, Raisinghani and Theoret (1976) found that the
problem solving processes were highly ``unstructured'' and not amenable to a single
procedural framework.
Because of the fluid nature of naturalistic domains, the problem solving team must dynamically assess and regulate their solution strategy (Cannon-Bowers & Bell, 1997). That is, teams must continuously collect information from the system to ensure that the goals specified in the initial decision are still being met. One strategy for guaranteeing that these goals are reached is for the individual team members to attempt to adopt a metacognitive perspective. Metacognition is a skill that contributes to an expert's ability to remain flexible during the execution of a decision
(Cohen, Freeman, Wolf, & Militello, 1995).
Several theorists have suggested that experts are better able to monitor their own process during problem solving than novices (Chi, Bassok, Lewis, Reimann, & Glaser, 1989; Druckman & Bjork, 1991). Glaser (1986) has suggested that novices need to be trained in self-regulatory and self-management skills during the training of problem solving skills. He goes on to assert that these skills are primarily acquired through experience. Experts are able to monitor their own learning so that
they can manage their experiences (Glaser, 1986). This self-regulation capability
fosters improved knowledge acquisition and encoding. Similarly, Orasanu (1990) noted that effective metacognitive skills allow problem solvers to better manage
resources because they have a more accurate picture of their own strengths and
weaknesses.
Simulation can be an effective means to enhance monitoring performance by presenting trainees with a wide variety of situations that may be encountered in real-world settings. Because differing system states can be demonstrated using
simulation, trainees can be presented with conditions that are characteristic of
normal and abnormal conditions. This can provide trainees with opportunities to
practice dynamic assessment of critical environmental cues. Simulation can also
facilitate the development of meta-cognitive (i.e. self-regulatory) behaviors asso-
ciated with expert performance. The recording and replay capabilities of sim-
ulations can be used to demonstrate how an individual managed their problem
solving resources. This can provide important information for the development of
self-regulatory behavior.
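The recording and replay capability described above can be sketched as a minimal event log. The class, actor names, and actions below are hypothetical illustrations of the idea, not any particular simulator's interface.

```python
class PerformanceRecorder:
    """Minimal log of (time, actor, action) tuples recorded during a trial,
    replayable afterwards so trainees can review how they managed their
    problem solving resources."""

    def __init__(self):
        self.log = []

    def record(self, t, actor, action):
        self.log.append((t, actor, action))

    def replay(self, start=0.0, end=float("inf")):
        """Return the slice of the trial between two simulation times."""
        return [entry for entry in self.log if start <= entry[0] <= end]

rec = PerformanceRecorder()
rec.record(12.0, "operator", "acknowledged alarm")
rec.record(45.0, "supervisor", "requested status")
rec.record(90.0, "operator", "switched to backup pump")
first_minute = rec.replay(end=60.0)  # material for the feedback session
```

Replaying a bounded slice of the trial gives the team a concrete, time-anchored basis for the self-regulatory discussion the text describes.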
3.4.1. Summary
While the above discussion highlights some of the most important knowledge, skills, and abilities for training in problem solving, it is certainly not exhaustive. A number
of other articles provide a discussion of additional knowledge, skills, and abilities
that could aid in the training of problem solving teams performing in naturalistic
decision making environments (Cannon-Bowers & Bell, 1997; Cannon-Bowers,
Tannenbaum, Salas, & Volpe, 1995). Table 1 provides a summary of implications
for training problem solving teams via simulation for each of the skills discussed in
the previous section.
Table 1
Implications of simulation for training team problem solving skills

Information processing skills (encoding, storage, retrieval)
- Simulation can provide a shared environment which fosters similar templates
- Similar experiences within a simulation can enable team members to have consistent knowledge organizations that lead to the development of common goals
(Cannon-Bowers & Bell, 1997; Cannon-Bowers et al., 1990; Oser, Cannon-Bowers, Dwyer, & Salas, 1997)

Situation awareness (cue recognition, template recognition)
- Simulation is an excellent environment in which to receive practice
- Simulation can provide many more trials than would be possible in a natural setting
- Simulation can highlight specific patterns
(Cannon-Bowers & Bell, 1997; Klein, 1995; Means et al., 1995; Noble et al., 1987)

Problem solving skills (domain specific skills)
- Simulation can provide a safe setting to practice problem solving in complex dynamic environments
- By providing multiple practice opportunities, simulation can accelerate team member proficiency
- Opportunities for feedback can be established in simulation environments
(Cannon-Bowers & Bell, 1997; Dorner & Schaub, 1994; Means et al., 1995; Oser, Cannon-Bowers, Dwyer, & Miller, 1997)

Monitoring (detecting faults, metacognition)
- Simulation can be used to provide examples of normal system states
- Team members are able to practice self-regulating behaviors within a simulation environment
(Cannon-Bowers & Bell, 1997; Cohen et al., 1992; Orasanu, 1990)
analyses of training needs (Goldstein, 1993; Wong & Raulerson, 1974). In a review of ISD, Whitmore (1981) stated that it focuses on: (1) developing effective performance; (2) learning required skills; and (3) identifying what the trainee must do to learn a skill. ISD is primarily behavioral in orientation; that is, ISD focuses on learning skills rather than the development of knowledge (Wong & Raulerson, 1974).
Assessment centers, in contrast, were developed for the purpose of observing and evaluating complex human behaviors, such as leadership ability (McKinnon, 1975), primarily to support selection. This methodology presents individuals with situational tests designed to provide an opportunity to demonstrate a specific skill. A review by Klimoski and Brickner (1987) found that assessment centers are highly flexible and provide valuable information for evaluating performance in a variety of domains. Due to these, and other, characteristics (e.g. high predictive validity), assessment centers are used to make selection decisions, determine individual training needs, and facilitate human resource planning (Cascio, 1991). Unfortunately, assessment centers often lack a formal structure on which to base judgments. Gaugler and Thornton (1989) found that raters utilized only a small subset of the information available to them to make the ratings. A second potential
Effective training begins with a clear understanding of the needs of the training participants. The EBAT starts with the specification of learning objectives associated with domain relevant tasks, conditions, and standards. For purposes of this
discussion, a learning objective is defined as a statement that clearly describes what is to be acquired during a training opportunity. Depending on the training participants and task requirements, learning objectives can be associated with a specific task (e.g. select a construction site for a new distribution center) or general competencies required across a number of tasks (e.g. problem solving, planning, communication).
The capabilities described below are the end goals of training and provide a basis for specifying the knowledge, skills and abilities that need to be trained and supported within a simulation. Technically, the capabilities discussed in this section are those that an instructional designer would want to specify as learning objectives. These capabilities provide guidance for the development of an integrated framework for an EBAT system (e.g. scenario events, measurement tools, feedback format).
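A learning objective stated in terms of a task, its conditions, and a standard might be represented as a simple record. The structure below is an assumption for illustration; the construction-site task comes from the text, while the conditions, standards, and second objective are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class LearningObjective:
    """What is to be acquired: a task, the conditions, and a standard."""
    task: str        # a specific task or a general competency
    conditions: str  # circumstances under which it must be performed
    standard: str    # how successful acquisition is judged

objectives = [
    LearningObjective(
        task="select a construction site for a new distribution center",
        conditions="incomplete cost data, time pressure",
        standard="chosen site satisfies all mandatory criteria",
    ),
    LearningObjective(
        task="communication",
        conditions="distributed team, degraded channels",
        standard="all members acknowledge plan changes",
    ),
]
```

Making conditions and standards explicit fields keeps each objective testable, which is what later links it to scenario events and measures.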
To train problem solving teams it is first necessary to determine the characteristics of naturalistic decision making environments (Cannon-Bowers & Bell, 1997). Orasanu and Connolly (1995) listed eight factors that were important to consider in naturalistic settings: (1) ill-structured problems; (2) dynamic environments; (3) shifting or competing goals; (4) feedback loops; (5) time stress; (6) high stakes; (7) multiple players; and (8) overriding organizational goals. These factors delineate the conditions under which a problem solving team must perform. Using the factors that Orasanu and Connolly (1995) identified as characteristics of naturalistic environments, Cannon-Bowers and Bell (1997) identified a set of capabilities that expert problem solvers would be expected to possess (i.e. flexibility, quickness, resilience, adaptability, risk taking, and accuracy).
Expert problem solvers need to be flexible to cope with the complex, dynamic and ill-structured environment in which they find themselves (Orasanu & Connolly, 1995). As the environment changes, goals shift, potentially requiring changes in strategy (Kauffman, 1995). The need for flexibility is further taxed by the need to address the perspectives of each of the team members in the implementation of a solution (Vroom & Yetton, 1973).
In addition to being flexible, problem solvers need to be quick. Time pressure is
one of the primary factors in naturalistic environments and limits the ability of the
team to conduct analytical procedures (e.g. multi-attribute utility analysis) (Zakay &
Wooler, 1984). The presence of feedback loops and dynamic environments also
impacts the tempo at which a decision must be made (Orasanu & Connolly, 1995).
Because of the potentially severe consequences of inaccurate alternative selections and severe time pressure, problem solvers must be resilient (Cannon-Bowers & Bell, 1997). These stressors, along with the necessity of handling ill-structured problems in dynamic environments, require that the problem solver be capable of operating under these conditions without degradation in performance (Means et al., 1995).
Each team must optimize organizational goals as well as personal requirements
(Hackman, 1987). These factors make adaptation particularly difficult (Orasanu & Connolly, 1995). On top of the structural characteristics of the team (i.e. organizational goals, multiple players) that obligate adaptation are the task characteristics
(i.e. dynamic environment, shifting goals, feedback loops) that place demands on the
problem solver (Cannon-Bowers & Bell, 1997). Adaptation by the problem solver is
necessary to ensure survival as well as expand the problem solver's behavioral
repertoire (Holland, 1993).
Because of the nature of the tasks that a problem solver is required to perform, a
certain level of risk taking is required (Cannon-Bowers & Bell, 1997). Orasanu and
Connolly (1995) have suggested that problem solvers mitigate some of this risk by
drawing on their considerable knowledge of the problem domain. This knowledge base allows the expert problem solver to categorize ill-structured problems more efficiently than a novice in the domain (Glaser, Lesgold, & Gott, 1991).
The final capability that Cannon-Bowers and Bell (1997) identified as being critical to problem solving in naturalistic environments is accuracy. Because of the high stakes involved, problem solvers do not become experts unless they are able to accurately predict future states of the system. Performance on any given situational trial impacts the ability of the problem solver to learn, which in turn impacts performance on future trials (Holland, 1993).
A summary of the relationship between the problem solving capabilities identified by Cannon-Bowers and Bell (1997) and the eight characteristics of naturalistic environments specified by Orasanu and Connolly (1995) is presented in Table 2.
Table 2
Impact of naturalistic decision making characteristics on problem solving capabilities
(Columns: flexibility, quickness, resilience, adaptability, risk taking, accuracy)
Ill-structured problems + + +
Dynamic environments + + + +
Shifting goals + +
Feedback loops + +
Time stress + +
High stakes + + +
Multiple players + +
Organizational goals +
associated with how a given outcome was reached. The nature of performance in
naturalistic decision making environments further increases the need to measure
processes. This is because the team interacts with the dynamic and ill-structured
environment in which it is embedded, and it is necessary to have measures that will
enable observers to record the team's behaviors.
A second impact of the naturalistic decision making paradigm on the specification of MOEs is related to the types of tools that are utilized to assess team processes and performance. While structured environments allow for highly proceduralized checklists, the very nature of naturalistic environments limits the ability to assess performance using standard operating procedures (i.e. different methods can result in effective performance). This has implications for training observers to evaluate the construct of interest. Finally, because naturalistic decision making environments involve shifting goals and feedback loops, there is a need for longitudinal observation.
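One way to build a process measure that tolerates different-but-effective methods is an event-based checklist of acceptable responses, in the spirit of the event-based measurement work cited above (Dwyer et al., 1997). The event, responses, and scoring rule below are hypothetical illustrations.

```python
# For each trigger event, list the set of acceptable responses; observers
# simply mark which occurred, rather than enforcing one fixed procedure.
EXPECTED = {
    "contact changes course": {"report to supervisor", "update track",
                               "re-plan route"},
}

def process_score(event, observed_responses):
    """Fraction of acceptable responses to the event that were observed."""
    hits = EXPECTED[event] & observed_responses
    return len(hits) / len(EXPECTED[event])

score = process_score("contact changes course",
                      {"report to supervisor", "update track"})
```

Because any subset of acceptable behaviors counts toward the score, the measure captures the team's process without demanding one standard operating procedure.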
4.1.8. Summary
Because of the linkages in the EBAT, performance can be traced directly back to specific learning objectives via events and performance measures. Performance related to a given objective can then be assessed and fed back to the training participants. The linkages are critical, and therefore must not be viewed as a set of options, but rather as a list of requirements for effective training. If properly implemented, the EBAT has the potential to establish an effective learning environment for team training in simulated environments. Table 3 summarizes the implications of an EBAT for training the team problem solving skills required in naturalistic decision making environments.
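The EBAT linkage just described, in which observed performance traces back to a learning objective via an event and a measure, can be sketched as linked records. The objective names, events, measures, and values below are hypothetical.

```python
# Each scenario event is linked to a learning objective and a measure, so
# observed performance traces directly back to the objective it exercises.
links = [
    {"objective": "situation assessment", "event": "ambiguous contact appears",
     "measure": "time to correct classification (s)", "observed": 42.0},
    {"objective": "communication", "event": "radio channel degraded",
     "measure": "acknowledgement rate", "observed": 0.8},
]

def feedback_for(objective):
    """Collect every measured result linked to one learning objective."""
    return [(l["event"], l["measure"], l["observed"])
            for l in links if l["objective"] == objective]
```

Querying by objective is what lets feedback be organized around what was supposed to be learned rather than around raw scenario chronology.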
Table 3
Implications of event-based approach to training (EBAT) and naturalistic decision making for training problem solving teams

Learning objectives
- Trainer should utilize expert problem solver capabilities to set learning objectives
- Simulation should try to mimic the naturalistic environment and exercise team problem solving skills
(Cannon-Bowers & Bell, 1997; Oser, Cannon-Bowers, Dwyer, & Miller, 1997; Oser, Cannon-Bowers, Dwyer, & Salas, 1997)

Trigger events
- A naturalistic decision making framework provides a means for manipulating scenario events
- Scenario designers should use the characteristics of naturalistic decision making environments to ensure that the number and type of trigger events selected accurately reflect real world situations
(Cannon-Bowers & Bell, 1997; Orasanu & Connolly, 1995; Oser, Cannon-Bowers, Dwyer, & Miller, 1997; Oser, Cannon-Bowers, Dwyer, & Salas, 1997)

MOE specification
- Care should be taken in developing measures for naturalistic environments because they are characteristically ill-structured
- Naturalistic environments limit the utility of highly procedural measurement tools and, therefore, such tools should not be used
(Dwyer et al., 1997; Johnston et al., 1997; Oser, Cannon-Bowers, Dwyer, & Miller, 1997; Oser, Cannon-Bowers, Dwyer, & Salas, 1997)

Scenario generation
- When the learning objectives relate to problem solving, greater attention should be paid to issues of simulator fidelity than for more behavioral learning objectives
- Scenario designers should use a naturalistic decision making framework to classify and evaluate the difficulty of individual events
(Cannon-Bowers & Bell, 1997; Oser, Cannon-Bowers, Dwyer, & Miller, 1997; Oser, Cannon-Bowers, Dwyer, & Salas, 1997)

Exercise management
- Greater attention needs to be provided to exercise management when the scenario possesses the characteristics of a naturalistic environment
- Due to their complexity, simulations that attempt to represent naturalistic
(Oser, Cannon-Bowers, Dwyer, & Miller, 1997; Oser, Cannon-Bowers, Dwyer, & Salas, 1997)

Data collection
- Observers that collect data within a naturalistic environment should possess tools that help them maintain an awareness of the big picture
- Simulation should possess some form of automatic data capture
(Dwyer et al., 1997; Johnston et al., 1997; Oser, Cannon-Bowers, Dwyer, & Miller, 1997; Oser, Cannon-Bowers, Dwyer, & Salas, 1997)

AAR generation
- Because naturalistic decision making environments are so complex, feedback should be provided in as timely a fashion as possible to foster learning
- Simulation should be utilized during feedback sessions to help the trainee visualize the event
(Means et al., 1995; Schneider, 1985)

Data management and archival
- Archival tools should employ meta-tags that categorize scenario events using a naturalistic decision making framework
- Archived databases should include prior performance information to help determine normative values for scenario events
(Oser, Cannon-Bowers, Dwyer, & Miller, 1997; Oser, Cannon-Bowers, Dwyer, & Salas, 1997)

MOE: measure of effectiveness. AAR: after-action review.
5. Conclusions
As the cost of preparing teams for the work place increases, the use of simulations
for training is likely to become more prevalent. For many domains (e.g. military,
emergency response teams, commercial airlines, nuclear power plants), simulation will become the training environment of choice. While technological improvements to training systems are likely to continue, they alone cannot ensure that trainees will possess improved competencies (i.e. knowledge, skills, abilities) after completing training. Additional work in the development of learning strategies, methods, and tools is needed to maximize the benefit of using simulation for training.
This article has sought to provide a framework with which to develop these
learning strategies, methods, and tools for the training of problem solving teams. It
should be noted, however, that most of the research that served as the basis for this
article is theoretical, and empirical assessment of its propositions is still required.
While both the naturalistic decision making and EBAT frameworks have tremendous
potential for impacting the training of problem solving teams, until an evaluation of
their tenets is conducted, the conclusions derived from them remain conjectural.
In addition to providing empirical support for the use of EBAT to train
problem solving for naturalistic decision making environments, further theoretical
work remains on the identification of learning objectives, the development of
scenario events and exercises, the measurement of performance, and the application
of appropriate feedback within a training simulation. Other improvements
could be achieved by addressing training needs during the technical development of
simulations to ensure that the overall architecture, as well as the models, protocols, and
algorithms, all support learning. The establishment of effective learning environments
is an important first step toward fully utilizing simulation to support training.
In summary, while emerging technologies have considerable potential for training,
technology itself will not result in effective learning environments. Effective learning
environments require the integration of technologies with theoretically sound and
empirically demonstrated learning strategies, tools, and methods.
References
Amalberti, R., & Deblon, F. (1992). Cognitive modeling of fighter aircraft's process control: A step
towards an intelligent on-board assistance system. International Journal of Man-Machine Studies, 36,
639–671.
Baxter, G. P., & Glaser, R. (1997). An approach to analyzing the cognitive complexity of science performance
assessments (CSE Technical Report 452). Los Angeles, CA: University of California, Graduate
School of Education and Information Studies.
Beach, L. R., & Mitchell, T. R. (1987). Image theory: Principles, goals and plans. Acta Psychologica, 66,
201–220.
Bell, H. H. (1996). The engineering of a training network. Proceedings of the 7th International Training
Equipment Conference (pp. 365–370). Arlington, VA: ITED Ltd.
Bettenhausen, K., & Murnighan, J. K. (1985). The emergence of norms in competitive decision-making
groups. Administrative Science Quarterly, 30, 350–372.
Bisseret, A. (1981). Application of signal detection theory to decision making in supervisory control: The
effect of the operator's expertise. Ergonomics, 24(2), 81–94.
Cannon-Bowers, J. A., & Bell, H. H. (1997). Training decision makers for complex environments: Implications
of the naturalistic decision making perspective. In C. Zsambok, & G. Klein, Naturalistic decision
making (pp. 99–110). Mahwah, NJ: Erlbaum.
Cannon-Bowers, J. A., & Salas, E. (1997). A framework for developing team performance measures in
training. In M. Brannick, E. Salas, & C. Prince, Team performance assessment and measurement: Theory,
research and applications (pp. 15–62). Mahwah, NJ: Erlbaum.
Cannon-Bowers, J. A., Salas, E., & Converse, S. (1990). Cognitive psychology and team training: Training
shared mental models of complex systems. Human Factors Society Bulletin, 33(12), 1–4.
Cannon-Bowers, J. A., Salas, E., & Converse, S. (1993). Shared mental models in expert team
decision making. In N. Castellan, Individual and group decision making (pp. 221–246). Hillsdale, NJ:
Erlbaum.
Cannon-Bowers, J. A., Tannenbaum, S. I., Salas, E., & Volpe, C. E. (1995). Defining team competencies
and establishing team training requirements. In R. Guzzo, & E. Salas, Team effectiveness and decision
making in organizations (pp. 333–380). San Francisco, CA: Jossey-Bass.
Cascio, W. F. (1991). Applied psychology in personnel management. Englewood Cliffs, NJ: Prentice Hall.
Cellier, J. M., Eyrolle, H., & Marine, C. (1997). Expertise in dynamic environments. Ergonomics, 40(1),
28–50.
Chase, W. G., & Simon, H. A. (1973). Perception in chess. Cognitive Psychology, 4, 55–81.
Chi, M., Bassok, M., Lewis, M., Reimann, P., & Glaser, R. (1989). Self-explanations: How students
study and use examples in learning to solve problems. Cognitive Science, 13, 145–182.
Chi, M., Feltovich, P. J., & Glaser, R. (1981). Categorization and representation of physics problems by
experts and novices. Cognitive Science, 5, 121–152.
Cohen, M. S., Freeman, J., Wolf, S., & Militello, L. (1995). Training metacognitive skills in Naval combat
decision making. Arlington, VA: Cognitive Technologies, Inc.
Dipboye, R. L. (1997). Organizational barriers to implementing a rational model of training. In M. A.
Quiñones, & A. Ehrenstein, Training for a rapidly changing workplace: Applications of psychological
research (pp. 31–60). Washington, DC: American Psychological Association.
Dorner, D., & Schaub, H. (1994). Errors in planning and decision making and the nature of human
information processing. Applied Psychology: An International Review, 43(4), 433–453.
Druckman, D., & Bjork, R. A. (1991). Modeling expertise. In D. Druckman, & R. Bjork, In the mind's
eye: Enhancing human performance (pp. 57–79). Washington, DC: National Academy Press.
Dwyer, D. J., Fowlkes, J. E., Oser, R. L., Salas, E., & Lane, N. E. (1997). Team performance
measurement in distributed environments: TARGETS methodology. In M. Brannick, E. Salas, & C.
Prince, Team performance assessment and measurement: Theory, research and applications (pp. 137–153).
Mahwah, NJ: Erlbaum.
Dwyer, D. J., Oser, R. L., & Fowlkes, J. E. (1995). A case study of distributed training and training
performance. In Proceedings of the Human Factors and Ergonomics Society 39th annual meeting (Vol. 4,
pp. 1316–1320). Santa Monica, CA: Human Factors and Ergonomics Society.
Fowlkes, J. E., Dwyer, D. J., Oser, R. L., & Salas, E. (1999). Event-based approach to training (EBAT).
International Journal of Aviation Psychology (in press).
Gaugler, B. B., & Thornton, G. C. (1989). Number of assessment center dimensions as a determinant of
assessor accuracy. Journal of Applied Psychology, 74, 611–618.
Glaser, R. (1986). Training expert apprentices. In I. Goldstein, R. Gagne, R. Glaser, J. Royer, T. Shuell,
& D. Payne, Learning research laboratory: Proposed research issues (AFHRL-TP-85-54). Brooks Air
Force Base, TX: Manpower and Personnel Division, Air Force Research Laboratory.
Glaser, R. (1989). Expertise and learning: How do we think about instructional processes now that we
have discovered knowledge structure? In D. Klahr, & K. Kotovsky, Complex information processing: The
impact of Herbert A. Simon (pp. 269–282). Hillsdale, NJ: Erlbaum.
Glaser, R., & Chi, M. T. (1988). Overview of expertise. In M. Chi, R. Glaser, & M. Farr, The nature of
expertise. Hillsdale, NJ: Erlbaum.
Glaser, R., Lesgold, A., & Gott, S. (1991). Implications of cognitive psychology for measuring job
performance. In A. Wigdor, & B. Green, Performance assessment for the workplace II (pp. 1–26).
Washington, DC: National Academy Press.
Goldstein, I. L. (1993). Training in organizations (3rd ed.). Pacific Grove, CA: Brooks/Cole Publishing.
Gualtieri, J. W., Parker, C., & Zaccaro, S. (1995). Group decision-making: An examination of decision
processes and performance. Paper presented at the Annual Meeting of the Society for Industrial and
Organizational Psychology, Orlando, FL.
Hackman, J. R. (1987). Design of work teams. In J. Lorsch, Handbook of organizational behavior.
Englewood Cliffs, NJ: Prentice-Hall.
Hirokawa, R. Y., & Johnston, D. D. (1989). Toward a general theory of group decision making: Development
of an integrated model. Small Group Behavior, 20(4), 500–523.
Holland, J. H. (1993). Adaptation in natural and artificial systems. Cambridge, MA: MIT Press.
Huber, G. P. (1980). Managerial decision making. Glenview, IL: Scott Foresman.
Johnston, J. H., Smith-Jentsch, K. A., & Cannon-Bowers, J. A. (1997). Performance measurement tools
for enhancing team decision-making training. In M. Brannick, E. Salas, & C. Prince, Team performance
assessment and measurement: Theory, research and applications. Mahwah, NJ: Erlbaum.
Kauffman, S. (1995). At home in the universe: The search for the laws of self-organization and complexity.
New York: Oxford University Press.
Kerr, N. L. (1981). Social transition schemes: Charting the group's road to agreement. Journal of
Personality and Social Psychology, 41(4), 684–702.
Klein, G. A. (1989). Recognition-primed decisions. In W. Rouse, Advances in man-machine systems
research (Vol. 5, pp. 47–92). Greenwich, CT: JAI Press.
Klein, G. A. (1995). A recognition-primed decision (RPD) model of rapid decision making. In G. Klein,
J. Orasanu, R. Calderwood, & C. Zsambok, Decision making in action: Models and methods (pp. 138–
147). Norwood, NJ: Ablex.
Klimoski, R., & Brickner, M. (1987). Why do assessment centers work? The puzzle of assessment center
validity. Personnel Psychology, 40, 243–260.
Lederman, L. C. (1992). Debriefing: Toward a systematic assessment of theory and practice. Simulation
and Gaming, 23(2), 145–160.
Lipshitz, R. (1995). Converging themes in the study of decision making in realistic settings. In G. Klein,
J. Orasanu, R. Calderwood, & C. Zsambok, Decision making in action: Models and methods (pp. 103–
137). Norwood, NJ: Ablex.
MacKinnon, D. W. (1975). Assessment centers then and now. Assessment and Development, 2, 8–9.
Means, B., Salas, E., Crandall, B., & Jacobs, T. O. (1995). Training decision makers for the real world. In
G. Klein, J. Orasanu, R. Calderwood, & C. Zsambok, Decision making in action: Models and methods
(pp. 306–326). Norwood, NJ: Ablex.
Meister, D. (1985). Behavioral analysis and measurement methods. New York: Wiley.
Meliza, L. L., Bessemer, D. W., & Hiller, J. H. (1994). Providing unit training feedback in the distributed
interactive simulation environment. In R. F. Holz, J. H. Hiller, & H. H. McFann, Determinants of
effective unit performance: Research on measuring and managing unit training readiness (pp. 257–280).
Alexandria, VA: US Army Research Institute for the Behavioral and Social Sciences.
Mintzberg, H., Raisinghani, D., & Theoret, A. (1976). The structure of "unstructured" decision processes.
Administrative Science Quarterly, 21, 246–275.
Noble, D. F. (1989). Applications of a theory of cognition to situation assessment. Vienna, VA: Engineering
Research Associates.
Noble, D. F. (1995). A model to support development of situation assessment aids. In G. Klein, J. Orasanu,
R. Calderwood, & C. Zsambok, Decision making in action: Models and methods (pp. 287–305). Norwood,
NJ: Ablex.
Noble, D. F., Grosz, C., & Boehm-Davis, D. (1987). Rules, schema and decision making. Vienna, VA:
Engineering Research Associates.
O'Neil, H. F., Jr., Allred, K., & Baker, E. L. (1997). Review of workforce readiness theoretical frameworks.
In H. F. O'Neil, Jr., Workforce readiness: Competencies and assessment. Mahwah, NJ: Lawrence
Erlbaum.
Orasanu, J. (1990). Shared mental models and crew decision making (Technical Report No. 46). Princeton,
NJ: Princeton University, Cognitive Sciences Laboratory.
Orasanu, J., & Connolly, T. (1995). The reinvention of decision making. In G. Klein, J. Orasanu,
R. Calderwood, & C. Zsambok, Decision making in action: Models and methods. Norwood, NJ:
Ablex.
Oser, R. L., Cannon-Bowers, J. A., Dwyer, D. J., & Miller, H. (1997). An event based approach for
training: Enhancing the utility of joint service simulations. Paper presented at the 65th Military
Operations Research Society Symposium. Quantico, VA: Office of Naval Research.
Oser, R. L., Cannon-Bowers, J. A., Dwyer, D. J., & Salas, E. (1997). Establishing learning environments
for JSIMS: Challenges and considerations. In Proceedings of the 19th Interservice/Industry Training
Systems and Education Conference (pp. 141–155). Orlando, FL: National Security Industrial Association.
Patel, V. L., & Groen, G. L. (1991). The general and specific nature of medical expertise: A critical look. In
A. Ericsson, & J. Smith, Toward a general theory of expertise (pp. 93–125). Cambridge: Cambridge
University Press.
Rankin, W. J., Gentner, F. C., & Crissey, M. J. (1995). After action review and debriefing methods:
Technique and technology [CD-ROM]. In Proceedings of the 17th Interservice/Industry Training Systems
and Education Conference (pp. 252–261). Orlando, FL: National Security Industrial Association.
Rasmussen, J. (1983). Skills, rules, knowledge, signals, signs, symbols and other distinctions in human
performance modeling. IEEE Transactions on Systems, Man and Cybernetics, 13(3), 257–267.
Salas, E., & Cannon-Bowers, J. A. (1997). Methods, tools and techniques for team training. In M.
Quiñones, & A. Ehrenstein, Training for a rapidly changing workplace: Applications of psychological research
(pp. 249–279). Washington, DC: APA Press.
Schneider, W. (1985). Training high performance skills: Fallacies and guidelines. Human Factors, 30, 539–
566.
Schvaneveldt, R., Durso, F., Goldsmith, T., Breen, T., Cooke, N., Tucker, R., & DeMaio, J. (1985).
Measuring the structure of expertise. International Journal of Man-Machine Studies, 23, 699–728.
Stasser, G., & Davis, J. H. (1981). Group decision making and social influence: A social interaction
sequence model. Psychological Review, 88(6), 523–551.
Vroom, V., & Yetton, P. W. (1973). Leadership and decision-making. Pittsburgh, PA: University of
Pittsburgh Press.
Whitmore, P. G. (1981). The "whys" and "hows" of modern instructional technology. National Society
for Performance and Instruction Journal, 9–13.
Wickens, C. D. (1984). Engineering psychology and human performance. Columbus, OH: Merrill.
Wong, M. R., & Raulerson, J. D. (1974). A guide to systematic instructional design. Englewood Cliffs, NJ:
Educational Technology Publications.
Woods, D. D. (1988). Coping with complexity: The psychology of human behavior in complex systems. In
L. Goodstein, H. Anderson, & S. Olsen, Tasks, errors and mental models. London: Taylor and Francis.
Zakay, D., & Wooler, S. (1984). Time pressure, training, and decision effectiveness. Ergonomics, 27, 273–
284.