TECHNOLOGICAL FORECASTING AND SOCIAL CHANGE 38, 117-134 (1990)

Crisis Management
Managing Paradox in a Chaotic World

THIERRY C. PAUCHANT and IAN I. MITROFF

ABSTRACT

This paper extends the recent work of the authors in the field of crisis management (CM) [1]. We explore
two related phenomena that seriously compromise the effectiveness of CM plans and procedures: (a) vicious
circles that are the result of unexamined and unintended human interventions in complex systems and (b) the
“contextual nature” of complex sociotechnical systems themselves. Major crises threaten both the “structural”
and the “affective” domains of complex systems. Two general strategies for coping with complex crises are
proposed. Each strategy not only recognizes the existence of a different domain but also proposes a method for
treating it.

Introduction
In the past five years especially, the field of crisis management (CM) has undergone
a marked and rapid growth. Recent data indicate that about 50% of large U.S. companies
have started some program of CM [1, 2]. In the field of management, a number of books
have been published recently on the subject [2-10]. In addition, the number of articles
dealing with CM has skyrocketed. Of the 281 articles published until mid-1988, 90%
were published in the 1980s, and 60% since 1986 [11].
In spite of, or perhaps even because of these efforts, a fundamental controversy
exists. While some authors stress that foresight and planning are key to a successful
program of CM [2, 8, 12], others question their effects. For example, Quarantelli [13]
has argued that planning per se does not necessarily guarantee successful CM in and of
itself; Khandwalla [14] has described some of the negative effects that result from CM
planning; Lagadec [lo] and Denis [15] have discussed the challenges to planning that
arise from the management of major technological accidents; Weick [16] has stressed
that actions undertaken to alleviate crises often precede our total understanding of the
situation and thus may actually exacerbate it; Mitroff [17] has argued that some complex
problems such as those we find in CM may have no final, definitive answer or solution
whatsoever; Shrivastava and his colleagues [18] have argued that planning efforts for
crises often do not eliminate their underlying causes. Furthermore, based on responses

THIERRY C. PAUCHANT is Assistant Professor of Strategic Management at the School of Business
Administration, Laval University, Quebec City, Canada, and a research scientist at the Center for Crisis
Management, University of Southern California, Los Angeles. IAN I. MITROFF is the Harold Quinton
Distinguished Professor of Business Policy and Director of the Center for Crisis Management, Graduate School
of Business, University of Southern California, Los Angeles.
Address reprint requests to Professor Thierry C. Pauchant, School of Administrative Sciences, Laval
University, Quebec, Quebec, G1K 7P4, Canada.

© 1990 by Elsevier Science Publishing Co., Inc. 0040-1625/90/$3.50



from 114 of the Fortune 1000 companies, Pauchant [19, p. 111] has suggested that
corporate efforts in the area of CM do not currently seem to have a significant effect on
the reduction of crisis. Indeed, a positive correlation has been observed between the use
of CM planning efforts and the number of crises reported by the responding companies.
If CM planning were effective, one would have expected to find a negative correlation,
i.e., that greater planning resulted in a lower frequency of crises.¹
Such pessimistic views of CM could be explained by two phenomena. First, a number
of authors have commented on the “side effects” of human interventions in social systems.
These negative effects have been called “iatrogenic” in the medical sciences, “ultrasolutions”
in communication theory, and “vicious circles” in cybernetics [20-22]. In such
cases, the effects of the interventions are actually seen to worsen the situation even though
they were originally intended to better it. It is thus not enough to speak in terms of
“problem shifting” rather than “problem solving,” i.e., transferring an initial problem in
one area to another [23]. In some cases, it is more appropriate to speak in terms of
“problem spreading,” because efforts, however well intended they may be, actually expand
the scope or the degree of the problem, thus worsening the situation from a global or
systemic perspective. Second, social systems often exhibit “counterintuitive behavior”
[24]. Echoing Lévi-Strauss [25] and Durkheim [26], who argued that a social system has
a dynamic of its own, a number of authors have stressed that crises are part of the
“normal behavior” of complex systems and are resistant to traditional planning efforts
[6, 27, 28].
The potential presence and applicability of the two phenomena mentioned in the
previous paragraph, particularly in the area of CM, have led us to supplement the
traditional view of planning with two additional generic strategies if CM is to prove
successful in reducing the occurrence of further crises and their impacts. These additional
efforts are intended to reduce (a) the likelihood that human interventions will generate
vicious circles and (b) the dynamics of the system itself. In this paper, we describe these
two phenomena in some detail, applying them to the particular case of the Bhopal disaster.
We argue that future efforts in CM need to adopt both strategies if they are to stand any
chance of proving effective.

Vicious Circles as Side Effects


Organizational systems have been defined as networks of individual actions inter-
acting with each other, to create loops of actions [21, 29-32]. “Feedback loops” are
action loops that sustain or destabilize the equilibrium of a system. Such feedback loops
have been called “vicious circles” [20] or “deviation-amplifying loops” [33] when they
are not regulated, i.e., when they grow without bound, thus leading to chaos and eventually
to the destruction of the system itself. In this case, once a variable in a loop continues
moving in one direction, either positively or negatively, it will do so until the system
collapses [34]. Merton’s example of “self-fulfilling prophecies” [35] is a classic example
of the vicious circle in the area of social behavior. While analyzing a banking system
during a crisis, he observed that the withdrawal of money from the bank by nervous
clients further endangered the liquidity of the bank, influencing more clients to withdraw
money, leading the bank into bankruptcy. More recently, Hall [36] and Masuch [37]
have provided other examples of vicious circles in organizations.
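Merton’s bank-run loop can be sketched as a small simulation (an illustrative model of ours, not from the paper; the parameter names and values are hypothetical): withdrawals drain liquidity, falling liquidity amplifies panic, and panic drives further withdrawals. The loop is a deviation-amplifying circle precisely when the amplification is unregulated.

```python
# A minimal sketch of a deviation-amplifying loop in Merton's bank-run
# example. All parameters are hypothetical illustrations.

def simulate_bank_run(liquidity=100.0, panic=0.01, amplification=1.8, max_steps=50):
    """Return the step at which liquidity is exhausted, or max_steps if it survives."""
    for step in range(1, max_steps + 1):
        withdrawal = liquidity * panic           # nervous clients withdraw,
        liquidity -= withdrawal                  # which drains liquidity,
        panic = min(1.0, panic * amplification)  # which amplifies the panic.
        if liquidity < 1.0:                      # the system collapses
            return step
    return max_steps
```

With amplification greater than 1 the loop grows without bound and the bank collapses within a few steps; with amplification at or below 1 the same loop damps out and the equilibrium survives, which is the distinction the text draws between regulated and unregulated feedback loops.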

¹One explanation for the positive correlation that was observed is that for the most part crisis programs
are new; they have been initiated largely after the experience of a series of crises and thus they have not been
in place long enough to decrease both the rate and the frequency of future crises.
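The footnote’s point can be illustrated with a toy simulation (our construction; the firm counts, crisis rates, and adoption rule are hypothetical, not the survey data): when firms adopt CM programs only after experiencing crises, program adoption correlates positively with total reported crises even though the programs reduce the subsequent crisis rate.

```python
# Hedged illustration: reverse causation can produce the observed positive
# correlation between CM programs and reported crises. All numbers hypothetical.
import random

def simulate_firms(n=1000, seed=42):
    random.seed(seed)
    rows = []  # (has_program, total_crises_reported)
    for _ in range(n):
        base_rate = random.uniform(0.0, 0.6)        # firm's crisis proneness
        early = sum(random.random() < base_rate for _ in range(5))
        has_program = early >= 2                    # adopt CM only after crises
        rate = base_rate * (0.5 if has_program else 1.0)  # programs do help
        late = sum(random.random() < rate for _ in range(5))
        rows.append((has_program, early + late))
    return rows

def mean_crises(rows, with_program):
    counts = [c for p, c in rows if p == with_program]
    return sum(counts) / len(counts)

rows = simulate_firms()
```

In this sketch, firms with programs still report more crises on average, because the crisis-prone firms are the ones that adopted programs in the first place, exactly the confound the footnote describes.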

Fig. 1. Original problem. [Diagram: a feedback loop linking product defects and problems in the sociotechnical system.]

Vicious circles are thus introduced through human interventions. In this sense, they
could be viewed as “ultrasolutions,” i.e., the implementation of solutions that negate the
original well-intentioned purpose of the action, as expressed in the medical dictum, “the
operation was successful, but the patient died” [22, p. 9]. NASA provides a more recent
example: in order to lower product defects on the space shuttle, NASA developed a bonus
program designed to encourage employees to report defects. Noticing an increase in the
number of product defects, NASA management suspected later that some employees had
tampered with products in order to collect the bonus. NASA management called on the
FBI to investigate the case (The New York Times, August 26, 1988). In this example we
can witness the operation of various vicious circles. Efforts originally designed and
implemented to produce fewer product defects led in fact to greater defects, i.e., an
ultrasolution.² We should also stress that NASA’s efforts to “solve” the defect problem have
not only “shifted” the problem to another area; they have also potentially “spread” the
problem, as indicated in Figures 1 and 2. In effect, what started as a sociotechnical
problem related to product defects spread to an issue of (suspected) overt tampering and

Fig. 2. Problems after spreading. [Diagram: the original defect loop, now joined by loops involving tampering, sabotage, and the bonus program.]

²Part of the point is that no action, however well intentioned, is good in and of itself; other variables
in the whole system, e.g., the culture of the organization, must be considered so that seemingly good or well-intentioned
“solutions” or actions will not produce perverse effects.

sabotage and further to the issue of the suspicion of employees. At this writing, it is
difficult to evaluate if NASA’s dual interventions will have a positive impact on “solving”
the product defect problem.

Systemic Vicious Circles


Vicious circles can certainly be introduced by human interventions, but some have
argued that they are an inherent characteristic of complex social systems whose behavior
is not readily influenced by human actions [25, 261. This assertion is in apparent con-
tradiction with the definition of a social system as a system that is the result of human
actions. However, if social systems are seen as a complex interrelated network of only
partially predictable individual actions [29, 30], then social systems can develop a “dynamic”
of their own. Bureaucracy is a case in point [38-40]. When the actions of separate
individuals feed into the inherent dynamics of complex systems, then vicious circles can
not only arise, but they can necessitate further human interventions that can in turn
amplify further still the vicious circles set in motion, thereby formalizing or institution-
alizing the process of system response even more. Through the influence of such circles,
bureaucratic systems can become “stuck” or “frozen” [40]. In this sense, the system can
be seen as having developed a dynamic of its own, i.e., its “stuckness.” It should be
stressed that it is not so much the behavior of each individual variable or their intercon-
nections that lead a system to be “stuck.” Rather, it is their resultant properties as total
“field,” i.e., their total context.
Another description of the phenomenon has been provided by Emery and Trist [41]
through the concept of “turbulent fields.” Such processes not only arise “from the inter-
action of the systems, but also from the field itself.” Although different authors have thus
described the phenomena differently, e.g., as “interlocking behaviors” [21], “circular
relationships” [42], or “causal texture” [41], we prefer to describe them as “contextual
relations.” This term, which is borrowed from the “self-psychology” literature [43],
emphasizes that because each variable is embedded in a force field, the context itself
may be more important for the system’s behavior than its individual variables and their
interrelations. It thus becomes useless to speak in terms of strict or pure “cause and
effect” between individual variables, as the variables themselves are “embedded” with
each other in a total or systemic context.
A few authors in the field of CM and disaster research have taken note of the systemic
phenomenon we have been describing. For example, one of the editors of Nuclear Safety
has proposed that system failures are “the result of adding complexity in system design”
(Hagen, 1980, quoted in Perrow [6, p. 73]). While this observation is still grounded in
a traditional view of cause and effect, the overall argument can, however, be seen in a
different perspective. From the perspective of embeddedness, the main problem is not
one of tracing and establishing cause-effect relationships between a complex network of
variables. Rather, it is a problem that arises because of the nature of the whole system
itself.
In his landmark study of high-risk technologies, Perrow [6] has developed a similar
argument. In particular, he has insisted that the relation of a system’s complexity (e.g.,
the number of its nonlinear interactions) with the possibility of tight coupling between
components and subsystems increases the likelihood of accidents at the systems level.
For example, in the case of marine transport [6, pp. 170-231], Perrow has argued that
purely technological solutions have not made a great difference in increasing safety.
Worse yet, they have in some cases increased the likelihood of an accident in the industry.
Similarly, in Perrow’s study of the U.S. space program [6, pp. 257-292], he has stressed

that even the best talent, the most advanced technology, and the best resources could not
overcome the potential for an accident at the level of the whole system itself. In particular,
looking at a number of advanced technologies, Perrow has argued that the combination
of unexpected linkages between variables, limited prior knowledge of their potential
interactions, and the consequences of indirect and delayed information are associated
with the potential for large whole-system accidents, which in turn defeats further under-
standing and learning in the system itself:
We have produced designs so complicated that we cannot anticipate all the possible interactions of the
inevitable failures; we add safety devices that are deceived or avoided or defeated by hidden paths in the
systems. ... In the past, designers could learn from the collapse of a medieval cathedral or the
collision of railroad trains. ... But we seem to be unable to learn from chemical plant explosions or
nuclear plant accidents. We may have reached a plateau where our learning curve is nearly flat. [6, pp.
11-12]

Although Perrow’s analysis still uses a “cause-effect” language or view of the world
between individual variables, he also endorses the perspective that a system failure is
related to its overall complexity and tight coupling between variables, or what we have
called its overall “contextual relation.” The title of his book Normal Accidents reinforces
this view, signaling that “given the system characteristics, multiple and unexpected in-
teraction of failures are inevitable” [6, p. 5].

Accidents and Crises


To state that a circle is “vicious,” leading to a system’s breakdown, depends upon
the viewer’s frame of reference [9, 37, 44]. For example, while the Bhopal disaster was
experienced as a tragic crisis by the local population, the Indian government, and Union
Carbide, it could also be seen in a positive light, having triggered new efforts directed
towards increased safety and prevention in the chemical industry as a whole [45]. Thus,
if circles can be seen as “vicious” by some stakeholders, they can also be seen as “virtuous”
by others [20]. In principle, different stakeholders can and will see complex phenomena
in very different ways, for crisis phenomena are inherently ill-structured [7].
For the same reason, it is no surprise to find that various definitions of crisis have
also been proposed in the literature. For example, Hermann [46] has proposed that crises
should be viewed as threatening higher-priority goals, restricting response time and forcing
surprise on key decision makers. Fink et al. [47] have suggested that a crisis causes
and/or exposes the inadequacy of the repertoire of organizational coping responses. Other
authors have focused on the effects of a crisis in order to avoid all-encompassing or
“syndrome-like” definitions [48]. More fundamental still, others have argued that the
very labeling of an event as a “crisis” depends on the viewer’s basic perception [49, 50].
In this paper, we define a crisis as a disruption that either affects or has the potential
to affect a whole system, thus threatening the very core of its social identity (as explained
below and outlined in Figure 3). Following Perrow [6] in his discussion of turbulence,
we define an incident as a disruption of a unit or a subsystem of a larger system that
potentially affects the functioning of the total system. We define an accident as a disruption
affecting a system as a whole but not necessarily its core social identity. Also, following
Habermas [44] in his discussion on the quality of disturbance, we shall distinguish a
disruption occurring at the structural level, encompassing elements such as the technical
system and the rules and steering processes that are routinized, from one at the social
level, including elements such as the subjective and symbolic meanings informing and
directing action and assuring a sense of identity to the system. By means of this distinction,
we propose that the term social conflict applies when the social structure of a system is

                          SYSTEMS AREA
                    SUBSYSTEM       WHOLE SYSTEM

SYSTEMS  STRUCTURAL  INCIDENT        ACCIDENT
LEVEL    SOCIAL      CONFLICT        CRISIS

Fig. 3. Definition of terms in CM.

disturbed, but not to the point of challenging its basic sense of identity. Thus, the key
element in the definition of a crisis is that it is a disruption that not only affects a system
as a whole, but also has a threatening effect on its basic social identity. In this case, the
disruption becomes an existential crisis for the system as a whole [19, 51], disturbing
its affective domain.³
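The taxonomy above can be encoded as a small lookup (an illustrative reading of Figure 3 on our part, not code from the paper): the systems area affected (subsystem vs. whole system) and the systems level disturbed (structural vs. social) jointly determine which term applies.

```python
# Illustrative encoding of the paper's Figure 3 taxonomy (our reading).
FIGURE_3 = {
    ("subsystem", "structural"): "incident",
    ("whole system", "structural"): "accident",
    ("subsystem", "social"): "conflict",
    ("whole system", "social"): "crisis",
}

def classify(systems_area: str, systems_level: str) -> str:
    """Look up the paper's term for a disruption of the given area and level."""
    return FIGURE_3[(systems_area, systems_level)]
```

On this reading, Bhopal began as an incident (a structural disruption in the MIC unit), became an accident as the disruption reached the whole system, and turned into a crisis once the core social identity of the system was threatened.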
Rare are the incidents that develop into accidents and degenerate further into crises.
For example, based on data provided by Public Citizen [52], it is suggested that out of
the 3,000 incidents encountered in 1987 by the U.S. nuclear industry, only 430 or 14%
led to a temporary shutdown of the facility, thus potentially falling into the category of
“accidents.” Also, the nuclear industry in general has only encountered at the present
time “two” major crises: Three Mile Island and Chernobyl. Yet, this only introduces
another complication, for although the number “two” is small from a frequency standpoint,
it is “large” from a consequence standpoint of societal impacts, as, for instance, measured
in the increase of long-term deaths due to cancer.
Given these definitions, we can now examine an important case analysis of a crisis,
the Bhopal disaster. To document the importance of unsuspected vicious circles and the
inherent behavior of systems, we shall describe this case in some detail.

Contextual Relations Associated with the Bhopal Crisis


We have chosen this particular crisis for two important reasons. It has been
studied and documented extensively. Also, it is considered the worst industrial crisis in
human history. To build this case analysis, we have drawn extensively on Shrivastava’s
detailed account on the subject [9]. We have also drawn on a number of published studies
[2, 16, 18, 28, 45, 50, 53-58]. In addition, the 78 articles published on the subject in

³For example, consider the poisoning of Tylenol. This caused a crisis in the “existential”
sense. The basic purpose for the existence of Tylenol was to “do good” through the alleviation of pain. The
introduction of a foreign substance (cyanide) converted the drug from a vehicle of potential “good” to a vehicle
of “evil.” The point is that a crisis as we have defined it causes a “social flip” in the basic social purpose or
meaning of the product, plant, machine, etc.
Fig. 4. A simplified model of the contextual relations related to the Bhopal accident. [Diagram: network of stakeholders, including UCIL and its MIC unit.]

The New York Times and The Wall Street Journal from January 1987 to February 1988
were examined in detail.
As we shall see, Bhopal first began as an “incident” at the unit level, quickly
degenerated into an “accident” that then affected the total system, finally turning into a
full-blown, multifaceted crisis. For this reason, we distinguish what we call the Bhopal
accident from the Bhopal crisis. To summarize this information, we present in Figure 4
a simplified model of the contextual relations associated with the accident, and in Figure
5, those associated with the crisis.

THE BHOPAL ACCIDENT


As argued previously, it would be simplistic to pinpoint a major “cause” or “causes”
for the accident. In Figure 4, we present a simplified model of the contextual relations
existing between nine central stakeholders or groups of stakeholders prior to the Bhopal
accident.
In 1984, the world chemical industry was highly competitive. Although Union
Carbide was a very important company, in fact the 37th largest in the United States, employing
a total of almost 100,000 employees with plants in 40 countries, it faced strong
pressures to increase its profitability and reduce costs. For example, in 1984, the
firm realized an after-tax profit that was only about 60% of its major competitors. Also,
as part of a major strategic shift, in the late 1970s Union Carbide divested about 40
strategic business units and was considering selling its 50.9% controlling interest in Union
Carbide (India) Ltd. (UCIL).
Like its parent company, UCIL faced a number of difficulties. Although it was the
21st largest company in India and employed more than 10,000 people, the plant at the
time of the accident operated at less than 50% of total capacity. Faced with factors such
as a highly competitive Indian pesticide market, which was itself in decline, and the
strong pressures of its parent company, UCIL tried to take advantage of economies of
scale. For example, it developed its production of MIC, a highly toxic gas used in the
fabrication of pesticides, even though the company lacked industrial expertise in this
area.
At the cultural level, the plant also encountered great turnover of top management,
who were demotivated in part by the potential divestiture of UCIL. In addition,
many of the new managers lacked expertise in the chemical industry. Further still,
UCIL had to cut personnel, lowering even further the morale of the company and
compromising its security and emergency procedures. Meanwhile, the Indian government
had for a long time encouraged the large-scale use and the large-scale production
of pesticides as part of its efforts in the “green revolution” and in developing a stronger
national industrial basis. As such, the government was reluctant to place a heavy safety
burden on industries, fearing a decrease in additional job opportunities. The government
had also encouraged the industrial development of Bhopal and the surrounding
area. Literally overnight, Bhopal was transformed from a feudal and traditional com-
munity in 1959 to a large industrial city. Attracted by jobs, Bhopal’s population grew
from about 100,000 people in 1961 to 670,000 in 1981, a growth rate 300% higher
than the average growth rate in India. The exponential growth in population resulted
in a severe lack of housing facilities and a declining city infrastructure: poor water
supply, inadequate transportation, poor communication, poor education, poor health
facilities, etc. For example, at the time of the accident, there existed in Bhopal only 37
public telephones, 1,800 hospital beds and 300 doctors for a population of 670,000.

The severe housing shortage was also influential in the development of a number of
concentrated slums in the area, thousands of people literally living across the street
from the UCIL plant.
It is in this (simplified) context that the December 2, 1984, Bhopal accident emerged.
The initiating or precipitating incident was a leak of a highly toxic gas, MIC, that
originated in the UCIL MIC unit. This incident was itself contextually related with a
number of different factors and escalated into an accident. For the sake of simplification,
we shall mention here only five of the most important.
First, neither the Indian government nor UCIL’s top management and employees
were knowledgeable about the potential dangers of MIC production. For example, prior
to the incident, inadequately stored MIC and other hazardous materials had been allowed
to accumulate, an emergency plan was nonexistent, and employees lacked proper training
and resources to face an emergency. Also, the Indian government had only 15 inspectors
for the 8,000 plants located in the province, and the two inspectors assigned to UCIL
had no training in chemical engineering. Second, a number of technical malfunctions and
human errors compounded one another. For example, a number of safety valves in the MIC unit
were removed or did not operate during the incident; an emergency refrigeration unit was
under repair and thus not available for use; an audio emergency signal to warn the outside
population was turned off. Third, the immediate population of Bhopal was unaware that
they were in any danger; the general impression was that the plant was involved in the
production of “plant medicine.” Thousands of people living directly across the street from
the plant received the full impact of the toxic gas. Fourth, the local authorities, also
ignorant of the nature of the danger, commanded the population to flee from the city,
not knowing that a better strategy would have been to direct people to lie on the ground
and breathe through a wet cloth, thus alleviating the effects of the gas. Similarly, the
health community, ill-prepared for such an emergency, treated only symptoms, lacking
adequate equipment and resources to treat the full health consequences that developed.
Fifth, and last, the poor infrastructure of the city contributed in general to a relative lack
of effectiveness of emergency efforts.
All of these factors taken together created a total context that contributed to the
death of between 1,800 and 10,000 people and the injury of between 200,000 and 300,000,
depending on which sources of information are used. In addition, incalculable effects
were produced on animals and on the environment. It is impossible to pinpoint specifically
what “caused” the accident as well as what “caused” the death and the injuries of so
many victims. Certainly, the lack of commitment and interest of Union Carbide to UCIL
stands out; the general lack of knowledge and preparation of UCIL’s management and
employees; the technical failures and human errors; the push by the Indian government
to develop Bhopal as an industrial area with its resulting lack of infrastructure and safety
procedures; the fact that thousands of people were living directly across from the plant
and were not knowledgeable about its potential danger; the lack of knowledge and prep-
aration on the part of the local authorities, as well as the health specialists. However, if
these contextually related factors were associated with Bhopal as an accident, then another
set, adding to the complexity of the case, was associated with Bhopal as a
crisis.

THE BHOPAL CRISIS


In Figure 5, we present a simplified model of the contextual relations associated
with Bhopal as a crisis. The figure is organized around four key stakeholders: the Indian
government, Union Carbide, the local population, and various other stakeholders. The

Bhopal crisis both emerged and was played out as the result of various “domains” in
which the stakeholders interacted.
In the first domain, a number of controversies developed between various stake-
holders. These included controversies as to the “exact nature” of the gas released, the
presence of cyanide-a very toxic gas used during war time-being suspected by some
and denied by others; the nature of the long-term effect of the injuries, fluctuating between
no effect and long-term and even intergenerational effects; and the exact death toll of the
accident, with the number of dead fluctuating between 1,800 and 10,000. These contro-
versies were themselves contextually related with a number of other issues. For example,
they were related to the type of treatment to be given to the victims, treatment
for MIC and for cyanide being different from one another; to legal issues such as Union
Carbide’s financial responsibility, which depended in part on the number of dead and
injured as well as on its emergency planning procedures; and to the difficulty
of assessing the Indian government’s role in the cause and the alleviation of the crisis.
These controversies were also related to the population’s general mistrust of both the
Indian government and Union Carbide, issues that were themselves related to thousands
of individuals fleeing from the city, which in turn contributed even further to severe
economical, social, and psychological difficulties.
In the legal domain, a number of lawsuits were entered: by the victims against the
Indian government and Union Carbide, by the Indian government against Union Carbide,
and by the firm’s shareholders against the company. The total dollar amount of these
suits was evaluated at between $350 million and $4 billion. In the financial domain,
Union Carbide’s stock was undervalued for several months, and the firm had to defend
itself against a takeover attempt, divesting 20% of its most valuable assets. The media
coverage of the crisis was also extensive, with continual news about the crisis, for example,
on the front page of The New York Times during a period of two weeks. The issues
debated in the coverage ranged from the ethical role of U.S. corporations in foreign
countries and in particular in the third world, to the safety of complex technologies, to
the legal issues related to compensation. These issues triggered in turn a number of threats
to the social identity of Union Carbide, the chemical industry in general, and U.S.
corporations as a whole operating in third world countries. As can be seen from Figure
5, this crisis also involved a number of other stakeholders such as different countries
with similar technologies, a number of activist groups, the U.S. Congress, and the
chemical industry as a whole.
As with the issues that are contextually related to Bhopal as an accident, the issues
related to Bhopal as a crisis are inordinately complex. They involve a web of financial,
legal, medical, technical, managerial, political, communicational, and therapeutic factors.
The domains of these factors are related to such diverse professional and academic fields
as sociology, psychology, management science, engineering, political science, law, eco-
nomics, anthropology, ethics, health science, religion, and theology. Since 1987, how-
ever, press coverage of the crisis has focused mostly on the issue of litigation. At this
writing, the Indian government has recently accepted $470 million in payment for damages
from Union Carbide after a legal battle that lasted more than four years, in which the
Indian government had sued the company for a total of $3.3 billion.

Vicious Circles
A great number of vicious circles can be seen in both models, i.e., Bhopal as an
accident and as a crisis. We shall mention only a few. For instance, a prominent vicious
circle involves decreased resources allocated to UCIL by Union Carbide, stemming from

initial pressures for containing costs, with the resultant decreased motivation by UCIL
management and employees, contributing to the planned divestiture by its parent company.
These further fueled a decrease in a concern for safety and the resultant resources allocated
to it at UCIL, with the further reduction in resources leading to further reduction in
motivation and morale. Another contextually related vicious circle can be seen in the
Indian government’s encouragement of industrial development in Bhopal, further influ-
encing the government’s reluctance to impose strict safety standards, contributing to the
increase of production by UCIL and its attraction of new workers, further influencing
the Indian government to maintain its initial encouragement.
Also, a number of vicious circles were introduced through incomplete, faulty, or
poorly-thought-out CM interventions. For example, Union Carbide’s original desire to
divest UCIL not only applied pressure to lower its operating costs, but can be seen as
the result of a previous attempt by Union Carbide to avoid a crisis of legitimacy by its
shareholders. Thus, a proactive CM effort developed by Union Carbide, which focused
primarily on the financial domain, can be seen as having fueled the context of Bhopal
as a crisis. Similarly, the various strategies that were used by different stakeholders to
provide scientific analyses to validate their particular point of view, as in the case of
Union Carbide’s attempt to prove that the gas did not contain any cyanide while other
sources attempted to prove that it did, contributed to the spreading of further problems.
For example, the lack of reliable data fueled already-existing controversies between
different stakeholders, adding to the further decrease of trust. These in turn were associated
with further problems such as uncertainties as to which medical treatment to provide and
the difficulties the Indian government had in making financial compensations that were
equitable.

Toward a Systemic and Empathic Program of Crisis Management


A number of observers have criticized Union Carbide with respect to issues such as
its lack of emergency plans at UCIL, the poor state of its emergency procedures, the
lack of proper operator training, the cuts in emergency staff, the limited resources allocated
to and devoted by UCIL to emergency procedures, and the low compensation scheme
offered by the firm [2, 9, 16, 53-55, 58], as well as its slowness in settling claims. While
these factors are important and deserve emphasis, they are not our principal focus in this
paper. Rather, having previously defined a crisis as a series of events that are related
centrally to the systemic and affective domains, we shall focus on two types of change
efforts directed toward these domains, i.e., systemic and empathic CM efforts, as indicated
in Figure 6. This dual effort is derived from the work of C. W. Churchman, who has
argued that the “systems approach” provides both a “judgment,” or a basis for a decision,
and a proper validating “mood” or an “attitude” to inform the actions that result from a
decision [59, p. 33].
Systemic efforts for CM are directed toward the decrease of vicious circles that
result from the very actions of human intervention themselves as well as those that are
related to the dynamics of the total systems context itself. In this sense, the question
asked by the decision maker is no longer “what could we do and have done to solve a
particular problem,” but rather “could we do or have done anything at all to diminish
the set of interrelated problems without spreading them further?”
For those vicious circles that are related to human interventions, efforts should be
directed at replacing “ultra-solutions” with “systems-informed interventions,” as indicated in
Figure 6. By this term, we wish to emphasize that decision makers should attempt to
acquire some knowledge of the contextual relations of the system under study and of the
                            NATURE OF VICIOUS CIRCLES

                            Side Effects                Contextual

NATURE OF        Systemic   Toward systems-informed     Toward human-scale
CRISIS                      intervention                systems
MANAGEMENT
EFFORT           Empathic   Toward systems-based        Toward negotiated
                            purpose                     order

Fig. 6. Reducing the context of crisis.

potential negative effects of their interventions. As shown in the context of Bhopal as an
accident, the financial decision by Union Carbide to divest itself of UCIL could be seen
as an ultra-solution. Based primarily on financial grounds, this type of solution did not
take into account the negative effects that a planned divestiture could potentially have
on the UCIL system as a whole, including its reliability and its safety as a productive
system. In this sense, the strategic decision targeted to “solve” a potential financial problem
contributed to the spreading of the problem to an issue of safety. A systemic view of the
problem would not have been guided by financial considerations only, but would have
added to the problem context a number of variables related to UCIL’s overall perfor-
mance, culture, and motivation, including its degree of reliability. In this sense, the
strategic concept of “barriers to exit” [60], oriented traditionally toward capital, cost,
and market considerations, needs to be enlarged to include variables influencing the
overall reliability and safety of a system as a paramount criterion.
Reducing vicious circles that are embedded in the total systems context itself presents
an even greater challenge. As we have suggested, interventions at this level are directed
toward the total systems context. Following Schumacher [61], we have labeled these
efforts as leading toward “human-scale systems,” as indicated in Figure 6. The term “human
scale” is used here to signify that, because the degree of complexity and tight coupling
of many advanced technologies currently poses a severe challenge to human understanding
and learning [6, 16], safety from a systems perspective requires the decrease of contextual
complexity and the reduction of tight coupling to a level that can be handled within
the cognitive limitations of humans.
Examples of unanticipated events, high complexity, and tight
coupling that defeated emergency procedures and scientific understanding are numerous
in Bhopal as a disaster. We shall mention here four examples. First, the MIC-water
reaction was intensified by small quantities of contaminants and impurities contained in
the tank that were not anticipated in emergency plans, triggering a number of secondary
chemical transformations that were themselves not anticipated. Second, particular safety
features, such as the scrubber, could not operate, having been designed to act on gas alone,
but not on the unexpected combined mixture of gas and liquid. Third, considering the
complex chemistry developed in the tank, a number of intermediate compounds were
released into the environment in addition to the MIC, making it impossible for scientists
to determine their composition with certainty, thus decreasing the effectiveness of emergency
procedures. And fourth, even if the gas had not been contaminated with other components,
as argued by Union Carbide, the company was unable to give precise information to the
emergency teams because an accident of this size with this particular gas had never
occurred before and was beyond scientific understanding.
Considering the resultant complexities and uncertainties, it becomes readily apparent
that intervention in such a system had the potential for creating a number of ultra-solutions.
For example, as stressed by Weick:

Our actions are always a little further along than is our understanding of those situations which means
we can intensify crises literally before we know what we are doing. [16, p. 308]

Stressing the inherent complexity of advanced technologies, different experts in the
field have warned of the inefficiency of technical or organizational improvements. They
have proposed instead to change the mode of production of the product or process, i.e.,
its overall context, or, if it is not possible, to discontinue its production [6]. As an example
of contextual changes in CM, Perrow [6] has proposed modifying the overall context of
advanced technologies. Similarly, in the organization development literature, Argyris and
Schon [62] have argued that to diminish the apathy of a bureaucratic system, it is better
to diminish its overall level of formalization, i.e., its context, rather than to try to intervene
through different human relations efforts, thereby only focusing on a limited set of
variables.
In the case of Bhopal, interventions on the context would have been different de-
pending on the definitions of the “system” and the perspective considered [50]. From the
perspective of Union Carbide, to develop efforts leading to systems that could be managed
would have led to questioning the overall efficiency of the large production of hazardous
products as well as their basic purpose. From the perspective of the Indian government,
to intervene at the level of the systems context would have meant to question the large-
scale production of pesticides, which was itself embedded in the large-scale effort of a
“green revolution.” In addition, it would have led to the questioning of large-scale and
rapid development of industrial complexes. It is likely that these questions would have
led to the confrontation of some fundamental political and strategic decisions made by
different stakeholders. Whatever the case, it is becoming clear that major crises are caused,
among many things, by overly simplified conceptions of and actions in complex systems.
As the latest large-scale disaster, the wreck of the Exxon Valdez, reminds us, the
broadening of our thinking is no longer a luxury but an absolute necessity.
While the previous systemic CM efforts were directed toward the conceptual
dimension of crises, empathic CM efforts are directed toward the affective dimension that
all crises raise. As we have seen previously, a crisis challenges the basic social identity
of a system, i.e., its basic affective or existential core [44, 511.
Although extremely difficult, empathy is one way for different stakeholders to un-
derstand and consider another’s perspective and purposes. It is well known that organi-
zations use different mechanisms to buffer problems or delay actions that are viewed as
threatening to their purposes [32, 63, 641. To develop empathic efforts in CM, leading
toward a “systems-based purpose” as indicated in Figure 6, is to go beyond the focus on
a particular purpose. As can be seen in the case of Bhopal, buffering strategies focusing
on a reduced set of purposes often developed into ultra-solutions. Three types of buffering
strategies are readily visible in the Bhopal crisis: first, before the crisis, Union Carbide
did not readily inform its shareholders of the potential dangers involved in its production;
second, neither Union Carbide nor the Indian government acted on the issue that thousands
of people were living across the street from the UCIL plant; and third, the controversies
that developed on the nature of the gas, on the long-term effects, and on the death toll
were never resolved. Although these buffering strategies could have served the immediate
purposes of individual stakeholders, focusing on a limited set of domains, all three have
been shown to have spread the problems further. In the first case, Union Carbide’s
shareholders sued the company after the accident for misrepresentation of safety, further
spreading the crisis to the legal domain; in the second case, thousands of people were
affected by the accident, accentuating its tragedy; in the third, the controversies contributed
to a lack of precise data necessary for focusing the emergency procedures and distributing
the compensation packages, also contributing to a further lack of trust between stake-
holders.
As stressed previously, the complexity and the tight coupling of sociotechnical
systems render obsolete traditional buffering strategies that have worked in the context
of simple and linear systems [1, 6, 64]. For example, while Union Carbide did notify
the Indian government that it was risky to have thousands of people living directly across
the street from the UCIL plant, thus protecting its legal liability, the company did not
explore further the possibilities of resolving this issue. An empathic effort in the area of
CM would have not only considered the legal domain that was present in the issue, but
it would also have considered the issue of human dignity involved in the eventuality of
a major accident. From this perspective, different strategies could have been explored,
involving diverse stakeholders such as Union Carbide, UCIL, the Indian government,
the local government, and the local population, leading to a “systems-based purpose.” In
this case, Union Carbide’s proactive CM action of protecting its legal liability if an
accident occurred could have been evaluated as a potential ultra-solution. Although this
particular issue would have been difficult to resolve, it seems that this empathic effort
had the potential of reducing one of the major vicious circles present in the context of
the Bhopal crisis, thus decreasing its effect if the accident took place.
Empathic efforts directed toward the context as a whole are even more challenging.
They are directed toward the resolution of a dual set of paradoxes present in CM at both
the micro and macro levels simultaneously. At the micro level, they are aimed at reducing
the paradoxical effect of denial present in the context of crisis for both individuals and
groups; at the macro level, they are directed at resolving some contextual paradoxes
present in larger systems such as at the organizational, national, and international levels.
The role and impact of denial in the context of crises has been well documented
[43, 65-701. Denial can have perverse effects prior to a crisis: individuals and groups
can deny the eventuality of a crisis or they can downplay its effects. After a crisis, they
can even deny its very occurrence or their responsibilities. The effect of denial is truly
paradoxical considering that on the one hand denial is one of the healthy mechanisms
through which human beings can assure their survival psychologically in the face of dramatic
adversity [43, 65], allowing an individual or a group to avoid an identity crisis [44]; on
the other hand, however, denial enhances the likelihood of systemic accidents. In effect,
at the very moment when an individual needs most to embrace a systems perspective,
denial closes off this very option [19]. An empathic CM effort would attempt to understand,
for affective reasons, the need for denial while at the same time trying to diminish, for
systemic reasons, its negative effects. The pervasiveness of denial mechanisms is
evident in the Bhopal crisis, prohibiting a system-based approach to CM. For example,
executives and officers in both Union Carbide and the Indian government stated prior to
the crisis that a disaster at the UCIL plant was impossible and would not happen. Fur-
thermore, Union Carbide did not act prior to the Bhopal disaster on several warning
signals from similar incidents that fortunately did not develop into systems accidents. An
empathic CM effort would have tried to move beyond denial and would have tried to
bring into being a systemic effort prior to the disaster.
At the macro level, an empathic effort in CM would attempt to reconcile the purposes
of different stakeholders even though they involve contradictions and paradoxes. Such
paradoxes include, for instance, the issues of cost efficiencies versus overall production
safety; the economic efficiency of large-scale production systems versus the accident
potential of such systems; the Indian government’s strategy of massive and rapid economic
development versus the necessity of limited growth, considering the limits of the overall
infrastructure. A number of authors have proposed different solutions to paradoxes at the
system level. For example, Trist [71] has called for the necessity of moving toward a
“negotiated order” in which each stakeholder learns to consider the purposes of other
stakeholders, considering that environmental turbulence is beyond the complete control
of any single stakeholder. Similarly, a recent study conducted
by the United Nations’ World Commission on Environment and Development [72] has
stressed that the resolution of complex and global problems needs to involve a collaborative
effort between all concerned stakeholders and the integration of their different purposes.
We have borrowed Trist’s terminology of a “negotiated order” to typify these efforts, as
indicated in Figure 6. By this term we mean to emphasize that the reduction of the
complexity and the tight coupling existing in the contextual relations of a crisis such as
Bhopal necessitates the empathic understanding of each concerned stakeholder, including
their propensity for denial, leading toward a negotiated purpose.

Concluding Remarks
The field of CM is still in its infancy. Much more research is needed on the nature
of the contextual relations associated with crises. However, what seems clear at the present
moment is that CM does not exist for the sole purpose of returning to the prior status
quo or “business as usual.” While this reduced purpose can be seen as a healthy defense
mechanism for human systems to preserve their social identity, it needs to be transcended
considering the clear and very real potential for sociotechnical disasters.
In this paper we have mentioned some of the fundamental challenges offered by the
field of CM. These challenges include a questioning on the potential spreading effects
of decision makers’ interventions; a questioning of the human potential for misunder-
standing and losing control of complex sociotechnical systems; a questioning of the
systemic effectiveness of the pursuit of individual purposes; and a questioning of major
global strategic choices. Considering these challenges, it seems likely that the imple-
mentation of systemic and empathic CM efforts would require a questioning of the
ideological context of current management thought and practice. We believe such ques-
tioning is long overdue. It should in fact cause an “identity crisis” in the field of man-
agement. The field of CM in particular, and management in general, needs, and indeed
deserves, a good crisis to shake it from its doldrums.

References
1. Mitroff, I. I., Pauchant, T. C., and Shrivastava, P., Conceptual and Empirical Issues in the Development of a General Theory of Crisis Management, Technological Forecasting and Social Change 33, 83-107 (1988).
2. Fink, S., Crisis Management: Planning for the Inevitable, AMACOM, New York, 1986.
3. Turner, B. A., Man-Made Disasters, Wykeham, London, 1978.
4. Quarantelli, E. L., ed., Disasters: Theory and Research, Sage, Beverly Hills, Calif., 1978.
5. Smart, C. F., and Vertinsky, I., eds., Studies in Crisis Management, Butterworth and Co., Toronto, 1978.
6. Perrow, C., Normal Accidents: Living with High-Risk Technologies, Basic Books, New York, 1984.
7. Mitroff, I. I., and Kilmann, R. H., Corporate Tragedies: Product Tampering, Sabotage, and Other Catastrophes, Praeger, New York, 1984.
8. Meyers, G. C., When It Hits the Fan: Managing the Nine Crises of Business, Mentor Books, New York, 1986.
9. Shrivastava, P., Bhopal: Anatomy of a Crisis, Ballinger, Cambridge, Mass., 1987.
10. Lagadec, P., Etats d'Urgence: Defaillances Technologiques et Destabilisation Sociale, Editions du Seuil, Paris, 1988.
11. Pauchant, T. C., An Annotated Bibliography in Crisis Management, Research Laboratory, Graduate School of Administrative Sciences, Laval University, Quebec City, Quebec, Canada.
12. Turner, B. A., The Organizational and Interorganizational Development of Disasters, Administrative Science Quarterly 21, 378-387 (1976).
13. Quarantelli, E. L., Disaster Crisis Management: A Summary of Research Findings, Journal of Management Studies 25, 373-385 (1988).
14. Khandwalla, P. N., Crisis Responses of Competing Versus Noncompeting Organizations, Journal of Business Administration 9(2), 151-178 (1978).
15. Denis, H., Le Risque Technologique Majeur, Monographie EPMRT-88/29, Ecole Polytechnique de Montreal, Montreal, Canada, 1988.
16. Weick, K. E., Enacted Sensemaking in Crisis Situations, Journal of Management Studies 25(4), 305-317 (1988).
17. Mitroff, I. I., Teaching Corporate America to Think about Crisis Prevention, Journal of Business Strategy 6(4), 40-48 (1986).
18. Shrivastava, P., Mitroff, I. I., Miller, D., and Miglani, A., Understanding Industrial Crises, Journal of Management Studies 25, 285-303 (1988).
19. Pauchant, T. C., Crisis Management and Narcissism: A Kohutian Perspective, unpublished Ph.D. dissertation, Graduate School of Business Administration, University of Southern California, Los Angeles.
20. Wender, P. H., Vicious and Virtuous Circles: The Role of Deviation-Amplifying Feedback in the Origin and Perpetuation of Behavior, Psychiatry 31, 309-324 (1968).
21. Weick, K. E., The Social Psychology of Organizing, Random House, New York, 1979.
22. Watzlawick, P., Ultra-Solutions: How to Fail Most Successfully, W. W. Norton, New York, 1988.
23. Linstone, H. A., Multiple Perspectives for Decision Making, North-Holland, New York, 1984.
24. Forrester, J. W., Counterintuitive Behavior of Social Systems, in Student Handbook for the Study of the Future, H. F. Didsbury, Jr., ed., World Future Society, Washington, D.C., 1979, pp. 129-144.
25. Levi-Strauss, C., Anthropologie Structurale, Plon, Paris, 1977.
26. Durkheim, E., The Rules of Sociological Method, Macmillan, London, 1982.
27. Jacob, J. P., and Sabelli, F., Entre Malheur et Catastrophe: Essai Anthropologique sur la Crise comme Representation, in Crise et Chuchotements, J. Jacob, ed., Paris, 1984.
28. Marcus, A. A., Bhopal: Anatomy of a Crisis by Paul Shrivastava, book review, Administrative Science Quarterly 33(1), 154-157 (1988).
29. Weber, M., The Theory of Social and Economic Organization, T. Parsons, ed., Free Press, Glencoe, Ill., 1947.
30. Parsons, T., Suggestions for a Sociological Approach to the Theory of Organizations, Administrative Science Quarterly 1, 63-85 (1956).
31. Thompson, J. D., Organizations in Action, McGraw-Hill, New York, 1967.
32. Pfeffer, J., and Salancik, G. R., The External Control of Organizations, Harper and Row, New York, 1978.
33. Maruyama, M., The Second Cybernetics: Deviation-Amplifying Mutual Causal Processes, American Scientist 51, 164-179 (1963).
34. Goldsmith, E., The Limits of Growth in Natural Systems, General Systems 16, 69-75 (1971).
35. Merton, R. K., Social Theory and Social Structure, Free Press, New York, 1957.
36. Hall, R. I., A System Pathology of an Organization: The Rise and Fall of the Old Saturday Evening Post, Administrative Science Quarterly 21, 185-211 (1976).
37. Masuch, M., Vicious Circles in Organizations, Administrative Science Quarterly 30, 14-33 (1985).
38. Argyris, C., Personality and Organization, Harper and Row, New York, 1957.
39. March, J. G., and Simon, H. A., Organizations, Wiley, New York, 1958.
40. Crozier, M., Le Phenomene Bureaucratique, Seuil, Paris, 1963.
41. Emery, F. E., and Trist, E. L., Toward a Social Ecology: Contextual Appreciations of the Future in the Present, Plenum Publishing Company, London, 1973.
42. Morgan, G., Images of Organization, Sage, Beverly Hills, Calif., 1986.
43. Kohut, H., The Restoration of the Self, International Universities Press, New York, 1977.
44. Habermas, J., Legitimation Crisis, Beacon Press, Boston, Mass., 1973.
45. Bowman, E., and Kunreuther, H., Post-Bhopal Behavior at a Chemical Company, Journal of Management Studies 25, 387-402 (1988).
46. Hermann, C. F., Some Consequences of Crisis Which Limit the Viability of Organizations, Administrative Science Quarterly 8, 61-82 (1963).
47. Fink, S. L., Beak, J., and Taddeo, K., Organizational Crisis and Change, Journal of Applied Behavioral Science 7, 15-37 (1971).
48. Staw, B. M., Sandelands, L. E., and Dutton, J. E., Threat-Rigidity Effects in Organizational Behavior: A Multilevel Analysis, Administrative Science Quarterly 26, 501-524 (1981).
49. Billings, R. S., Milburn, T. W., and Schaalman, M. L., A Model of Crisis Perception: A Theoretical and Empirical Analysis, Administrative Science Quarterly 25, 300-316 (1980).
50. Bowonder, B., and Linstone, H. A., Notes on the Bhopal Accident: Risk Analysis and Multiple Perspectives, Technological Forecasting and Social Change 32, 183-202 (1987).
51. May, R., ed., Existential Psychology, Random House, New York, 1958.
52. Public Citizen, Report on U.S. Nuclear Safety, Washington, D.C., 1988.
53. Sethi, S. P., The Inhuman Error: Lessons from Bhopal, New Management 3(1), 40-44 (1985).
54. Marcus, A., Bromiley, P., and Goodman, R., Preventing Corporate Crises: Stock Market Losses as a Deterrent to the Production of Hazardous Products, Columbia Journal of World Business 22(1), 33-42 (1987).
55. Ayres, R. U., and Rohatgi, P. K., Bhopal: Lessons for Technological Decision-Makers, Technology in Society 9, 19-45 (1987).
56. Lagadec, P., From Seveso to Mexico and Bhopal: Learning to Cope with Crises, in Insuring and Managing Hazardous Risks: From Seveso to Bhopal and Beyond, P. R. Kleindorfer and H. C. Kunreuther, eds., Springer-Verlag, New York, 1987.
57. Bowonder, B., The Bhopal Accident, Technological Forecasting and Social Change 32, 169-182 (1987).
58. Everest, L., Union Carbide's Health Came First, The Gazette, Feb. 24, B-3 (1989).
59. Churchman, C. W., The Systems Approach and Its Enemies, Basic Books, New York, 1979.
60. Porter, M. E., Competitive Strategy: Techniques for Analyzing Industries and Competitors, Free Press, New York, 1980.
61. Schumacher, E. F., Good Work, Harper and Row, New York, 1979.
62. Argyris, C., and Schon, D., Organizational Learning: A Theory of Action Perspective, Addison-Wesley, Reading, Mass., 1978.
63. Maccoby, M., The Gamesman: Winning and Losing the Career Game, Bantam Books, New York, 1976.
64. Ackoff, R. L., Creating the Corporate Future: Plan or Be Planned For, John Wiley and Sons, New York, 1981.
65. Bettelheim, B., Individual and Mass Behavior in Extreme Situations, Journal of Abnormal and Social Psychology 38, 417-452 (1943).
66. Holsti, O. R., Crisis, Stress and Decision Making, International Social Science Journal 23, 53-67 (1971).
67. Smart, C., and Vertinsky, I., Designs for Crisis Decision Units, Administrative Science Quarterly 22, 640-657 (1977).
68. Starbuck, W. H., Greve, A., and Hedberg, B. L., Responding to Crises, Journal of Business Administration, Spring, 111-137 (1978).
69. Schwartz, H. S., On the Psychodynamics of Organizational Disaster: The Case of the Space Shuttle Challenger, Columbia Journal of World Business 22(1), 59-68 (1987).
70. Pauchant, T. C., and Mitroff, I. I., Crisis Prone Versus Crisis Avoiding Organizations, Industrial Crisis Quarterly 2(1), 53-63 (1988).
71. Trist, E., The Environment and Systems-Response Capability, Futures 12(2), 113-127 (1980).
72. World Commission on Environment and Development, Our Common Future, Oxford University Press, New York, 1987.

Received July 22, 1989
