Risk and Complexity – On Complex Risk Management
https://www.emerald.com/insight/1526-5943.htm
1. Introduction
The insurance industry, and in particular the reinsurance industry, has learned the lessons
of complex risks the hard way, more so than most. In their paper on managing complex risks, Sachs and
Wade (2013) from MunichRe discuss events that “[. . .] showed how quickly generally
accepted model assumptions and business practices can be overtaken by the complexity of
reality”.
The first event is the 9/11 terrorist attack on the World Trade Center in New York in
2001. This event they describe as a watershed for the reevaluation of risk. Not only
was the scale of the losses surprising – US$32bn – but the aftermath led to insured losses in
almost all classes of business. Furthermore, over the past 10 years, only 30 per cent of
catastrophe losses were covered, while the balance of US$1.3tn was borne by individuals,
corporations and governments[1]. Indeed, The Economist (2019a) states that the
central reason for this cautiousness is that the industry is afraid of regulators who may
punish it for taking on bad risks.

(The Journal of Risk Finance, Vol. 21 No. 1, 2020, pp. 37-54. © Emerald Publishing Limited. ISSN 1526-5943. DOI 10.1108/JRF-09-2019-0165)
The second event is the subprime financial crisis of 2008, which clearly illustrated the
risks of interdependencies in a global world. In a classic result from complex systems,
which will be discussed later, a set of factors – each significant, but not disastrous in itself –
interplayed to produce a combined effect that far exceeded what traditional risk
assessments would indicate. These events are what Taleb (2007) labels Black Swans, or
high-impact, low-probability (HILP) events in more technical parlance. These events, and
many more, have led The World Economic Forum (2016) and others to comment on the “[. . .]
increasing volatility, complexity and ambiguity of the world”.
While the literature on managing complex risks is scant, some abstract, high-level but
impractical advice is offered, as Sachs and Wade (2013) lament. Advice such as
“simplify,” “prepare for the next surprise” or “think the unthinkable” is arguably difficult to
operationalize. An exception is Nason (2017), who discusses complexity and its impact on
risk management. He rightfully argues that the holistic approach of enterprise risk
management (ERM) is an improvement over the more traditional, function-oriented
approaches. ERM is discussed further below.
One area where there has been some movement concerns the so-called environmental,
social and governance (ESG) risks, which are arguably complex in nature. Despite over
US$3tn in institutional assets tracking ESG scores, these scores suffer from inconsistencies and
incomparability to the extent that Hester Peirce of the US Securities and Exchange
Commission (SEC) described them as “Labelling based on incomplete information, public
shaming, and shunning wrapped in moral rhetoric,” according to The Economist (2019b).
Hence, finding a reliable way of handling complex risks is growing in importance, not only
to prevent disasters but also in regular investing.
Therefore, in this paper, the current state of the art is reviewed. Because the core
of the challenge lies in understanding complexity, the relationship between risks and
complexity is the main focus. Until this is properly understood, it is unlikely that any useful
improvements in risk management tools and approaches can be developed.
Before continuing, it should be noted that using the term complex risk management
(CRM) does not come without possible confusion. The confusion arises because a complex
management of risk is not the same as managing complex risks – indeed, it is the very
antithesis. Complex management is easy to achieve, but it is risky. Indeed, “Complex
systems almost always fail in complex ways,” as the board that investigated the loss of the
Columbia space shuttle noted, as quoted by the National Commission on the BP Deepwater Horizon Oil
Spill and Offshore Drilling (2011). After conducting a thorough review of the Deepwater
Horizon disaster, the commission concluded that:
The blowout was not the product of a series of aberrational decisions made by rogue industry or
government officials that could not have been anticipated or expected to occur again.
The report points to the fact that “[. . .] the root causes are systemic and, absent significant
reform in both industry practices and government policies, might well recur.” What we need
is effective management of complex risks, labeled here as CRM.
So much for finance; is the state of the art any better in other domains, such as engineering?
Unfortunately, it is not, as argued by Emblemsvåg (2008). Repeatedly, we can read about
cases where professionals have greatly underestimated probabilities and/or impacts. Then
there are the problems associated with qualitative approaches, as discussed by Emblemsvåg
and Endre Kjølstad (2002) and exemplified by Backlund and Hannu (2002), who provide a
telling case in which three different consulting companies were hired to assess the risks of the
same object and reached three very different conclusions. There are some remedies for this,
as discussed by Emblemsvåg (2010); however, the case of complex risks remains unsolved.
The fact is that all the traditional risk management approaches – qualitative,
quantitative/probabilistic and statistical alike – were derived from a linear conception of
cause and effect. Here, risk is merely the combination of probability and impact as found in
early guidelines such as the 1992 COSO report in the USA (COSO, 1992) and the 1999
Turnbull Report in the UK (ICAEW, 1999) and in guidelines to this very day.
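To make the linear conception concrete, here is a minimal sketch (an illustration only, not taken from any of the guidelines cited; the function name and all figures are hypothetical) in which each risk is reduced to a probability-impact pair and exposures simply add up:

```python
# A minimal sketch of the linear conception of risk: each risk is reduced
# to a (probability, impact) pair, and exposures are multiplied and summed
# as if the events were independent. All figures are hypothetical.
def expected_loss(risks):
    """Return the aggregate expected loss of independent risks."""
    return sum(probability * impact for probability, impact in risks)

portfolio = [
    (0.125, 4_000_000),  # low-probability, severe event
    (0.500, 100_000),    # high-probability, minor event
]

print(expected_loss(portfolio))  # 550000.0
```

The subprime crisis discussed above is precisely what this framing misses: interdependent factors can combine into losses far beyond the sum of such independent expectations.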
Building on the 1992 COSO report, however, COSO introduced in 2004 their ERM
framework, with the underlying philosophy that:
[. . .] value is maximized when management sets strategy and objectives to strike an optimal balance between growth and return goals and related risks, and efficiently and effectively deploys resources in pursuit of the entity’s objectives (COSO, 2004a).
This is perhaps one of the first frameworks that try to take a holistic view of risk and
therefore moves away from the simple, linear conception of risk to one more aligned
with the complex nature of many risks. Although this framework has been widely recognized
as a respected authority on ERM, as Pierce and Goldstein (2016) and others claim, COSO
found in their 2010 survey that corporations are still immature concerning ERM (Beasley
et al., 2010). In 2017, COSO (2017) issued an amended framework in response to the
increasing complexity of risks, more clearly connecting ERM with strategy and anchoring it
to the context of a corporation’s performance.
The Turnbull report is another ERM framework, made at the request of the London Stock
Exchange (LSE), and it became a mandatory requirement for all corporations listed on the
LSE before 2001, Jones and Sutherland (1999) write. These guidelines were critical events in
the standardization and reconceptualization of internal control as risk management,
according to Power (2007). The interested reader is referred to Power (2007) who provides an
excellent overview and discussion of COSO, Turnbull and more.
These guidelines have certainly resulted in improvements because they take a holistic view
promoting a corporate-wide approach to risk (Nason, 2017), but given that all the
aforementioned disasters in this paper took place well after these publications, it is clear that
there is much to amend. One reason is that these guidelines and frameworks are high-level
and rely on traditional ways of assessing risk (COSO, 2004b), which have the shortcomings
discussed earlier.
This means that there currently exists no approach that can estimate or manage complex
risks reliably well – it is much more art than science. As Smith and Irwin (2006) note:

[. . .] in dealing with the implications of complex systems, risk management will need to move beyond a linear, cause and effect approach [. . .] incorporating the complexity literature in the process.
Hence, a first step in our quest to amend the situation must be to review complexity and risk,
which is done next in Section 3. Based on this discussion, it will become evident what is
theoretically possible and what is realistically achievable. The task of future research must be to find
the best fit between possibility and realism, as discussed in Section 4.
So far, we have looked at complexity solely as an inherent property of certain systems, but in
our daily lives, complexity also refers to a type of behavior that we can witness. Complex
processes often exhibit simple behavior, whereas many simple systems exhibit complex and
even chaotic behavior (Ilachinski, 1996). Recall, for example, the Mandelbrot fractal; it is very
simple by definition but exhibits extremely complex/chaotic patterns (Mandelbrot, 1980). A sailboat,
in contrast, is a complex system but usually exhibits fairly simple behavior.
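The contrast between simple definition and complex behavior is easy to demonstrate in a few lines. The sketch below (an illustration, not taken from the cited sources) iterates the logistic map x → λx(1 − x), a real-valued relative of the mapping Mandelbrot (1980) studied: the same one-line rule settles onto a fixed point for λ = 2.5 yet wanders erratically for λ = 4.

```python
# The logistic map x -> lam * x * (1 - x): a one-line deterministic rule.
def iterate(lam, x0, steps):
    x = x0
    trajectory = []
    for _ in range(steps):
        x = lam * x * (1 - x)
        trajectory.append(x)
    return trajectory

# lam = 2.5: the simple rule behaves simply, converging to the
# fixed point 1 - 1/lam = 0.6.
stable = iterate(2.5, 0.2, 200)

# lam = 4.0: the very same rule never settles; the orbit stays within
# [0, 1] but wanders erratically - simple definition, chaotic behavior.
chaotic = iterate(4.0, 0.2, 200)
```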
Because complexity is both a property and a type of behavior, distinguishing between
the two may be confusing in our daily language. To reduce this confusion, Figure 1 attempts
to illustrate the main types of complexity. Complexity as a property refers to the strengths of
the couplings between the various elements that constitute the system, whereas complexity
as a behavior essentially refers to our degree of ignorance about the output from the system.
With this in mind, it becomes useful to define four types of systems where each type refers
to a certain type of behavior according to its inherent complexity.
The simplest case is complicated systems, in which the couplings are so strong
that one might even consider the system not complex at all – it is merely complicated. Such
systems can be analyzed very accurately, and hence the degree of ignorance is very low. Such
systems are “ordered” because their behavior is predictable. A prototypical example is
a mechanical clockwork, in which there is a huge number of tiny parts, but all the couplings
between the parts are known accurately and the output becomes predictable.
[Figure 1: The four Cs of complexity]

The next type of system is complex systems, that is, systems where the couplings are
loose. Maneuvering a sailboat is an example of a complex activity. The reason is that
numerous factors such as wind, waves, currents and angles interplay in a loose manner, so
the result cannot be known accurately. However, the couplings are not as loose as in chaotic
systems, or systems of chaos. As a concept, chaos was born in 1892 with Henri Poincaré’s
discovery that certain orbits of three or more interacting celestial bodies can exhibit
unstable and unpredictable behavior, but the term was coined in 1975 by Li and Yorke to
denote the random output of so-called “deterministic mappings” (Ilachinski, 1996).
A technical note is required because of the different usage of terms in the literature. Note
that chaos in the literature has a very technical interpretation that does not fit into
complexity theory – “technical” chaos derives from successive differential changes that are
deterministic whereas chaos in this paper refers to very loose couplings. Technical chaos is
therefore a special case of complicated systems, whereas chaos in Figure 1 refers more to
chaos as most people will intuitively describe it. Hence, the difference between complex
systems and chaotic systems in this paper is a matter of degree.
A significant finding in the literature is that chaotic systems in a “technical” sense tend to
be intrinsically sensitive to initial conditions of the system. In this paper, this means that
complicated systems can turn chaotic under certain circumstances. Another intriguing
finding noted by Ilachinski (1996) is that a complex system can turn chaotic if the system is
not allowed to relax between events.
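The sensitivity to initial conditions noted above is easily illustrated. In the sketch below (the chaotic logistic map standing in for any “technically” chaotic system), two starting states differing by one part in a billion end up on completely unrelated orbits:

```python
# Two orbits of the chaotic logistic map (lam = 4), started from states
# that differ by only one part in a billion.
def logistic(x):
    return 4.0 * x * (1 - x)

a, b = 0.3, 0.3 + 1e-9
gap = []
for _ in range(100):
    a, b = logistic(a), logistic(b)
    gap.append(abs(a - b))

# gap[0] is still microscopic, but the difference roughly doubles each
# step until the two deterministic orbits are fully decorrelated.
```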
Furthermore, as a rule of thumb, complex systems consist of many connected elements
but exhibit simple, stable and quite predictable behavior, while chaotic systems typically
consist of only a very limited number of elements but produce unpredictable behavior in
the long run. In the short run, however, “Given sufficient data, time series analysis permits
one to make short-term predictions about a system’s behavior, even if the system is chaotic.
Moreover, these predictions can be made even when the underlying dynamics is not
known,” according to Ilachinski (1996). The success of The Weather Company is a
testimony to this – by using Big Data, it has become big business, according to The
Economist (2019a). Most corporations and social systems are complex because they consist
of a large number of elements yet exhibit quite stable – albeit somewhat uncertain –
behavior.
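Ilachinski’s observation, that data alone permit short-term predictions of a chaotic system, can be sketched with a nearest-neighbor (“method of analogues”) forecaster. This is a toy illustration, not the approach of The Weather Company; the “unknown” dynamics is again the chaotic logistic map:

```python
# A chaotic system, treated as unknown by the forecaster below.
def logistic(x):
    return 4.0 * x * (1 - x)

# Historical observations of the system's output.
history = [0.123]
for _ in range(5000):
    history.append(logistic(history[-1]))

def forecast(series, current):
    """Method of analogues: find the past state most similar to the
    current one and predict that its successor will repeat. Only data
    are used - no knowledge of the underlying dynamics."""
    i = min(range(len(series) - 1), key=lambda j: abs(series[j] - current))
    return series[i + 1]

current = history[-1]
predicted = forecast(history[:-1], current)
actual = logistic(current)  # what the system actually does next
# predicted lies close to actual: short-term prediction succeeds even
# though long-term prediction of this system is hopeless.
```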
Note that “unpredictable” in Figure 1 refers to a state, or condition, in which the lower-level
constituent knows an output will take place but has no idea about the type of output
and/or its magnitude. “Uncertain,” on the other hand, refers to a state in which we lack full
information about the type of output and its magnitude, but we know an output will take
place and we have some ideas about the type and the magnitude. With these definitions of
uncertainty and unpredictability, we see that there is a major implication for the
management of such systems. From the definition of Joia (2000) that “Information is data
with attributes or relevance and purpose,” we understand that although we may have data
from chaotic systems, their unpredictability renders them useless for management.
The last type of system – chance systems – consists of systems that produce output randomly, or by
chance. This occurs because the couplings are so weak – or simply nonexistent – that they do not
produce any measurable effect at the macro level. A good example is the static noise in radios. For
this paper, we therefore focus on complex and chaotic systems, while keeping in mind that both
complicated and chance systems occur in reality and may be necessary to consider at times.
It is crucial to realize that the properties of the systems discussed above are intrinsic
properties of the systems and do not arise because of a lack of information. Lack of
information, beyond what is feasible to obtain, will further reduce our understanding of these systems
and make them harder to manage. Having as much information as possible is therefore
beneficial, but recall Zadeh’s Law of Incompatibility – there are limits to what can
actually be known regardless of the amount of data we have.
This brings us to complex adaptive systems. According to Ilachinski (1996), complex
adaptive systems are “macroscopic collections of simple (and typically nonlinearly)
interacting units that are endowed with the ability to evolve and adapt to a changing
environment.” With reference to Figure 1, this means that complex adaptive systems are
essentially open, higher-level complex systems, where “open” means that they adapt to the
environment. Coleman (1999) finds that all corporations are complex adaptive systems. For
simplicity, the distinction between complex systems and complex adaptive systems is not
made in this paper.
The properties of complex systems produce some “side effects” that are imperative to
understand in the context of risk management. These include the following:

- Predictions are possible.
- Long-term trends exist.
- Linearity is rare.
- Everything is connected.
- Context is king.
- Adaptation and coevolution are the norm.
- Behavior is both emergent and resulting.
Obviously, the allotted space in a paper is far too little to discuss all this in detail. However, a
brief discussion of these “side effects” is provided in the next seven sections, but recall that
many of them interrelate. Furthermore, the whole system (macro state) impacts the
constituents of the system (micro states) and vice versa. Therefore, Ilachinski (1996) asserts
that in dealing with complex systems, we must always think of the whole as well as the
pieces – anything less is self-deception. Of course, thinking about everything is not possible
so judgment is called for in striking the right balance between holism and synthesis, on the
one hand, and reductionism and analysis, on the other hand.
Fixed-point and periodic attractors can serve as rough approximations for some phenomena
in social systems, such as the long cycles in the world economy, but they are rare in real life.
Therefore, strange attractors are the most interesting for this paper. There is nothing
strange per se about these attractors – they just happen to appear in situations we might
consider strange. For example, in chaotic systems such as the weather, we can identify a
behavior that repeats itself – albeit with variations each time. Long-term trends therefore
exist.
Strange attractors also appear in systems that are not chaotic, but merely complex.
Usually, strange attractors look geometrically very complicated and can take on incredible
shapes, such as the Mandelbrot fractal. However, the really interesting fact for this
paper is that “they tend to be of very low dimensionality, even in a high-dimensional
phase space” (Capra, 1996), where phase space is understood as a mapping of all possible
states the system can have. This means that even though very complex systems consist of
many elements, their behavior typically revolves around long-term trends that are governed
by only a few variables.
A very good example of this is how the invention of containerization revitalized the
marine transportation industry – it reduced port downtime from 12 days to 12 h
(Flaherty, 1999). Arguably, the whole industry was transformed because of one single
variable – the enormous cut in unproductive downtime in ports. This story also
exemplifies that dramatic shifts can take place in the phase space. These points are
referred to as bifurcation points, and in real life, they can occur because of, for
example, major innovations or terrorist actions. These events are usually not driven by
what seems obvious at first sight, because small changes in variables that the complex
system is sensitive to can result in enormous changes. In other words, by carefully
understanding the social system, we can reduce the probability of unanticipated
changes, but looking for “the next big thing” is futile. This is obvious because the next
big thing is likely to be initially small and often overlooked by those who concentrate
on the “big picture”.
Unfortunately, this means that trying to predict complex risk events over the long term is
impossible because their roots are normally small things that we ignore. We can at best find
their drivers.
It has always been like this, but in an increasingly interconnected world, and therefore a
world with an increasing proliferation of effective feedback loops, our ability to imitate has
become much more effective. Consequently, the static situation of two athletes competing to
finish first does occur, but not as often as we are led to believe and definitely not as
independently as we may believe.
When corporations try to deliver innovative products first, the true winner is rarely
the one that finishes first in the sense of time. In fact, according to Allen Weiss, a
marketing professor at USC’s Marshall School of Business (Bird, 2001), studies show
that 47 per cent of all market pioneers fail and that, of those that do not fail, only 11 per
cent maintain a market-leader position several years later. Indeed, in some cases, there
is a latecomer advantage. In the steel industry [for example, Gerschenkron (1962)], a
latecomer has the advantage of not having to invest in less productive facilities and
obsolete technology, which pioneers must do to build the industry from scratch. The
latecomer can therefore leapfrog directly to the state-of-the-art in economically sized
facilities. This process has been confirmed by others later in other industries and
countries (Shin, 1995). In some cases, however, there are clear first-mover advantages.
What makes first-mover advantages real are the dynamics of the industry and the
possibility to protect intellectual property rights (IPR). If the industry requires a critical
mass of some sort and there are substantial switching costs, it can be advantageous to
be a first mover, provided that the critical mass is obtained and the switching costs persist,
as exemplified by Google and Amazon.
Adaptation and coevolution can also be fostered within corporations. Agile
corporations typically “[. . .] move from economies of scale to economies of scope”
(Sharifi and Zhang, 1999). Toyota, for example, has done this for years; “[. . .]
workplaces are surrounded by charts showing how their output compares with that of
other plants and competitors” (Maguire and McKelvey, 1999). This approach is
classified as adaptive tension, but all adaptation, coevolution and learning are full of
tension. In fact, Edgar Schein views the whole process of learning as filled with “guilt
and anxiety,” as “fundamentally coercive” and holds “[. . .] that learning only happens when
survival anxiety is greater than learning anxiety” (Coutu, 2002). The logical
consequences of this are many, including that the popular notion of learning being fun is
fundamentally flawed.
The purpose here is not to dispel the idea of competition, but to put it in a more correct
context. Head-on competition occurs, but adaptation and coevolution are the norm. The major
lesson for CRM is that we adapt and coevolve with the risks, making them even harder to
identify.
Note
1. According to Swiss Re Institute, Sigma 2/2019.
References
Admati, A.R. and Hellwig, M. (2013), The Bankers’ New Clothes: What’s Wrong with Banking and What
to Do about It, Princeton University Press, Princeton, NJ.
Backlund, F. and Hannu, J. (2002), “Can we make maintenance decisions on risk analysis results?”,
Journal of Quality in Maintenance Engineering, Vol. 8 No. 1, pp. 77-91.
Beasley, M.S., Branson, B.C. and Hancock, B.V. (2010), COSO’s 2010 Report on ERM: Current State of
Enterprise Risk Oversight and Market Perceptions of COSO’s ERM Framework, NC State
University, Raleigh, NC.
Begun, J.W. (1994), “Chaos and complexity: frontiers of organization science”, Journal of Management
Inquiry, Vol. 3 No. 4, pp. 329-335.
Bird, J.B. (2001), “First-mover advantage: myth or reality?”, The McCombs School of Business
Magazine, The McCombs School of Business at the University of Texas at Austin, Austin,
TX.
Bonabeau, E. (2004), “The perils of the imitation age”, Harvard Business Review, Vol. 82 No. 6,
pp. 45-54.
Callon, M. and Latour, B. (1981), “Unscrewing the big leviathan”, in Knorr-Cetina, K.D. and Mulkay, M. (Eds),
Advances in Social Theory and Methodology, Routledge and Kegan Paul, London, pp. 275-303.
Capra, F. (1996), The Web of Life, First Anchor Books ed., Anchor Books, Doubleday, New York, NY.
Chatzkel, J. (2001), “A conversation with Jonathan low”, Journal of Intellectual Capital, Vol. 2 No. 2,
pp. 136-147.
Coleman, H.J. Jr. (1999), “What enables self-organizing behavior in business”, Emergence, Vol. 1 No. 1,
pp. 33-48.
COSO (1992), “Internal control – integrated framework (2 volumes)”, available at: www.aicpastore.com, Committee of Sponsoring Organizations of the Treadway Commission (COSO).
COSO (2004a), “COSO enterprise risk Management – integrated framework”, available at: www.
aicpastore.com, Committee of Sponsoring Organizations of the Treadway Commission (COSO).
COSO (2004b), “COSO enterprise risk management – integrated framework: application techniques”,
available at: www.aicpastore.com, Committee of Sponsoring Organizations of the Treadway
Commission (COSO).
COSO (2017), “COSO enterprise risk management – integrating with strategy and performance”,
available at: www.aicpastore.com, Committee of Sponsoring Organizations of the Treadway
Commission (COSO).
Coutu, D.L. (2002), “Edgar H. Schein: the anxiety of learning”, Harvard Business Review, Vol. 80 No. 3,
pp. 100-106.
Davenport, T.H., Delong, D.W. and Beers, M.D. (1998), “Successful knowledge management projects”,
Sloan Management Review, Vol. 39 No. 2, pp. 43-57.
De Geus, A.P. (1988), “Planning as learning”, Harvard Business Review, Vol. 66, pp. 70-74.
Dent, E.B. (1999), “Complexity science: a worldview shift”, Emergence, Vol. 1 No. 4, pp. 5-19.
Emblemsvåg, J. (2001), “Activity-based life-cycle costing”, Managerial Auditing Journal, Vol. 16 No. 1, pp. 17-27.
Emblemsvåg, J. (2003), Life-Cycle Costing: Using Activity-Based Costing and Monte Carlo Methods to
Manage Future Costs and Risks, John Wiley and Sons, Hoboken, NJ.
Emblemsvåg, J. (2008), “On probability in risk analysis of natural disasters”, Disaster Prevention and Management: An International Journal, Vol. 17 No. 4, pp. 508-518.
Emblemsvåg, J. (2010), “The augmented subjective risk management process”, Management Decision,
Vol. 48 No. 2, pp. 248-259.
Emblemsvåg, J. and Endre Kjølstad, L. (2002), “Strategic risk analysis – a field version”, Management
Decision, Vol. 40 No. 9, pp. 842-852.
Emblemsvåg, J. and Bras, B. (1998), “Financial analysis, critical assumption planning and uncertainty
in product development”, 1998 International Conference on Achieving Excellence in New Product
Development and Management, Atlanta, GA.
Flaherty, J.E. (1999), Peter Drucker: Shaping the Managerial Mind, Jossey-Bass, San Francisco.
Gerschenkron, A. (1962), Economic Backwardness in Historical Perspective, Harvard University Press,
Cambridge, MA.
Goldin, I. and Kutarna, C. (2017), “Risk and complexity”, Finance and Development, pp. 46-49.
Goldstein, J. (1999), “Emergence as a construct: history and issues”, Emergence, Vol. 1 No. 1, pp. 49-72.
Graham, B. (2005), The Intelligent Investor, HarperCollins Publishers, New York, NY.
Graham, B. and Dodd, D.L. (1951), Security Analysis: Principles and Technique, 3rd ed., McGraw-Hill
Book Company, London.
Hagstrom, R.G. (2005), The Warren Buffett Way, John Wiley and Sons, Hoboken, NJ.
Hodgkinson, G.P., Bown, N.J., Maule, A.J., Glaister, K.W. and Perman, A.D. (1999), “Breaking the frame:
an analysis of strategic cognition and decision making under uncertainty”, Strategic
Management Journal, Vol. 20 No. 10, pp. 977-985.
Horgan, J. (1995), “From complexity to perplexity”, Scientific American, Vol. 272 No. 6, pp. 104-109.
ICAEW (1999), Internal Control: Guidance for Directors on the Combined Code (Turnbull Report),
Institute of Chartered Accountants in England and Wales (ICAEW), London.
Ilachinski, A. (1996), Land Warfare and Complexity, Part I: Mathematical Background and Source Book
(U), Center for Naval Analyses, Alexandria, VA.
James, D.N. (2002), “The trouble I’ve seen”, Harvard Business Review, Vol. 80 No. 3, pp. 42-49.
Jensen, M.C. (2001), “Corporate budgeting is broken – let’s fix it”, Harvard Business Review, Vol. 79 No. 10, pp. 94-101.
Johnson, S. and Kwak, J. (2010), 13 Bankers: The Wall Street Takeover and the Next Financial
Meltdown, Pantheon E-books, New York, NY.
Joia, L.A. (2000), “Measuring intangible corporate assets: linking business strategy with intellectual capital”, Journal of Intellectual Capital, Vol. 1 No. 1, pp. 68-84.
Jones, M.E. and Sutherland, G. (1999), Implementing Turnbull: A Boardroom Briefing, The Center for
Business Performance, The Institute of Chartered Accountants in England and Wales (ICAEW),
City of London.
Kelly, K. (1995), Out of Control: The New Biology of Machines, Social Systems, and the Economic World, Perseus Books, New York, NY.
Kunreuther, H., Novemsky, N. and Kahneman, D. (2001), “Making low probabilities useful”, Journal of
Risk and Uncertainty, Vol. 23 No. 2, pp. 103-120.
Linneman, R. and Klein, H. (1983), “The use of multiple scenarios by US international companies - a
comparison study”, Long Range Planning, Vol. 16 No. 6, pp. 94-101.
Loren, G. (2003), Why Budgeting Kills Your Company, BetterManagement.com.
McNeill, D. and Freiberger, P. (1993), Fuzzy Logic, Simon and Schuster, New York, NY.
Maguire, S. and McKelvey, W. (1999), “Complexity and management: moving from fad to firm
foundations”, Emergence, Vol. 1 No. 2, pp. 19-61.
Mandelbrot, B.B. (1980), “Fractal aspects of the iteration of z → λz(1 − z) for complex λ and z”, Annals of the New York Academy of Sciences, Vol. 357 No. 1, pp. 249-259.
March, J.G. (1994), A Primer on Decision Making: How Decisions Happen, The Free Press, New York, NY.
Nason, R. (2017), It’s Not Complicated: The Art and Science of Complexity in Business, University of
Toronto Press, Toronto.
National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling (2011), Deep Water:
The Gulf Oil Disaster and the Future of Offshore Drilling - Report to the President, United States
Government Printing Office, Washington, DC.
Pierce, E.M. and Goldstein, J. (2016), “Moving from enterprise risk management to strategic risk
management: examining the revised COSO ERM framework”, 14th Global Conference on
Business and Economics, Said School of Business, Oxford.
Pollitt, D. (2003), “How top executives can hit the ground running: making the most of the first 90 days”,
Human Resource Management International Digest, Vol. 11 No. 7, pp. 34-36.
Power, M. (2007), Organized Uncertainty: Designing a World of Risk Management, Oxford University
Press, Oxford.
Roy, R.K. (2001), Design of Experiments Using the Taguchi Approach: 16 Steps to Product and Process
Improvement, John Wiley and Sons, New York, NY.
Sachs, R. and Wade, M. (2013), Managing Complex Risks Successfully, Münchener Rückversicherings-
Gesellschaft (Munich Re), Munich.
Schwartz, P. (1997), The Art of the Long View, John Wiley and Sons, New York, NY.
Sharifi, H. and Zhang, Z. (1999), “A methodology for achieving agility in manufacturing
organizations: an introduction”, International Journal of Production Economics, Vol. 62
Nos 1/2, pp. 7-22.
Shin, J.-S. (1995), Catching up, Technology Transfer and Institutions: A Gerschenkronian Study of Late
Industrialization from the Experience of Germany, Japan and South Korea with Special Reference
to the Iron and Steel Industry and the Semi-Conductor Industry, Darwin College, Cambridge
University.
Smith, E.B. (2004), “What puts the CFO in the middle of scandals?”, USA Today, p. 5.
Smith, D. and Irwin, A. (2006), “Complexity, risk and emergence: elements of a ‘management’ dilemma”,
Risk Management, Vol. 8 No. 4, pp. 221-226.
Taleb, N.N. (2007), The Black Swan: The Impact of the Highly Improbable, Allen Lane, London.
The Economist (2001), “Chief executives: churning at the top”, The Economist, Vol. 358 No. 8213,
pp. 75-77.
The Economist (2003), Tough at the Top: A Survey of Corporate Leadership, The Economist, London.
The Economist (2004), Living Dangerously: A Survey of Risk, The Economist, London.
The Economist (2018), “Briefing; the financial crisis”, The Economist, Vol. 428 No. 9108, pp. 19-21.
The Economist (2019a), “Run for cover; innovation in insurance”, The Economist, Vol. 432 No. 9152,
pp. 55-56.
The Economist (2019b), “Poor scores; ESG investing”, The Economist, Vol. 433 No. 9172, pp. 63-64.
The World Economic Forum (2016), The Global Risks Report 2016, The World Economic Forum, Geneva.
Vygotsky, L.S. (1988), Thought and Language, Kozulin, A. (Ed.), The MIT Press, Cambridge, MA.
Wack, P. (1985), “Scenarios: uncharted waters ahead”, Harvard Business Review, Vol. 63 No. 5, pp. 73-89.
Weick, K.E. (1976), “Educational organizations as loosely coupled systems”, Administrative Science
Quarterly, Vol. 21 No. 1, pp. 1-19.
Wheatley, M.J. (1999), Leadership and the New Science: Discovering Order in a Chaotic World, Berrett-Koehler Publishers, San Francisco, CA.
Corresponding author
Jan Emblemsvåg can be contacted at: emblemsvag@yahoo.com