Shackle and Modern Decision Theory
Marcello Basili and Carlo Zappia
1. Introduction
There is ample evidence that economists' perspective on probability has been widening in
recent times. The traditional view is no longer a dogma. On the one hand, the view that
probability can be readily identified with stable relative frequencies of potential economic
events is in dispute. On the other hand, economists have questioned the assumption that
probabilistic concepts (whether referring to objective chances or subjectively derived) satisfy a
well-known set of axioms for consistency. In fact, the past few years have witnessed an astonishing
increase in the number of studies concerned with informational asymmetries, incomplete
contracts, non-expected utility theory, and so on. These studies do not amount to mere
applications of formal exercises in constrained maximisation; rather, they indicate that the
attitude of modern economic theory towards the question of ignorance is changing (for a
comprehensive survey see Hamouda and Rowley, 1996).
Needless to say, the traditional formulation of the problem of decision under
uncertainty is still dominant. Arguably, the central role played by Bayesian individuals
maximising expected utility in this formulation accounts for its dominance. It is also true that
almost all applied studies are still based on probability density functions, assumed to be
objectively or subjectively given to the actors involved. But, at the same time, an increasing
number of papers in leading mainstream journals are devoted to the study of alternative ways
of formalising uncertainty. Kelsey and Quiggin (1992) provided a comprehensive survey of
these developments and emphasised that their common component is represented by the
recognition that the kind of ignorance individual agents face in actual markets is different
from the rational ignorance usually assumed in Bayesian settings, where the decision-maker
has structural knowledge of the uncertain environment.1 As a result, in the field of theoretical
studies, growing scepticism has generated a number of alternative approaches intended to take
ignorance and vagueness into account.2
This paper discusses the role of George L. S. Shackle in fostering an unconventional
approach to individual decision making. Shackle's peculiar place in the history of economics
is attested by the fact that, over the period going from von Neumann and Morgenstern's
Theory of Games up to the early 1970s, he was the only critic of the probabilistic approach
to decision making who proposed an alternative formal corpus for dealing with uncertainty.

1. Even general equilibrium theorists have recently assumed as the standard framework one of
missing markets and the impossibility of complete insurance against future events (Magill and Quinzii
1996). Arrow and Hahn (1999) conjectured about the possibility of introducing endogenous
uncertainty into general equilibrium theory through the notion of unawareness.

2. To put it in Hamouda and Rowley's words (1997, xxvi): "while many textbooks retain and
stress the notions of probability as established by the beginning of the 1970s, two decades of active
innovation with vague and imprecise alternatives has undermined earlier myopia and produced
some means of translating common forms of imprecision into useful ingredients for modelling
frameworks, and thus generated a less hostile audience for unconventional views of uncertainty and
their application to real phenomena."
For Shackle, any effective modelling of actual decision making requires the explicit
recognition of the individual's mental activities, in so far as "the future is imagined by each
man for himself" and "this process of the imagination is a vital part of the process of decision"
(Shackle, 1972: 3). Shackle's view concerning the probabilistic approach to decision making
is best summarised by his late statement (1972: 15) that the standard (Bayesian) meaning of
probability

stands for a language for expressing judgements as to the weight that the individual
in choosing his conduct ought to give to each of a variety of rival hypotheses
concerning the outcome of some one course of conduct. This language however is not
merely a vessel but a mould. Form and content here, in an essential matter, are one. For
the language of (subjective) probability is only capable of utterance subject to an
all-important mental reservation. It assumes, implicitly, that the hypotheses that have been
enumerated, specified and presented for the assignment of weights are the only
relevant ones. Thus the language of subjective probability is confined to the
expression of a certain kind of meaning. And there are other meanings whose
exclusion would be arbitrary and senseless.
The aim of this paper is to analyse Shackle's non-probabilistic conceptualisation of
individual decisions under uncertainty from a specific viewpoint, namely, that of a possible
connection between Shackle's theory and one of the most interesting recent approaches to
decisions under uncertainty, the so-called non-additive probability approach originated in the
seminal papers of Schmeidler (1989) and Gilboa and Schmeidler (1994).
The first part of the paper presents the rationale of Shackle's rejection of probabilistic
decision-making and examines how the Shacklean agent chooses among different courses of
action. Shackle's main target (1949a and 1955) was the assumption that individuals act on the
basis of objective probability distributions. To replace that assumption, Shackle devised the
concept of potential surprise and insisted that a criterion for choice alternative to expected
utility maximisation was necessary. Savage's systematisation was still to come when Shackle
formulated his original contribution, which first appeared in his 1949 volume entitled
Expectation in Economics. Shackle later contended that his argument was not disproved even
when confronted with the subjective approach to probability. Yet, his viewpoint was criticised
even by critics of Savage's approach like Ellsberg (1961), who suggested that beliefs cannot
be represented by additive probability distributions but insisted that this did not necessarily
entail the rejection of probabilistic reasoning tout court. The second part of the paper
discusses Shackle's main argument against the use of probability, namely that it is wrong to
assume that a complete list of the possible outcomes conditioning different courses of action
is known by the agent, and puts it in relation to Ellsberg's critique of Savage. In the third part
of the paper the rationale of Shackle's argument is assessed in the light of recent
developments in modern decision theory.
The focus of the comparison between Shackle and modern decision theory is on the
so-called non-additive probability approach to decisions under uncertainty that was stimulated
by Ellsberg-like empirically observed behaviour (as surveyed in Camerer and Weber 1992)
and, not unlike Shackle's analysis, stresses the inability of agents to describe uncertain
environments through additive probability distributions (Gilboa 2004). In this respect,
non-additive probability theory calls to mind the Shacklean assertion that, in so far as the future is
the unpredictable consequence of creative choices made by individual agents, their
representation of the uncertain environment is necessarily incomplete. However, this theory
shows that the reliance of standard Bayesian theory on probabilistic judgements based on
point-probability estimates, a reliance that Shackle intended to oppose, cannot be viewed as a
justification for dispensing with probability calculus, as soon as room is made for non-additive
probability distributions. A discussion of the pros and cons of the parallel between
Shackle's theory and the non-additive developments is provided in the conclusions.
Before moving on, it is worth stressing that this paper does not aim to assess Shackle's
contribution to economics in general, but the formal theory he proposed to replace the
orthodox approach to individual choice under uncertainty. While this interpretative option
may imply a biased assessment of Shackle's thought, since it leaves aside the core questions
of the nature of time and expectations (Shackle 1958 and 1965), it must be noted that the
emphasis Shackle put on decision theory in his early writings was maintained in much of his
later work, including his late major opus, Epistemics and Economics (Shackle 1972).
Moreover, when asked to contribute papers to symposia and collected volumes in the 1970s
and 1980s, Shackle was always keen to point to his formal theory of decisions as his main
contribution to the development of a radically subjective approach, as shown by his
contributions to the cross-fertilisation between Austrians and post-Keynesians (for instance
see Shackle 1976 and 1983).3

3. See also the 1982 letter from Shackle to the organisers of the conference in his honour held at
the University of Surrey in 1984, reproduced in Earl and Frowen (2000, pp. xvi-xvii). Here Shackle,
asked to indicate which themes he would suggest for the discussion, insisted that it should bear on his
formal representation of the epistemic standing of alternative choices through a measure accounting
for possibility instead of probability.
The mainstream standpoint was made explicit by Arrow (1951, p. 417) with the
claim that Knight's uncertainties "seem to have surprisingly many of the properties of
ordinary probabilities", and axiomatised by Savage (1954) in his contribution to the
explanation of the subjective probability approach. To be sure, this position is still dominant
in textbooks and applied studies (for instance, see Hirshleifer and Riley 1992). But an
increasing number of economists have maintained that the Knightian distinction concerns
fundamental economic problems that cannot usefully be treated by means of a single additive
probability distribution, even if subjectively derived (Kelsey and Quiggin 1992 and
Camerer 1995). These themes have long been central to heterodox approaches. Most
representatives of a radically subjective approach to decision making, notably Austrians,
Keynesians, institutionalists and evolutionary economists, have typically argued that the way
in which mainstream economic theory deals with decisions under uncertainty cannot take
"genuine uncertainty" into account (as notable examples see Lawson 1985, Langlois 1994,
and Hodgson 2000).4
In this respect, Shackle (1949a) provided the best formulation of a central analytical
point, regularly taken up by later writers. Granted that the very construction of probability
calculus relies on a certain knowledge of the structure of the world, Shackle's contention was
that actual individuals do not have that knowledge. In Shackle's view, individual choices are
made between alternatives which are subjective representations of alternative future sequels
to action, not between future sequels themselves: as a result, choice is among "imagined
experiences". Most economic decisions, Shackle argued (1949b, p. 6), are "crucial
experiments", namely situations where the person concerned "cannot exclude from his mind
the possibility that the very act of performing the experiment may destroy forever the
circumstances in which it was performed". Shackle (1952 and 1953a) also insisted on the
claim that entrepreneurial activity has to confront the uniqueness of most of the choices to be
made: the fact that significant entrepreneurial decisions are typically non-replicable precludes,
in his view, the possibility of applying probabilities.5
4. The reference is not to all representatives of the schools of thought just mentioned. However,
every school mentioned has representatives that emphasise the adjective "radical" at least as much as
the noun "subjective".
5. In Shackle's (1952, pp. 23-24) terminology, the decision maker under uncertainty confronts
"non-divisible, non-seriable experiments". Divisibility and seriability are the characteristics Shackle
attributed to trials in a controlled environment such as the throw of a die. Probability, he argued,
can only deal with experiments constituted by a set of trials of this kind. Non-divisibility and
non-seriability were occasionally used by Shackle as synonyms for uniqueness and crucialness, but he
never made the distinction sharp enough to be univocal. Moreover, Shackle's insistence on the
inappropriateness of applying probability to unique events made his position unappealing in
comparison with both subjective probability theory and logical probability theory, as noted from the
outset by Weckstein (1952-1953).
The essence of Shackle's argument, then, was that "the use of probability calculus is
inappropriate simply because the conditions appertaining to its application just do not exist in
respect of decisions taken in an economic context", as summarised by his most authoritative
follower J. L. Ford (1990: 21). Shackle's argument, however, went beyond realism. Shackle
argued that, once the time dimension of choice was taken into account, accurate knowledge of
the structure of the world became logically impossible. Only "seriable" experiments can
justify the use of probability weights, but decisions over time are not related to trials repeated
in a set of identical circumstances. Despite Shackle's (1961, p. 105) reluctance to admit it,
this argument does not hold in a subjective interpretation of probability à la Savage.6 But
Shackle's viewpoint also implies that the individual agent is able to take decisions only
insofar as she creates her own choice set, and the choice set thus created is necessarily
incomplete. This view demands that the decision-maker is not given an exhaustive list of the
alternatives between which choice is made, that is, is not capable of enumerating all possible
contingencies, or states of the world, as usually assumed after von Neumann and Morgenstern
and Savage. It is worth stressing that the necessity of dealing with a non-complete choice set
bears a close resemblance to the perspective endorsed by modern critics of Savage's
subjectivist approach.7
That the list of alternatives between which choice is made is not exhaustive is the main
analytical point at the basis of Shackle's theory. To economists working in the
above-mentioned heterodox traditions of thought, this point has become an indispensable analytical
reference in their effort to represent decisions under "genuine uncertainty".8 On the basis of
this point, Shackle (1949a and 1961) developed a formal theory opposing the Bayesian
approach and intended to capture both the mental processes and the non-repetitive nature of
actual economic decisions. Shackle's argument was initially intended to resist the objective
frequency-ratio interpretation of probability, which he regarded as the mainstream view in the
late 1940s. However, he later maintained that the same argument holds with regard to all
those theorists who adopt some concept of numerical probability, whether the basis of
their concept of probability is explicitly the notion of frequency-ratio, or whether it is a notion
6. This point was noted from the start by Arrow (1951, p. 15) and has been recalled recently by
Runde (2000, p. 222).
7. For instance, Binmore and Brandenburger (1990: 144) contend that Savage's theory applies
only to a "closed universe", and define a closed universe as one in which "all the possibilities can be
exhaustively enumerated in advance and all the implications of all possibilities explored in detail so
that they can be neatly labelled and placed in their proper pigeonholes". Shackle (1953a, pp. 36-37)
used the same analogy when he defined an exhaustive system of contingencies as a "cabinet of
pigeon-holes" such that any conceivable outcome falls into one or the other of these pigeon-holes.
8. In particular, this is the crucial argument upon which the distinction between "rational
ignorance" and "radical ignorance" has been drawn in the Austrian tradition (Vaughn 1994). By this
distinction, the Austrians intend to distinguish Savage's approach from the Knightian tradition. On this
point see Zappia (1998).
9. It is worth noting that there is a striking analogy here with Keynes's argument against classical
or frequency probability put forward in the Treatise on Probability. Keynes argued that the
magnitude of the probability of two different arguments may be impossible to compare if one of the
arguments is not (weakly) included in the other. As Keynes (1921, p. 38) put it: some probabilities
"are not comparable in respect to more or less, because there exists more than one path, so to speak,
between proof and disproof, between certainty and impossibility; and neither of two probabilities,
which lies on independent paths, bears to the other and to certainty the relation of 'between' which is
necessary for quantitative comparison". In order to identify the points on these non-linear paths
Keynes used the term "non-numerical probabilities".
impossible.10 It is apparent that potential surprise does not have a metric like that of a
probability measure. The potential surprise values of the various outcomes do not add up to
one, and "there is no limit to the number of mutually exclusive hypotheses to all of
which simultaneously a person can, without logical contradiction, attach zero potential
surprise" (Shackle 1952, p. 31).11 In general, the potential surprise function assumes value
zero for a range of gains and losses near the status quo and (continuously) rises to its
maximum in approaching extreme gains and losses.12
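The shape just described can be given a minimal numerical sketch. The piecewise form below is invented for illustration (Shackle gives no formula of this kind): the constants Y_MAX, INNER and OUTER are hypothetical, standing for the ceiling of absolute disbelief, the half-width of the zero-surprise range around the status quo, and the point beyond which outcomes are judged impossible.

```python
# Illustrative sketch only: a potential surprise function y(x) over
# gains/losses x, zero on a "perfectly possible" interval around the
# status quo and rising continuously to a maximum expressing impossibility.

Y_MAX = 1.0   # hypothetical ceiling: absolute disbelief ("impossible")
INNER = 2.0   # hypothetical half-width of the zero-surprise range
OUTER = 10.0  # hypothetical point beyond which outcomes are ruled out

def potential_surprise(x: float) -> float:
    """Degree of disbelief attached to outcome x (gain if x > 0, loss if x < 0)."""
    d = abs(x)
    if d <= INNER:
        return 0.0       # no surprise: outcome regarded as perfectly possible
    if d >= OUTER:
        return Y_MAX     # maximum surprise: outcome judged impossible
    return Y_MAX * (d - INNER) / (OUTER - INNER)  # continuous rise in between

# Unlike probabilities, these values need not sum to one: many mutually
# exclusive outcomes can all carry zero potential surprise at once.
print([potential_surprise(x) for x in (0.0, 1.0, -1.5, 6.0, 12.0)])
```

Note how the non-additivity discussed in the text appears directly: the first three outcomes are mutually exclusive, yet each is assigned zero surprise without contradiction.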
The essentials of Shackle's theory are best illustrated by the example of an
entrepreneur's choice between uncertain prospects, for instance, between two
pieces of equipment she could acquire (Shackle 1953a, pp. 51-55). Since the
investment will not become profitable until a certain date in the future, an analysis of
expectations regarding the competing investment strategies is needed. Shackle maintained
that the choice of investment cannot be made through a comparison between the expected
discounted values of the streams of returns (after cost of purchase) of the two pieces of
equipment, that is, through the maximisation of an expectation function based on probability
distributions over future states of the world. In fact, suppose the entrepreneur is asked to make
an exhaustive list of the specified distinct events which can affect the value of alternative
investments, as required by the application of subjective probability. The entrepreneur,
Shackle (1972, p. 22) contended, "will in the end run out of time for its compiling, will realize
that there is no end to such a task", and will be driven to finish off his list with a residual
hypothesis, an acknowledgement that any one of the things he has listed can happen, and also
"any number of other things unthought of and incapable of being envisaged before the deadline
of decision have come: a Pandora's box of possibilities beyond reach of formulation".
The potential surprise function then amounts to Shackle's substitute for probability
distributions, to be used under uncertainty, that is, when crucial decisions are to be taken. It is
worth noting that the distinction between distributional and non-distributional variables shows
that Shackle's theory is essentially non-additive, a point of which he was well aware from the
beginning (Shackle 1949-50). This point was discussed by some of his critics in the 1950s,
who made explicit reference to the option of retaining probability, but dispensing with the
property of additivity, instead of dispensing with probability as such. This option was left
largely unaddressed by Shackle,13 but the use of potential surprise as an inverted expression
of epistemic standing is instrumental to the construction of a non-additive index, because, as
stated by Shackle himself even in a late assessment of the theory (1986, p. 287), by this
inversion of measurement "we rid ourselves of the crippling additive character of probability,
inherited from its origin in games of chance".
Shackle's next step was to apply his novel epistemic measure to decision making in
order to derive a criterion for choice. As just seen, the decision-maker has to choose among
alternatives on the basis of two elements: the possible gains and losses embedded in a
sequel to a specific action, called face-values, and a valuation of the possibility of the
gains and losses, the potential surprise. The latter element represents the degree of disbelief,
or implausibility, of the hypothesis that supports the sequel, ranging from 0 (absence of
disbelief) to a maximum value expressing impossibility (absolute disbelief). The
decision-maker who chooses among alternative sequels is supposed to reconsider the face-values of
each sequel by their degree of potential surprise.

But now, Shackle (1953a, p. 47) summarised, because the project is a non-divisible,
non-seriable experiment, "his [the entrepreneur's] various hypotheses as to its outcome are
mutually exclusive and therefore there is here no logical basis for the additive procedure by
which a mathematical expectation is assigned to a divisible experiment".14 Furthermore,
from a mathematical viewpoint, the traditional procedure is not available, because the
substitute for the probability measure, the potential surprise function, cannot be integrated in
the ordinary sense of the Riemann integral. Shackle then proposed to concentrate attention on
"those two or few of them [hypotheses] which out-rival all the others in their power of
stimulus", that is, he recommended concentrating on focus-values.15
In order to do so, Shackle defined a function φ, called the ascendancy function,
whose arguments are the face-values and the associated potential surprises implicit in a
sequel. The ascendancy function indicates the degree of stimulus that each
outcome/potential-surprise pair triggers in the decision maker. As a result, the decision maker is
assumed to be endowed with preference curves in the outcome/potential-surprise plane,
reflecting her attitude toward the uncertain environment. On this ranking of outcomes,
Shackle superimposed the particular potential surprise function that the decision-maker allots
to a particular project. As a result, the ascendancy function is used to select "the element or
elements upon which the decision-maker's whole attention will be concentrated" (Shackle
1961, p. 145).16
Following this procedure Shackle (1953a, p. 46) determined "the highest bids which
the particular project in question can make for the decision maker's attention and interest.
One of these bids is the most powerful suggestion of success and the other the most powerful
suggestion of disaster that the conception of project conveys". These extreme values represent
the limits of all possible outcomes of any feasible sequel, after the outcomes and their
potential surprise are valued in terms of the attitude the decision-maker shows towards the
uncertain situation. Finally, the criterion for choice Shackle proposed amounts to a rule of
thumb by which the decision maker takes into account both the best possible and the worst
possible outcomes of each sequel, respectively called focus-gain and focus-loss.17
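The selection of focus-values can be sketched numerically. The specific ascendancy function used below, A(x, y) = |x| · (1 − y/Y_MAX), is an invented illustration rather than Shackle's own: it merely satisfies the two properties the text attributes to φ, rising with the size of the gain or loss and falling with its potential surprise.

```python
# Hypothetical sketch of focus-value selection. A project is a list of
# (outcome, potential surprise) pairs; the focus-gain and focus-loss are
# the gain and loss whose pairs make the highest "bids" for attention.

Y_MAX = 1.0  # maximum potential surprise ("impossible")

def ascendancy(x: float, y: float) -> float:
    """Power of the (outcome x, potential surprise y) pair to arrest attention.
    Illustrative form: increasing in |x|, decreasing in y."""
    return abs(x) * (1.0 - y / Y_MAX)

def focus_values(pairs):
    """Return (focus-gain, focus-loss) for a project."""
    gains = [(x, y) for x, y in pairs if x > 0]
    losses = [(x, y) for x, y in pairs if x < 0]
    focus_gain = max(gains, key=lambda p: ascendancy(*p))[0] if gains else 0.0
    focus_loss = max(losses, key=lambda p: ascendancy(*p))[0] if losses else 0.0
    return focus_gain, focus_loss

# An imagined project: note the large gain (12) carries so much surprise
# that a smaller, more plausible gain (6) wins the entrepreneur's attention.
project = [(-8, 0.9), (-3, 0.0), (2, 0.0), (6, 0.4), (12, 0.95)]
print(focus_values(project))
```

The example shows the mechanism in the text at work: extreme but nearly impossible outcomes lose the bid for attention to moderate outcomes carrying little disbelief.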
This way of formulating a criterion for decisions can be phrased in the language of
modern decision theory. Shackle's decision-maker first orders prospect revenues of an act
(sequel) on the basis of both their value and their degrees of possibility, and then she
re-evaluates them by her specific (that is, relative to the project at hand) attitude towards the
uncertain environment. At this point, she takes into account the best and the worst outcomes
involved in the feasible act, instead of calculating an expected value of the act. Eventually
acts are ranked in terms of the best/worst pairs.

15. Shackle (1949-50) was aware that his proposal may have failed to gain acceptance because the
adoption of a non-additive procedure was really unconventional: "even people who have never asked
themselves how frequency-ratio probability can possibly have any relevance or meaning for the
analysis of uncertainty (i.e. ignorance) about the outcome of a non-divisible isolated experiment, are
predisposed towards some sort of additive solution, because it is by an additive procedure that we
reach knowledge of the outcome of divisible experiment". But, Shackle wondered, "if we discard the
notion of frequency ratio probabilities, ought we not also to discard our predisposition towards
additive solutions and start afresh with an open mind?" (Shackle 1949-50, p. 74).

16. Shackle assumed that the ascendancy function, φ(y(x), x), reflects a power to stimulate the
mind that is increasing as the possible gain/loss increases and decreasing as the potential surprise of
the gain/loss increases. The function is then maximised both for losses and gains with respect to the
potential surprise function. See Ford (1990).

17. It must be noted that the ascendancy function has a role that is analogous to the Keynesian
"weight of argument", since different degrees of stimulus select different focus values for the same
potential surprise function. Or, to put it differently, the potential surprise function is the (non-additive)
substitute for the (additive) probability function, but its significance in the decision process is
mediated by the ascendancy function. Among the followers of Shackle, Levi (1972) is possibly the
only one to insist on this point.
Shackle was well aware that he was proposing a heresy. "Those accustomed to think
in terms of the actuarial calculation of the result of a divisible experiment", Shackle (1953a,
pp. 42-43) admitted, "will find this point exceedingly hard to appreciate". Nonetheless
Shackle was clear that he was suggesting a procedure alternative to von Neumann and
Morgenstern's maximin criterion, which Wald (1945) had introduced as a decision rule for
decisions under ignorance, that is, when the decision-maker is unable to specify a probability
distribution over the possible outcomes.18 Shackle (1953a, p. 43) first observed: "the
fundamental hypothesis of von Neumann and Morgenstern's Theory of Games is that a
decision-maker will choose that action (strategy) whose worst possible outcome is the least
bad amongst the respective worst possible outcomes of all the actions open to him". Then he
added: when players do not have the comprehensive, exact and certain knowledge assumed
to hold in von Neumann and Morgenstern's setup, would the von Neumann and Morgenstern
hypothesis be more plausible "than that the decision-maker will choose that action whose best
possible outcome is the best amongst the respective best possible outcomes of all the actions
open to him"? With an unconventional twist, Shackle was maintaining that a maximax
criterion should be considered equally plausible, in case a single probability distribution is not
available to the decision-maker. But, he then concluded:

there is surely a third [criterion] which is more plausible, and of more general
analytical power, than either of the two former, namely, that he [the decision-maker]
will take into account both the best possible and the worst possible outcome of
each course of action and make these pairs of outcomes the basis of his decision.
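The three rules Shackle contrasts can be compared on a toy example. The acts and payoffs below are invented, and the pairwise rule shown last is only a deliberately simple stand-in for Shackle's pair-based criterion: it scores each act by a weighted mix of its best and worst outcomes (essentially the Hurwicz α-criterion, with the weight an assumed free parameter), whereas Shackle's own procedure works through the ascendancy function.

```python
# Toy comparison (illustrative only) of maximin, maximax, and a simple
# best/worst-pair rule, over acts described by their possible outcomes.

acts = {
    "safe":   [1, 2, 3],
    "risky":  [-15, 0, 20],
    "middle": [-2, 4, 8],
}

maximin = max(acts, key=lambda a: min(acts[a]))  # guard against the worst (Wald)
maximax = max(acts, key=lambda a: max(acts[a]))  # Shackle's provocative mirror image

def pair_score(outcomes, w=0.5):
    """Weighted mix of best and worst outcomes (w is a hypothetical parameter)."""
    return w * max(outcomes) + (1 - w) * min(outcomes)

pairwise = max(acts, key=lambda a: pair_score(acts[a]))  # both extremes matter

print(maximin, maximax, pairwise)
```

With these numbers the three rules select three different acts, which is the point of Shackle's objection: once no probability distribution is available, neither extreme rule is obviously more plausible than a criterion weighing the best and worst outcomes together.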
This excerpt from one of Shackle's early papers shows that he was receptive to the
contemporary developments in decision theory in the 1950s. The discussion about decision
rules was lively in the early 1950s and concentrated on situations in which a probability
distribution over the set of states of nature was not known. Luce and Raiffa (1957: p. 278)
devoted an entire chapter to the issue of which decision rule should be considered in decision
making under uncertainty, which they identified as a situation in which the decision maker is
"completely ignorant" as to which state of nature prevails.

18. Wald (1939) had examined the decision problem of a statistician who has to select the
distribution function generating a sequence of chance variables, when she knows that the distribution
function belongs to a given set of distribution functions, but there are no judgements of a probability
nature which make it possible to discriminate among elements of the set. Wald proposed as a criterion
for choice to minimise the maximum risk of accepting any hypothesis about the actual distribution
function. Following the appearance of von Neumann and Morgenstern's Theory of Games, Wald
(1945, pp. 279-280) then analysed this decision problem as a two-person zero-sum game where the
decision maker plays against Nature, and showed that the maximin strategy is a best response against
the least favourable a priori distribution Nature can select in choosing her strategy.
Shackle's theory was widely discussed in the 1950s. Arrow (1951) singled Shackle's
theory out as the only formalised one among the theories aiming to discard the probability
framework in their description of behaviour under uncertainty, among which Arrow included
both Knight's and Keynes's. Although Savage's Foundations of Statistics (Savage 1954) soon
started spreading its lasting influence, Shackle's theory continued to be discussed in major
journals as one of the main alternatives to the then-consolidating mainstream. The theoretical
economists, psychologists, and mathematicians who participated in this discussion, mostly
interested in the viability of a formally structured alternative to probabilistic theories of
behaviour under uncertainty, included Baumol, Carter, Edwards, Georgescu-Roegen, Gould,
Houthakker, Niehans, and Weckstein, among others. The debate over Shackle's approach
culminated in a conference on uncertainty and business behaviour held at the Carnegie
Institute of Technology in 1955 (Bowman 1958), where Shackle was given the honour of
presenting the key theoretical paper. In 1959, an entire issue of Metroeconomica was dedicated
to a symposium on Shackle's work.

But the formal apparatus was largely regarded as both lacking the virtue of
straightforwardness and arbitrary in the process of eliciting focus-gains and focus-losses.19
Moreover, a consistent axiomatic foundation for the theoretical structure was not offered.
Eventually, while Shackle's methodological arguments continued to be referred to by
critics of mainstream decision theory, the formal theory sank into oblivion.20 The next sections
of this paper show that, though Shackle's emphasis on alternative decision rules resulted in an
idiosyncratic view, it was sounder than was admitted for a long time.
19. Arrow's mostly dismissive comments are exemplary of a shared attitude. After reviewing
Shackle's construction, Arrow (1951, p. 38) commented: "Shackle has the sound impulse to base a
theory of uncertainty-bearing on the necessity of the human mind to simplify a problem; however,
his particular simplification seems to be purely arbitrary. This theory is not based on consideration
of rational behaviour, which Shackle specifically rejects, but on an alleged inability of the mind to
consider simultaneously mutually exclusive events". Furthermore, as noted by Ford (1990, p. 40), the
fact that the ascendancy function does not encapsulate the information contained in the whole range of
the potential surprise function may imply that, of two strategies evaluated on the basis of focus-values
only, a strategy is chosen that is stochastically dominated by the other.

20. For a notable exception, see Ford's (1983) attempt to axiomatise Shackle's theory.
probability theory to individual beliefs. This approach centres around two fundamental
assumptions. First, a complete list of possible future states of the world is available to the
individual a list which, in an interpersonal context, is common knowledge to all individuals.
The individual is endowed with subjective beliefs over the state space. These beliefs are
represented by a well-defined (additive) probability function. More precisely, the subjective
probability distribution of the individual is derived from axioms on a preference ordering over
uncertain prospects. As a result, individuals in uncertain settings are supposed to be able to
undertake expected-cost/expected-benefit analysis in information gathering, and hence to
reach an informational optimum. This is why Savage's individuals are termed
"probabilistically sophisticated" (Machina and Schmeidler 1992). As shown in the
previous section, this first assumption is a main target of Shackle's critique.
The second fundamental assumption of probabilistic decision making has to do with
the cognitive capabilities of the decision-maker. The processing of information consists in a
Bayesian process of updating individual beliefs (prior probability distribution), when she
receives a signal on the realisation of the state. This assumption follows from the implicit
hypothesis that individuals are rational in a strong sense, namely, that they manage to deduce
all logical propositions contained in the axioms of the theory. Of course, this second
assumption is highly questionable after Simon (1982), and it has been abandoned in bounded
rationality and evolutionary models since Nelson and Winter (1982). However, computational
and cognitive problems will not be discussed here, since these were outside Shackle's
research field, at least as far as the formal structure of his analysis was concerned.21
Obviously enough, Savage's representation theorem is of the utmost importance in the
development of modern decision theory. Von Neumann and Morgenstern (1947) represented
uncertainty by means of explicit probabilities, so that the objects of individuals' choices
consisted of lottery prizes, with given objective probability distributions over outcomes.
However, real-world uncertainty mostly consists of alternative events (or states), and choices
are typically bets which assign outcomes to these events. Savage derived the principle of
expected utility maximisation from a number of axioms over acts, that is, consequences that
depend on which of several uncertain states occurs. In Savage's approach, if preferences
satisfy certain axioms, then there are numerical utilities and probabilities that represent acts by
their subjective expected utility. In particular, in the tradition of the choice-theoretic approach
to subjective probability developed by Ramsey and de Finetti, Savage put forward an
axiomatic framework in which the subjective probabilities of individuals are elicited from
choices and satisfy the requirement of consistency. It must be stressed that the appropriateness
of a subjective interpretation of probability rests on the claim that subjective probability
distributions satisfy the rule of additivity. De Finetti introduced a consistency argument by
arguing that an individual's betting rates could not violate this rule, because if they did, another
individual would take advantage of the situation. This argument is known as the "Dutch
book" argument: a Dutch book is a sequence of bets with a non-positive outcome in every
state of the world and a negative outcome in at least one state. It can be shown that an
individual acting on the basis of a unique, additive probability distribution cannot be induced
to accept a Dutch book.22
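To make the argument concrete, here is a minimal Python sketch (the betting rates and the function name are hypothetical illustrations): when an agent's two-sided betting rates on an event and its complement are non-additive, an adversary can guarantee a profit against her in every state of the world.

```python
# Dutch book sketch: an agent posts two-sided betting rates on an event A and
# its complement. A unit bet on event E costs q(E) and pays 1 if E occurs.
# If q(A) + q(not A) < 1, buying both bets yields a sure profit against the
# agent; if the sum exceeds 1, selling both bets does the same.

def dutch_book_profit(q_a: float, q_not_a: float) -> float:
    """Sure profit available to an adversary betting against the agent."""
    total = q_a + q_not_a
    if total < 1:          # buy a unit bet on A and on not-A: pay total, collect 1
        return 1 - total
    if total > 1:          # sell both bets: collect total, pay out 1
        return total - 1
    return 0.0             # additive rates: no sure profit exists

# Non-additive rates, as in Ellsberg-type behaviour (hypothetical numbers):
print(dutch_book_profit(0.3, 0.5))  # sure profit of ~0.2 in every state
print(dutch_book_profit(0.4, 0.6))  # additive rates: no Dutch book
```

The sketch assumes, as De Finetti did, that the agent is ready to take either side of any bet at her posted rates; as noted in footnote 22 below, the argument does not extend to one-sided bets.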
A first, partial step towards a more convincing representation of
uncertainty can be sketched as follows. It is helpful to retain the Bayesian assumption that, in
principle, individuals are able to formulate a subjective probability distribution which enables
them to cope with any possible uncertain situation. What is questioned is either the reliability
of this distribution or its uniqueness. The issue of the reliability of the subjective probability
distribution emerged as a result of experimental evidence, which revealed systematic
violations of Savage's axioms, and in particular of the axiom on which consistency basically
hinges, the so-called sure-thing principle.23
violations are the Allais Paradox (Allais 1953) and the Ellsberg Paradox (Ellsberg 1961).
The challenge posed to expected utility theory by the Ellsberg Paradox is the
crucial one for our reconstruction, since it focuses on the belief side of
the decision problem and involves considerations about the individual's degree of confidence
in the probability assessment.24 Ellsberg reported that probabilities derived from choices may
not capture all the aspects of the individual attitude in uncertain environments. It appeared
that, even in relatively simple decisional contexts such as urn problems, confidence in
estimates of subjective probabilities was taken into account.25 The results of Ellsberg's
22. De Finetti (1974) assumed that the individual should be offered odds not only on certain
events or propositions, but also on the set of events covering the entire state space. The individual
should also be ready to accept either side of a bet: that is, she should be prepared to bet either for or
against any given event at given odds. Yet it seems more plausible that individuals are willing to
take only one side of a bet, and the argument for additivity of the subjective probability distribution does
not apply to one-sided bets (Shafer 1986).
23. As is well known, the sure-thing principle requires that, when choosing between acts, the
decision-maker does not take into account those states in which the acts yield the same consequence.
This axiom corresponds to the independence axiom in the von Neumann-Morgenstern framework.
24. The Allais Paradox is a seminal counter-example concerning the validity of expected
utility theory, intended to show that people may fail to maximise expected utility even in risky
situations, that is, when probabilities are known. However, this paradox does not necessarily
undermine the concept of probability per se. Indeed, initially it was considered of limited significance
from a normative point of view, because the original set-up was based on payoffs that are out of the
ordinary (see for instance Morgenstern's (1979, p. 180) reaction). The consensus that saw Allais's
results as mistakes rational agents simply cannot make faded away after Kahneman and Tversky
(1979) showed that examples of distortion of objective probabilities by individuals choosing among
risky prospects emerged in many different settings. On the genesis of the Allais Paradox see Jallais and
Pradier (2005).
25. Ellsberg formulated two experiments: a one-urn experiment and a two-urn experiment. In the
one-urn experiment an individual faces an urn containing 30 red balls and 60 balls in some
famous one-urn experiment, revealing a preference for betting on known rather than on
unknown probabilities, proved to be inconsistent with Savages sure-thing principle.
Moreover, the beliefs of the individual exhibiting these preferences could not be represented
through an additive probability distribution.26
As a result, these preferences contradict not only the expected utility theory, but also
every other theory of rational behaviour under uncertainty that assumes a unique additive
probability measure underlying choices. In Ellsberg's (1961: 654) words, "it is impossible, on
the basis of such choices, to infer even qualitative probabilities for the events in question
[and] to find probability numbers in terms of which these choices could be described even
roughly or approximately as maximising the mathematical expectation of utility".
The violations of the sure-thing principle pointed out by Ellsberg in his hypothetical
experiments have been confirmed by many laboratory experiments replicated in recent years
(Camerer and Weber 1992, and Luce 2000).27 Experimental subjects have confirmed that the
individuals willingness to act in the presence of uncertainty depends not only on the
probability of the event in question but also on its vagueness. As a matter of fact, the
experiments suggest that most agents prefer making unambiguous choices rather than
ambiguous ones, and that individual choices can be affected by the nature of ones
information concerning the relative likelihood of events. What is at issue, Ellsberg (1961:
657) clarified, "might be called the ambiguity of this information, a quality depending on the
amount, type, reliability and unanimity of information, and giving rise to one's degree of
confidence in an estimate of relative likelihood".
combination of black and yellow; there is no information whatsoever about the respective number of
black and yellow balls in the urn (unknown proportion). A ball is drawn from the urn. There are two
pairs of acts, X and Y, and X′ and Y′. Acts have consequences of 1 or 0 as follows: choosing X one
gets 1 if the ball is red and 0 if it is black or yellow; choosing Y one gets 0 if red or yellow and 1 if
black; choosing X′ one gets 1 if red or yellow and 0 if not; choosing Y′ one gets 0 if red and 1 if black
or yellow. Ellsberg reported that, among those asked, most people chose X instead of Y, and Y′
instead of X′, thus revealing a remarkable preference for betting on known probabilities of winning.
But the two pairs of acts have different consequences only when the yellow state occurs, and these
consequences are the same both for X and Y (the individual gets 0) and for X′ and Y′ (the individual
gets 1).
26. In the experiment referred to in the previous footnote, suppose p(r), p(b) and p(y) are the
subjective probabilities of each possible draw, with p(r)=1/3. Setting U(0)=0, Savage's subjective
expected utility implies that X is preferred to Y if and only if p(r)U(1) > p(b)U(1), or p(r) > p(b).
Likewise, Y′ is preferred to X′ if and only if p(b∪y) > p(r∪y). This contradicts the assumption that
probabilities are additive: in fact, if p(b∪y) = p(b)+p(y) and p(r∪y) = p(r)+p(y), then preferring Y′ to X′
implies p(b) > p(r) = 1/3, which conflicts with what is implied by preferring X to Y, that is,
p(b) < p(r) = 1/3. As a result, the individual acting as reported by Ellsberg assumes p(b∪y) > p(b)+p(y).
27. Ellsberg proposed his choice experiment to a number of well-known decision theorists,
including Savage himself. Savage was reported by Ellsberg (1961, p. 656) to be in the group of those
who violated the axiom and wished to persist in their choice. Savage had also violated the axioms in
the Allais Paradox, but in that case he clarified that, upon reflection, he regretted having done so
(Savage 1954, p. 103).
This point is best summarised by Hamouda and Rowley (1996, p. 49) as follows: "After three
decades of the presentation of his paradox, we should acknowledge the commonsense in Ellsberg's
two modest suggestions that we avoid E(xpected) U(tility) axioms in the certain, specifiable
circumstances where they do not seem acceptable and that we should, in these circumstances, give
more attention to alternative decision rules and non-probabilistic description of uncertainty".
29. Notably, Heath and Tversky (1991) showed that not knowing important information is
upsetting for experimental subjects and can make them shy away from taking either side of a bet, a
fact which adds doubt to the rationale behind the Dutch book argument.
rationale behind the refusal of scholars working in the Keynesian and Austrian traditions to
discuss Ellsberg's point as a relevant basis for formalised alternatives to the mainstream. But,
as will be detailed in the next section, there now exist studies originating from the Ellsberg
Paradox which imply a different interpretation of the question at stake. Moreover, a number
of aspects indicate that an alternative view about the similarities between Shackle's and
Ellsberg's approaches can be put forward.
First, Ellsberg intended to challenge Savage's view that individuals' beliefs about
unknown states of the world can be adequately represented by additive probability
distributions. In fact, the behaviour reported by Ellsberg is consistent with beliefs that do not
satisfy additivity.30 Shackle's point that probability is a non-distributional variable hints at the
same question. As noted in the previous section, Shackle's distinction between distributional
uncertainty variables, which can be used if the list of conceivable events is exhaustive, and
non-distributional variables, which should be used when the list is known to be incomplete
(e.g., when the emergence of an unanticipated event is deemed possible), reflects an
essentially non-additive characteristic of his theory. In fact, Shackle distinguished between
the cases in which it makes sense to use frequency-ratio probabilities as a measure of
acceptance, so that the appropriate procedure is the additive one, and "crucial" decisions,
which require a different approach. To tackle the latter, Shackle (1949-50, p. 71) contended,
the traditional procedure of adding up the probabilities of alternative outcomes is "an
irrational compromise". Contrary to Shackle's view, Ellsberg insisted that the dismissal of
additivity does not entail the abandonment of probability measures in the representation of
beliefs.31 But both Ellsberg and Shackle used the issue of non-additivity to justify the need for
choice criteria alternative to the maximisation of expected utility, as re-affirmed by von
Neumann and Morgenstern and Savage in the tradition of Daniel Bernoulli and classical
probability.32
A second aspect linking Ellsbergs and Shackles approaches is the following. While it
is true that the urn problems originating the Ellsberg Paradox do not allude to a lack of clarity in
the description of events and consequences of actions, Ellsberg's challenge to Savage's
approach originates in the idea that decisions under uncertainty imply a fuzzy perception
of the likelihood of events on the part of individual agents. After discussing the urn problems,
in fact, Ellsberg noted that "the ambiguity surrounding the outcome of a proposed innovation,
a departure from current strategy, may be much more noticeable", thus hinting at uncertainty
proper. The decision-maker to which Ellsberg's analysis refers is an individual whose
subjective knowledge about the environment is inconsistent with a single additive probability
distribution. Her subjective knowledge may be represented by more than one probability
distribution, but she is not endowed with a unique second-order probability distribution over
the set of possible probabilities, as discussed by Marschak (1975).33 Similarly, Shackle
justified the idea that a decision maker is forced to simplify the expectational elements she
confronts by the fact that a crucial decision cannot be taken as if the (subjective) probabilities
associated with the possible events conditioning this decision were available. In a situation in
which many, in principle infinitely many, priors are possible, the Shacklean entrepreneur does
not single out one prior, as subjectivists like Ramsey and Savage would recommend, but reacts
by reducing the possibilities she faces to a kind of point estimate for the imagined gain and
loss associated with the strategy.
As an aside to this second similarity between Shackle and Ellsberg, a brief reference to
Keynes's work is worth adding. Normally, the Keynesian origins of Ellsberg's approach are
not given adequate importance. But they are documented in the 1962 doctoral thesis, in
which Ellsberg made ample reference to Keynes's Treatise on Probability. Ellsberg (2001,
Ch. 1) admitted that his notion of ambiguity was closely related to Keynes's notion of "weight
of argument", and in particular to Keynes's insistence that the weight of argument makes the
comparison of probabilities impossible in certain cases. Moreover, he mentioned that Keynes
(1921, 53-54) had discussed the two-colour, two-urn example with the intent of giving a critical
account of the limits of the principle of insufficient reason, in accordance with the idea that in
33. Even if he did not tackle it, Savage was aware of the relevance of this issue when he stated:
"there seem to be some probability relations about which we feel relatively 'sure' as compared with
others. When our opinions, as reflected in real or envisaged action, are inconsistent, we sacrifice the
unsure opinions to the sure ones. The notion of 'sure' and 'unsure' introduced here is vague, and my
complaint is precisely that neither the theory of personal probability, as developed in this book, nor
any other device known to me renders the notion less vague. There is some temptation to introduce
probabilities of a second order ... But such a program seems to meet insurmountable difficulties
[because] once second order probabilities are introduced, the introduction of an endless hierarchy
seems inescapable" (Savage 1954, pp. 57-58). Ellsberg (1961, p. 658) conceded that Savage himself
alludes to this sort of judgement and notes as a difficulty with this approach that no recognition is
given to it. However, Ellsberg also compared Savage's point with Knight's perspective. According to
Ellsberg, Knight asserts what Savage's approach tacitly denies, that such overall judgments may
influence decisions.
x the index ρE(x) + (1-ρ)min(x), and then choose the act associated with the maximum value of
the index.36
In the 1962 thesis Ellsberg discussed a different criterion to which, he claimed, he had not
paid enough attention in the QJE essay: the Hurwicz criterion. Hurwicz (1951) too had introduced
a parameter intended to obviate the pessimism of Wald's suggestion to concentrate
only on the states having the worst consequences. The Hurwicz criterion selects the minimum, m, and
the maximum, M, payoff to each given action and then associates the index αM + (1-α)m to
each action. Of any two actions, the one with the maximum index is preferred.37 In his
doctoral thesis, then, Ellsberg retained the idea of a set of distributions over the states of the
world, in accordance with Hodges and Lehmann, but applied the Hurwicz criterion to the
restricted set of possible distributions. The new index, which he called the "restricted
Bayes/Hurwicz criterion", was: ρE(x) + (1-ρ)[αmax(x) + (1-α)min(x)]. Ellsberg (2001, p. 195)
presented a taxonomy of the possible criteria in terms of the two parameters ρ and α, that is, in
terms of the degrees of confidence and optimism, respectively, characterising the epistemic
state of the decision-maker.
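Under these definitions the criteria are easy to compute; a minimal Python sketch (the payoffs, priors and function names are invented for illustration) recovers the special cases just discussed:

```python
# Restricted Bayes/Hurwicz index: rho*E(x) + (1-rho)*[alpha*max(x) + (1-alpha)*min(x)],
# where E(x) is the expected payoff under the best-guess prior and max/min are
# taken over the set of retained priors. Special cases: rho=1 gives expected
# utility; rho=0 with alpha=0 gives Wald's maximin; rho=0 with alpha=1 the maximax.

def expected(payoffs, prior):
    return sum(p * x for p, x in zip(prior, payoffs))

def restricted_bayes_hurwicz(payoffs, best_guess, priors, rho, alpha):
    e = expected(payoffs, best_guess)
    evs = [expected(payoffs, p) for p in priors]   # expected payoffs over the set
    return rho * e + (1 - rho) * (alpha * max(evs) + (1 - alpha) * min(evs))

# Hypothetical act over two states, with two retained priors:
payoffs = [10.0, 0.0]
best_guess = [0.5, 0.5]
priors = [[0.2, 0.8], [0.8, 0.2]]

print(restricted_bayes_hurwicz(payoffs, best_guess, priors, rho=1.0, alpha=0.0))  # 5.0, pure expected value
print(restricted_bayes_hurwicz(payoffs, best_guess, priors, rho=0.0, alpha=0.0))  # 2.0, Wald's maximin
print(restricted_bayes_hurwicz(payoffs, best_guess, priors, rho=0.0, alpha=1.0))  # 8.0, the maximax
```

Varying ρ and α over the unit square traces out Ellsberg's taxonomy of criteria between these corner cases.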
In his discussion of Hurwicz, Ellsberg (2001, p. 163, 190) noted the analogies between
Hurwicz's rule and Shackle's analysis. Ellsberg claimed that Shackle's decision rule amounts
to a generalisation of Hurwicz's criterion: the Hurwicz rule corresponds to a special case in
which the indifference sets in Shackle's gambler indifference map are parallel straight lines,
with slope dependent on α. In Ellsberg's thesis, thus, it is argued for the first time that
Shackle's criterion for ranking acts is nothing but a generalised version of the Hurwicz criterion
or, to put it the other way round, that Hurwicz's decision rule is but a special case of Shackle's. As noted
by Levi (2001, xxiii) in his introduction to the recent reprint of Ellsberg's thesis, it is regrettable
that Luce and Raiffa did not note this in their influential survey of decision making under
uncertainty (Luce and Raiffa, 1957, Chap. 13). In fact, Hurwicz's criterion, better known as the
Arrow-Hurwicz criterion after being restated in a joint paper by Arrow and Hurwicz twenty
years later (1972), originated the modern literature on decision making under complete
ignorance (see for instance Kelsey 1993).38
36. Here E(x) is the expected payoff to the act corresponding to the best-guess distribution,
min(x) is the minimum expected payoff to the act as the probability distribution ranges over the set of
non-unacceptable distributions, as in Wald (1945), and ρ represents the degree of confidence in the
best-guess distribution. Following Hodges and Lehmann, Ellsberg called the criterion "restricted
Bayes". The adjective "restricted" refers of course to the restricted set of plausible priors.
37. The parameter α represented the degree of optimism used by the decision maker in evaluating
alternative acts. If α=0 Hurwicz's criterion is Wald's maximin, while if α=1 it is the maximax
criterion.
38. A puzzling element concerning the historical reconstruction is worth adding. The opening
footnote of Arrow and Hurwicz's 1972 paper acknowledged that Hurwicz had attained the basic result of
the essay in 1951, while Arrow's contribution (circulated in a working paper in 1953) was said to
consist in a considerably simplified proof of Hurwicz's result. The 1972 version, Arrow and Hurwicz
added, did not represent a major development over the working papers of the 1950s. Moreover, Arrow
and Hurwicz (1972, p. 2) recognised in the paper that Shackle's criterion could be interpreted as a
version of Hurwicz's criterion. But this was conceded only twenty years after the dismissive comments
originally provided by Arrow himself (1951).
39. For instance, in the three-colour example reported in the previous section, individuals prefer
acts with a known probability of winning. Like Ellsberg, Schmeidler (1989, p. 571) did not find the
Bayesian assumption of a single additive probability distribution plausible, because in many situations
there is not enough information to generate the Bayesian prior: as a result the probability attached to
an uncertain event does not reflect the heuristic amount of information that led to the assignment of
that probability.
40. It is worth stressing here that the representation of beliefs through real-valued set functions
which do not necessarily satisfy additivity is not new. In particular, belief functions were introduced
by Dempster (1967) and Shafer (1976). Although these theories were not directly related to decision
under uncertainty, it turned out that belief functions are a special case of the convex capacities
discussed in this section (Gilboa and Schmeidler 1994). Shafer motivated the introduction of belief
functions on constructive grounds, but attached no decision theory to them. Shafer referred to Shackle's
theory as a special case of belief function theory (on the Shackle-Shafer connection, see Fioretti 2004).
Consider a decision problem in which the states of the world included in the model do
not exhaust the actual ones. A description of the world can be considered as a mis-specified
model if certain states are not explicitly included. If an individual agent does not know how
many states have been omitted, her beliefs can be represented by a non-necessarily-additive
measure, or a capacity. A capacity, ν, is a monotone measure which, like probability, is
normalised to 1 on the full set and 0 on the null set. But, unlike probability, the sum of the
capacities of two disjoint subsets may differ from the capacity of their union.41 If
the sum is strictly less than the capacity of the union, the capacity is said to be convex.42 The
convexity of the capacity is a property suggested by the Ellsberg Paradox. As shown in the
previous section, the Ellsberg Paradox demonstrates that the belief in the (unambiguous)
event of drawing a black or a yellow ball strictly exceeds the sum of the beliefs in the
(ambiguous) events that a black ball is drawn or a yellow ball is drawn. In other words, to
reflect the individual's attitude to uncertainty the model includes the possibility that the
subjective probabilities attributed, for instance, to two mutually exclusive and exhaustive events
do not necessarily add up to 1. This provides a basis for dealing with situations in which the
uncertainty of the individual involves the existence of a third event, whose occurrence had no
probability attached at the outset.
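Such beliefs can be written down directly as a capacity; the following Python sketch (all numerical values are hypothetical illustrations) assigns ν over the three-colour state space of the Ellsberg urn and checks the non-additivity just described:

```python
# A capacity on the Ellsberg urn state space {r, b, y}: normalised to 0 on the
# empty set and 1 on the full set, and monotone, but not additive. The ambiguous
# singletons b and y get low weight, while the unambiguous union {b, y}
# (60 of the 90 balls) gets its known probability 2/3 -- hypothetical numbers.

nu = {
    frozenset(): 0.0,
    frozenset("r"): 1 / 3,       # known: 30 of 90 balls
    frozenset("b"): 0.2,         # ambiguous, discounted below 1/3
    frozenset("y"): 0.2,
    frozenset("rb"): 0.6,
    frozenset("ry"): 0.6,
    frozenset("by"): 2 / 3,      # unambiguous union: known to be 2/3
    frozenset("rby"): 1.0,
}

# Non-additivity on the disjoint ambiguous events, as reported by Ellsberg:
print(nu[frozenset("by")] > nu[frozenset("b")] + nu[frozenset("y")])  # True

# Monotonicity check: nu(s1) <= nu(s2) whenever s1 is a subset of s2.
print(all(nu[s1] <= nu[s2] for s1 in nu for s2 in nu if s1 <= s2))    # True
```

The gap between ν(b∪y) and ν(b)+ν(y) is exactly the room the model leaves for a third, initially unanticipated, possibility.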
But since ν is a non-additive measure, the standard integration of a real-valued function with
respect to ν is impossible and expected utility can no longer be computed. Schmeidler (1989)
showed that the proper integral for a capacity is the Choquet integral and proposed that
decision-makers behave as if they maximise the Choquet integral of their utility function. The
Choquet integral with respect to a capacity was known in the mathematical literature since
Choquet (1954) as a generalisation of the Lebesgue integral to a non-additive measure.
Schmeidler applied the concept to decision under uncertainty and used the Choquet integral of
the non-additive measure as a generalisation of the mathematical expectation usually used in
expected utility models. This new procedure is usually referred to as Choquet expected utility
(CEU). Furthermore, Schmeidler provided an axiomatic foundation for CEU by means of a
representation theorem that, in the same vein as Savage's approach, made it possible to
identify the non-additive probability uniquely, and the utility function up to a positive linear
transformation.43
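For a finite state space the Choquet integral takes a simple form: rank the states by decreasing utility and weight each utility by the increment of the capacity over the growing upper-level sets. A minimal Python sketch (the capacity values and function name are hypothetical):

```python
# Choquet expected utility over a finite state space: sort states by decreasing
# utility, then accumulate u_i * [nu(A_i) - nu(A_{i-1})], where A_i is the set
# of the i best states. With an additive nu this reduces to ordinary expected
# utility.

def choquet(utilities, nu):
    """utilities: dict state -> utility; nu: dict frozenset -> capacity value."""
    ranked = sorted(utilities, key=utilities.get, reverse=True)
    ceu, prev = 0.0, 0.0
    upper = set()
    for s in ranked:
        upper.add(s)
        weight = nu[frozenset(upper)] - prev   # decision weight of this increment
        ceu += utilities[s] * weight
        prev = nu[frozenset(upper)]
    return ceu

# Hypothetical convex capacity on two states:
nu = {frozenset("a"): 0.3, frozenset("b"): 0.4, frozenset("ab"): 1.0}
print(choquet({"a": 10.0, "b": 2.0}, nu))  # 10*0.3 + 2*(1 - 0.3) = 4.4
```

With the convex capacity above, the good outcome receives only its own capacity weight 0.3, while the bad outcome absorbs the residual weight 0.7: the uncertainty-averse tilt that a convex capacity builds into the evaluation.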
Likewise, Zadeh's (1978) theory of fuzzy sets has been shown to be compatible with the non-additive
probability approach (Wakker 1990).
41. More formally, let Ω = {w1,...,wn} be a non-empty set of states of the world and let S = 2^Ω
be the set of all events. A function ν: S → R+ is a non-necessarily-additive probability measure, or
capacity, if ν(∅) = 0 and ν(Ω) = 1, and if for all s1, s2 ∈ S such that s1 ⊆ s2, ν(s1) ≤ ν(s2).
42. That is, a capacity ν is convex if for all s1, s2 ∈ S, ν(s1∪s2) ≥ ν(s1) + ν(s2) − ν(s1∩s2).
43. For a non-technical introduction see Gilboa (2004). For a non-negative function f, the Choquet
integral with respect to a capacity ν is ∫ f dν = ∫0∞ ν({ω : f(ω) ≥ t}) dt, which reduces to the
ordinary (Lebesgue) integral when ν is additive.
44. A number of important applications have been proposed in the fields of financial markets
(Epstein and Wang 1994, and Mukerji and Tallon 2001), macroeconomics (Mukerji and Tallon 2004),
and in game-theoretical contexts (Eichberger and Kelsey 2000). For a review of these developments
see Basili (2001) and Gilboa (2004).
embedded in a decision model with an additive measure (Gilboa and Schmeidler 1994). In
this model, the enlarged state space includes all possible missing states (namely, the
objectively given states in the sense of Savages state space). As a result, it is possible to
relax the non-additivity of a measure at the expense of the dimension of the decision model.
In the representation of uncertainty through a non-additive measure on the state space, a
relationship between the epistemic status of the individual (that is, her awareness of both
incomplete knowledge and the limited reliability of likelihood assessments) and her choice is
implicitly assumed. Mukerji (1997: 25) clarifies this relationship by means of a two-tiered
state space model which embeds a space on which the individual assigns primitive beliefs
and a space of payoff-relevant states, i.e. states on which the available acts are directly
defined.45 At first the individual assigns her beliefs on the state space she perceives as simpler,
called the "primitive world". Then she infers beliefs about the events to which the outcomes of
acts are directly related. The inferred space is called the "derivative world". It is straightforward to
interpret the primitive and derivative worlds as Savage's small and grand worlds,
respectively.46
The primitive state space (the small world) is a set of objects of which the individual
has direct experience, clear intuition and empirical knowledge. Belief assessments on this
state space express this confidence. Likelihood assessments on the derivative world (the grand
world) are deduced from an implication mapping which represents the individual's
knowledge of the association between the two worlds. As a result, the decision-maker's
knowledge about the likelihood of an event in the derivative frame is constituted by the sum
of the beliefs assigned to those elements of the primitive frame whose implications are
sub-events of the event in question. Depending on the epistemic condition of the individual, the
beliefs on the derivative frame may have a non-additive representation. In fact, if the
individual transfers a likelihood assigned to an event in the small world to an event in the
grand world, the implication is that she is unable to distribute beliefs across the elements of
the grand world. It follows that the non-additivity of subjective probability measures becomes
45. See Gilboa and Schmeidler (1994) for the formal counterpart of this analysis. Mukerji's
two-tiered state space is mathematically isomorphic to the enlarged space of Gilboa and Schmeidler.
46. In Savage's theory the grand world is the complete list of states which are of concern to an
individual. The small world is a construction derived from a partition of the grand world into
subsets, or small-world states, which are events of the grand world. Savage (1954: 9)
maintained that an individual has to confine her attention to a relatively simple situation in almost all
her decisions; that is, in practice, the individual is concerned with a small world, which is "derived
from a larger by neglecting some distinctions between states, not by ignoring some [grand-world]
states outright". By considering a small world as crucial for her decision, the individual describes
states of the world and consequences in limited detail. It is worth noting that the individual can
consider a progressively more refined and detailed small world, until she arrives at the grand world
which takes everything into account. But Savage (1954: 16) added: to assume that one envisages
"every conceivable policy for the government of his whole life (at least from now on) in its most
minute details, in the light of a vast number of unknown states of the world" is "utterly ridiculous".
As Savage (1954: 82-84) was keen to claim, subjective expected utility theory should be applied to
small worlds only, that is, when all the possibilities can be exhaustively enumerated in advance, and all
the implications of all possibilities explored in detail so that they can be labelled and placed in their
proper position. He stressed the practical necessity of confining attention to, or isolating, relatively
simple situations in almost all applications of the theory of decision developed in his Foundations. In
fact, Savage's rejection of the positive critique of his theory, namely that real people frequently and
flagrantly behave in disaccord with utility theory, is mainly based on the distinction between the
grand and the small world (Savage 1954: 100-101). Savage also admitted that a small world is
completely satisfactory only if it is actually a microcosm, that is, only if it leads to a probability
measure and a utility well articulated with those of the grand world (Savage 1954: 88).
making, as in his urn problems. Shackle introduced the focus-gain/focus-loss pair, which
contains the Hurwicz criterion as a special case if the focus values are interpreted as the
maximum and minimum payoffs to a given action. Arrow and Hurwicz (1972, p. 2) recognised
that their arguments and conclusions are much closer to Shackle's (1949) than to those of
Ramsey (1931), de Finetti (1937) and Savage (1954).
As noted earlier, the Arrow-Hurwicz criterion originated the modern literature on
decision making under complete ignorance (Kelsey 1993, and Barrett and Pattanaik 1993).
Savage's subjective expected utility theory applies to situations in which the individual acts as
if she had a single reliable probability distribution over the states of the world (elicited from
choices). Conversely, Arrow and Hurwicz discussed the case in which the individual has no
idea at all about the likelihood of states, the case formerly introduced by Wald. More recently
Gilboa and Schmeidler (1989) proposed an alternative to subjective expected utility designed
to encompass the case between these two extremes, that is the case in which the individual
has opinions about the likelihood of different states, but she is not able to assign exact
probabilities to them. According to their theory the decision maker has a convex set of
subjective probability distributions, which expresses the range of probabilities she considers possible.
Since the subjective probability is not unique, there is a set of expected utilities for each
action. Gilboa and Schmeidler then propose the following criterion: an action a is preferred to
b if and only if the minimum possible expected utility of a is greater than the
minimum possible expected utility of b. Following Gärdenfors and Sahlin (1982), they called this
criterion maximin expected utility (MEU). If the set of probabilities consists of only a single
probability distribution, maximin expected utility coincides with subjective expected utility.
If, on the other hand, it consists of all possible probability distributions it coincides with
Walds maximin.
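The MEU criterion can be illustrated with a small numerical sketch. All utilities and priors below are invented for illustration, and the convex set of priors is represented by a finite list of its extreme points:

```python
import numpy as np

# Illustrative setup: two actions over three states (all numbers assumed).
u_a = np.array([10.0, 6.0, 2.0])   # utility of action a in each state
u_b = np.array([8.0, 8.0, 1.0])    # utility of action b in each state

# Extreme points of the set of subjective probability distributions.
priors = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.3, 0.3, 0.4],
])

def meu(u, priors):
    """Maximin expected utility: the worst expected utility over the priors."""
    return min(p @ u for p in priors)

# Under MEU, a is preferred to b iff meu(u_a, priors) > meu(u_b, priors).
# With a single prior the criterion reduces to subjective expected utility;
# with the set of all distributions it reduces to Wald's maximin, min(u).
```

Here the worst-case expected utility of a (5.6) exceeds that of b (5.2), so a is chosen even though b does better under the first prior.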
There is a close relationship between MEU and CEU that needs emphasising. Gilboa
and Schmeidler (1994) showed that there is an isomorphism between a non-additive
probability measure and a convex set of additive probability measures. If the convex capacity
that represents the decision-maker's beliefs has a non-empty core of additive measures, then
the CEU with respect to the capacity equals the maximin expected utility over the set of additive
measures, that is, the subjective expected utility with respect to the least favourable probability
distribution in the core. More recently, this relationship between a set of multiple priors and a
non-additive probability has been generalised in order to overcome the extremely pessimistic
attitude expressed by MEU (and CEU). MEU, in fact, treats the non-additive
probability as the lower bound of what the real additive probability might be, and
concentrates on the worst cases, regardless of any consideration of the individual's
degree of confidence in her probability assessment.
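The equality between CEU under a convex capacity and maximin expected utility over its core can be checked numerically. The sketch below uses an invented convex capacity (a squared distortion of a uniform probability) and Shapley's characterisation of the core's extreme points as marginal-worth vectors; it is an illustration, not part of the original argument:

```python
from itertools import permutations
import numpy as np

states = [0, 1, 2]
base = np.array([1/3, 1/3, 1/3])

def v(A):
    # Convex distortion g(x) = x**2 of the base probability yields a
    # convex (supermodular) capacity (assumed for this sketch).
    return sum(base[s] for s in A) ** 2

def choquet(u):
    """Choquet integral of the utility vector u with respect to capacity v."""
    order = np.argsort(u)[::-1]          # states ranked from best to worst
    total, prev = 0.0, 0.0
    for i in range(len(order)):
        A = order[:i + 1]                # the i+1 best-ranked states
        total += u[order[i]] * (v(A) - prev)
        prev = v(A)
    return total

def core_vertices():
    """Extreme points of the core of a convex capacity: one marginal-worth
    vector per ordering of the states (Shapley's result)."""
    verts = []
    for perm in permutations(states):
        p, prev = np.zeros(len(states)), 0.0
        for i, s in enumerate(perm):
            p[s] = v(perm[:i + 1]) - prev
            prev = v(perm[:i + 1])
        verts.append(p)
    return verts

u = np.array([9.0, 5.0, 1.0])
ceu = choquet(u)
meu_over_core = min(p @ u for p in core_vertices())
assert abs(ceu - meu_over_core) < 1e-9   # CEU = maximin EU over the core
```

The two numbers coincide precisely because the capacity is convex: the Choquet integral shifts probability mass toward the worst-ranked outcomes, which is exactly the worst-case concentration just described.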
This limitation is overcome by a generalised version of MEU called α-maxmin
expected utility (α-MEU). In this theory a crucial role is played by the parameter α ∈ [0,1],
which expresses, exactly as in Arrow-Hurwicz, the decision maker's ambiguity attitude. The
α-MEU criterion emphasises the decision maker's degree of ambiguity perception, and involves a
valuation of the quality of the decision maker's information. In a recent paper Ghirardato,
Maccheroni and Marinacci (2004) establish a connection between the opposite valuations (the
best and the worst) of each act, given a set of relevant probability distributions (priors). In the
α-MEU model the functional has a sandwiching property, because it lies between the
decision maker's worst-scenario and best-scenario evaluations. The assumed
constant function α is the ambiguity aversion index: the higher (lower) it is, the bigger
(smaller) the pessimism in the decision maker's evaluation. If α is equal to one, then a
standard CEU functional is obtained.48 Although Shackle did not explicitly specify the functional
of the criterion he was proposing, it is straightforward to note that it overlaps with the α-MEU
functional.
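The α-MEU functional just described can be written out as a short sketch; the priors and utilities below are invented, and α simply weights the worst expected utility against the best:

```python
import numpy as np

# Illustrative priors and utilities (assumed numbers).
priors = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
])
u = np.array([10.0, 6.0, 2.0])

def alpha_meu(u, priors, alpha):
    """alpha-MEU: mix the worst and the best expected utility over the priors."""
    evs = priors @ u                      # expected utility under each prior
    return alpha * evs.min() + (1 - alpha) * evs.max()

# alpha = 1 collapses to maximin expected utility (maximal pessimism, the
# CEU case for a convex capacity); alpha = 0 gives the maximax evaluation;
# intermediate alphas sandwich the value between the two scenarios.
```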
5. Concluding remarks
Many critics of the mainstream have maintained that the assumption of a subjective prior
probability distribution does not allow a meaningful distinction between risk, or measurable
uncertainty in Knight's words, and proper uncertainty, or unmeasurable uncertainty in
Knight's words. Shackle, in particular, emphasised not only that knowledge is bounded, but
that the bounds are necessarily imprecise. As recently recalled by Loasby (2000: 5), Shackle's
view of the use of probability in decision theory amounts to the point that the imposition of
probability distributions, whether subjective or supposedly objective, on closed sets is a
pretence of knowledge.
It is our contention that the argument developed in this paper and outlined at the end of
the previous section is worth considering for two reasons. First, the paper has dealt with an
issue, the representation of individuals' assessments, that is both the main analytical point
underlying Shackle's theory and the main analytical reference for writers dealing with the
problem of how to represent decisions under genuine uncertainty. As previously mentioned,
Shackle's legacy has been taken up especially by Keynesians, institutionalists, and Austrians.
For instance, in the Austrian tradition (Langlois 1994 and Vaughn 1994) this argument plays
a crucial role, in so far as it serves to distinguish between rational ignorance and radical
ignorance, and hence to differentiate Savage's approach from Knight's approach. In this
regard, this paper has documented that important developments in modern decision theory
take Shackle's issue seriously.
48
V(f) = α min_{P∈C} E_P(u∘f) + (1−α) max_{P∈C} E_P(u∘f), where C denotes the set of priors.
The second ground for the relevance of our argument is that we have shown that the issue
of the reliability of probability distributions can be dealt with through a relaxation of the
hypothesis of additivity of probability distributions, contrary to Shackle's view that the
probability calculus should be dismissed. The use of non-additive measures opens up the
possibility of dealing with situations in which an individual is aware that she misses some
relevant information, and of doing so without abandoning the realm of probability theory, but
conforming to a tradition of thought that interprets probability in a more open fashion than
strict Bayesianism.
REFERENCES
Allais, M. (1953), Le comportement de l'homme rationnel devant le risque: critique des
postulats et axiomes de l'école américaine, Econometrica, 21: 503-546.
Arrow, K. J. (1951), Alternative approaches to the theory of choice in risk-taking situations,
Econometrica, 19: 404-437.
Arrow, K. J. (1953), Hurwicz's optimality criterion for decision making under ignorance,
Technical Report 6, Stanford University.
Arrow, K. J. and Hahn, F. H. (1999), Notes on sequence economies, transaction costs and
uncertainty, Journal of Economic Theory, 86: 203-218.
Arrow, K. and Hurwicz, L. (1972), Decision making under ignorance, in C. F. Carter and J.
L. Ford (eds.), Uncertainty and Expectations in Economics. Essays in Honour of G. L.
S. Shackle. Oxford: Basil Blackwell.
Barrett, C. R. and Pattanaik, P. K. (1993), Decision making under complete uncertainty, in
S. Sen (ed.), Uncertainty and Risk in Economic Life. Aldershot: Edward Elgar.
Basili, M. (2001), Knightian uncertainty in financial markets: a critical assessment,
Economic Notes.
Binmore, K. and Brandenburger, A. (1990), Common knowledge and game theory, in K.
Binmore, Essays on the Foundations of Game Theory. Oxford: Blackwell.
Bowman, M. J., ed. (1958), Expectations, Uncertainty and Business Behaviour. New York:
Social Science Research Council.
Camerer, C. and Weber, M. (1992), Recent developments in modelling preferences:
uncertainty and ambiguity, Journal of Risk and Uncertainty, 5: 325-370.
Camerer, C. (1995), Individual decision making, in J. Kagel and A. E. Roth (eds.),
Handbook of Experimental Economics. Princeton: Princeton University Press.
Carabelli, A. (1988), On Keynes's Method. London: Macmillan.
Carabelli, A. (2002), Speculation and reasonableness: a non-Bayesian theory of rationality,
in S. Dow and J. Hillard (eds.), Keynes, Uncertainty and the Global Economy.
Aldershot: Edward Elgar.
Chateauneuf, A. (1991), On the use of capacities in modelling uncertainty aversion and risk
aversion, Journal of Mathematical Economics, 20: 343-369.
Choquet, G. (1954), Théorie des capacités, Annales de l'Institut Fourier, 5: 131-233.
Dempster, A. P. (1967), Upper and lower probabilities induced by a multivalued mapping,
Annals of Mathematical Statistics, 38: 325-339.
de Finetti, B. (1937), Foresight: its logical laws, its subjective sources, in H. E. Kyburg and
H. E. Smokler (eds.), Studies in Subjective Probability. New York: Wiley, 1964.
de Finetti, B. (1974), Theory of Probability. New York: Wiley.
Dow, J. and Werlang, S. R. da Costa (1992), Uncertainty aversion, risk aversion, and the
optimal choice of portfolio, Econometrica, 60: 197-204.
Earl, P. (1983), The Economic Imagination. Towards a Behavioural Analysis of Choice.
Brighton: Harvester.
Earl, P. E. and Frowen, S. F. (2000), Introduction, in P. E. Earl and S. F. Frowen, eds.,
Economics as an Art of Thought. Essays in memory of G. L. S. Shackle. London:
Routledge.
Edwards, W. (1958), Note on Potential Surprise and Nonadditive Subjective Probability, in
M. J. Bowman (ed.), Expectations, Uncertainty and Business Behaviour. New York:
Social Science Research Council.
Eichberger, J. and Kelsey, D. (2000), Non-additive beliefs and strategic equilibria, Games
and Economic Behaviour, 30: 183-215.