Decisions Matter
■ Betting on a "Sure Thing"
Bernard Madoff, former NASDAQ chairman and respected financier and philanthropist, claimed to offer clients modest but surprisingly steady returns
➔
Attractive to many, especially non-profit organizations
In 2009, admitted that his entire business was a scam
➔
Wall Street Journal estimated total loss of over $26 billion to hundreds of clients
➔
Scam was a "classic," but went uncaught for almost 20 years
Decisions Matter
■ Extreme Management
In late 1911, Robert F. Scott led a five-person expedition to the South Pole (racing against Roald Amundsen)
Numerous questionable command decisions by Scott
➔
Reliance on ponies and experimental motorized sledges
➔
Disdain of dogs (thought unreliable and "ignoble")
➔
Substantially insufficient food/resources
➔
Unwillingness to listen to others
All five men perished. A supply depot was 11 miles away.
The COVID-19 Pandemic
■ Weighing the odds
COVID-19 pandemic forced businesses
and governments around the world to
weigh the risks of infection against the
costs of shutting down businesses
➔
Unstopped, could have resulted in approximately 3 million deaths in the US (based on Thomas et al., 2020; Verity et al., 2020), plus large numbers with chronic injuries; deaths alone imply an economic cost of over $21 trillion (at $7 million per death; Cutler and Summers, 2020)
➔
US economic costs from shutdowns/interventions (not all policy-driven) estimated at $3-5 trillion (Walmsley et al., 2020)
■ Poor decision making costs time, money, jobs, and even lives
■ In business contexts, decisions can make the difference between success and failure
■ Doing things "right" means knowing how – and why – we so often do things "wrong"
A Descriptive Lesson: Predicting Decisions
Example: Value at Risk and Bank Capital Requirements
■ In making lending decisions, banks must weigh the value of interest income against the risk carried
Beliefs: estimates of default probability, opportunity costs, future
capital availability, estimates of capital demand
Values: risk/return trade-off, discount rate
Choice set: loan terms (exposure, maturity, interest rate, etc.)
States of the world: possible income streams (including potential
for firm failure)
■ Likewise, those investing in banks must consider
exposure
■ Regulating exposure: capital requirements and the Basel
Accords
Agreements determine how risk is assessed, communicated
Rules for rating risks set context for future investment decisions
Domains in the Study of Decision Making
■ Judgment
What do we believe? How do we learn?
■ Choice
What do we value? How do we choose?
■ Strategic Behavior
How do we respond to other decision
makers?
Decisions in Business Contexts
■ Accounting/forecasting
■ Consumer behavior
■ Human resources
■ Task performance
■ Managerial decision making/strategy
Class Overview
■ Three Components:
Part 1: Models of Rational and
Irrational Decision Making Behavior
Part 2: Pathologies in Individual
Decision Making
Part 3: Decision Making in Social
Contexts
■ Approximately nine lectures, one exam per component
Requirements and Grading
■ Assignments
Lecture
Reading
Discussion sections
■ Evaluation
Three exams [3x30% = 90%]
Decision making exercises [4x2.5% =
10%]
■ See syllabus and web site for timing
information and related policies
Discussion Sections
■ TAs
Iuliia Puchko Wilson (ipuchkow@uci.edu)
Alexander Murray-Watters (amurrayw@uci.edu)
Camille Samuels (csamuel1@uci.edu)
Saverio Roscigno (sroscign@uci.edu)
■ Activities
Review/discuss lecture and readings
Answer additional questions from lecture
Prepare for exams
Final Notes
■ Remember to check the web site for
syllabus and updates
(https://canvas.eee.uci.edu/courses/40401)
■ Use business.decisions.logistics@gmail.com if you encounter technical issues
■ Discussion sections start in week 2
Sections are held in person at their assigned
times and rooms
Contact your TA if you have section-related
questions
Rational Decision Making: Myths
and Confusions
Business Decisions
SOC 138/ECO 148
Back to Decisions
■ Decision Maker
Beliefs
Values
■ Context
Choice Set
States of the World
(Diagram: options A and B lead to outcomes 1-4, with branch probabilities 50%/50% and 90%/10%)
What Do We Mean by a “Rational” Decision?
Myth: Rationality is “Cold” or “Emotionless”
From Myth to Reality
■ Rational decision making stresses consistency with values and beliefs
■ All decisions or questions of judgment are "fair game"
■ Context, subject matter, or emotional states do not render decisions intrinsically rational or irrational
Confusion: Normative and Descriptive Models
Example: Planning, Policy, Measurement, and Forecasting
■ Normative or Descriptive?
Basel Capital Accord
Annual sales forecast
S&P 500 index
Mission statement
Investment plan
Sharpe ratio
■ How could each of these be used in
an alternate role?
Avoiding Confusion
■ Determine if a given use is normative or descriptive
Are we making prescriptions, or
predictions?
■ Keeping our “is” out of our “ought”
We do not have to view real behavior
as optimal or desirable
Our ideas of optimal or desirable
behavior do not have to reflect
common practice
Summary and Prospect
■ Folk notions of "rational" behavior differ
from modern rational choice theory
■ Principles of rational decision making can
be used either descriptively or
prescriptively
Both are important, but shouldn't be confused!
■ Up next: the theory itself (in three parts)
Axioms of rational choice
Principles of rational judgment
Game theory
The Axioms of Rational Choice
Business Decisions
SOC 138/ECO 148
Reminder: Goals of a Rational Decision System
An Axiomatic Development
■ “Axiom”: a property or constraint which must
be satisfied
■ Axiomatic development (indirect definition)
Pick constraints (axioms) which are necessary
conditions for rational behavior
Find a system which satisfies those constraints (if
any)
Behavior generated by such a system will then
satisfy the initial constraints – it implements the
axioms
■ Applied to choice by von Neumann and Morgenstern, Savage, and others
Notation and Basic Concepts
■ Alternatives
S signifies the set of alternatives, with outcomes A, B, etc.
■ The preference relation
A ≽ B signifies "A is preferred or indifferent to B," or "B is not preferred to A"
A = B signifies indifference between A and B
■ The lottery
(ApB) signifies a gamble (or "lottery") in which A occurs with probability p, and B occurs with probability 1-p
Axiom 1: Comparability
■ Given alternatives A and B, either A ≽ B, B ≽ A, or both, in which case A = B
■ Interpretation: All pairs of options can be
compared; you either prefer one to the other,
or else you are indifferent between them
■ Pathology avoided: Inability to make a
prediction or judgment of any kind (even of
indifference) for some sets
Axiom 2: Transitivity
■ If A ≽ B and B ≽ C, then A ≽ C
■ Interpretation: Preferences can’t have
“loops”; values must form a hierarchy
■ Pathology avoided: The “money pump”;
also, manipulation via agenda setting
(the Condorcet problem)
The "Money Pump": A How-To
Imagine that "Bob" has preferences A>B, B>C, but C>A...
Here's how to take all of his money:
1. Give Bob C (you keep A and B).
2. Since B>C, Bob will pay a small fee to trade his C for your B.
3. Since A>B, Bob will pay a small fee to trade his B for your A.
4. Since C>A, Bob will pay a small fee to trade his A for your C.
5. Bob now holds C again, minus three fees; return to step (2).
Axiom 4: Distribution of Probability Across Alternatives
(Frame Invariance)
■ ((ApB)qB) = (ApqB)
■ Interpretation: Only the “final” probability
distribution over states of the world matters –
re-expressing a choice in terms of compound
lotteries will not change your decision
■ Pathology avoided: Manipulation by framing
effects
Axiom 5: Independence
■ For options A, B, C, A ≽ B if and only if (ApC) ≽ (BpC)
■ Interpretation: Adding an equal chance of getting an irrelevant third outcome instead of either A or B shouldn't change your decision
■ Pathology avoided: Manipulation by framing effects (e.g., pseudocertainty), "(weak) stochastic money pump"
Taking More of "Bob's" Money
Imagine that "Bob" has preferences (ApC)>(BpC), but B>A...
Here's how to take more of his money:
1. Give Bob (BpC).
2. Since (ApC)>(BpC), Bob will pay a small fee to trade his (BpC) for your (ApC).
3. Let the lottery resolve; if C occurs, Bob ends up where he would have anyway (minus a fee).
4. If instead Bob ends up with A, offer to trade your B for his A plus a small fee (he accepts, since B>A).
5. Either way, Bob has paid to end up no better off than he started; repeat as desired.
Yet More Money from "Bob"
Imagine that "Bob" has preferences (ApB)>A>B...
Here's how to take whatever money "Bob"
has left after the previous axioms:
1. Give him A (you have B).
2. Offer to give him a chance to win B from you
(i.e., (ApB)) in exchange for A and a small fee.
3. Let the lottery resolve; if he ends up with A (you
have B), return to (2).
4. If instead he ends up with B (you have A), offer
to trade it for your A and a small fee.
5. Go back to step (2) and take more of his money!
Repeat as needed.
(This is like the first money pump, but it uses a random ("stochastic")
process; we call it "strong" because it can be repeated.)
Axiom 7: Solvability
■ If A ≽ B ≽ C, there exists a probability p such that B = (ApC)
■ Interpretation: If you prefer A to B to C, there is
some lottery over A and C which is equivalent
to B
■ Pathology avoided: “Obsession with the
infinitely improbable”; indirect violation of
completeness (e.g., if two alternatives are
unbounded)
The Form of Expected Utility Theory
■ A model which satisfies the axioms: Subjective
Expected Utility Theory (or just EUT)
■ Model structure
Utility function: an abstract numerical measure of
preference
Subjective probabilities: beliefs about states of the
world, given choices
■ Decision procedure
For each option, take the (probability) weighted average of outcome utilities
Choose the option with the highest expected (i.e., weighted mean) utility
Example of EUT in Action
A Geometric View of Risk Aversion
(Figure: concave utility function u(x), plotted for payoffs x from $0 to $40; the curve's concavity implies risk aversion)
Another Example – the WhirliLoop Dilemma
Invest in new infrastructure, or not?
• Current facility results in $200,000 profit per quarter
• New facility costs $100,000
• If demand for WhirliLoops spikes (30% chance), new facility would
allow for estimated quarterly profit of $600,000
• Without new facility, could only obtain $300,000
• Only interested in next quarter (you plan on retiring!)
(Decision tree: Maintain vs. Invest, each with a 30% demand-spike branch and a 70% no-spike branch)
Generality of the Framework
■ A surprising result: any model satisfying the
axioms will be equivalent to EUT
In fact, only transitivity, completeness, continuity,
and independence are required
Further, the utility function itself can be specified up
to a positive affine transformation
■ Implications:
EUT can be considered a “generic” or canonical
model of rational choice (as defined by the axioms)
Any alternative model must lead to one or more
pathologies of choice
Still Ahead….
■ The judgment side: probability, and
rational judgment
■ Strategic decision making
■ Descriptive models
Rational Judgment
Business Decisions
SOC 138/ECO 148
Manifesto for Rational Judgment
■ Judgments must be logically
consistent
■ If a conclusion can be reached in
more than one way, then every
possible way must lead to the same
result
■ We must always take into account all
of the evidence at our disposal
■ The same evidence must always
lead to the same judgment
Reasoning About Uncertainty
■ Long understood to be of importance
"Probability is the very guide of life." –J. Butler, 1896
■ Deeply connected with “common
sense”…
“Probability theory is nothing but common
sense reduced to calculation.” --P. Laplace,
1819
■ …but, as we shall see, Monsieur Laplace
exaggerates
From Mystery to Probability
■ Rules of evidence
Roman and Talmudic witness laws
Medieval “half-proof”
■ Early insurance pricing
Athenian "maritime loans", c. 340 BCE
Ulpian's annuity tables
■ Games of chance
Gerolamo Cardano (1540s) – inveterate gambler
The Chevalier de Méré (1650s) – first mathematical theory of dice (followed by Fermat, Pascal, and others)
What is Probability?
■ Probability
is a mathematical
representation of uncertainty
Probability theory is a formal system
for reasoning about uncertainty
■ Various interpretations
Classical
Frequentist
Subjective
Basic Concepts and Notation
■ The "universe of possibilities" (denoted S)
Events (denoted A, B, C, etc.)
Negation (denoted ¬A)
The null event (denoted ∅, = ¬S)
Conjunction (AB) and disjunction (A or B)
Conditioning (A|B)
■ The probability measure (denoted p)
p(A) is the probability of A
The “Rules” of Probability
■ 0 ≤ p(A) ≤ 1 for all events A in S
Probabilities are real numbers between 0 and 1
■ p(S) = 1, p(∅) = 0
Something must happen, and nothing can't happen
■ p(AB) = 0 iff p(A or B) = p(A) + p(B)
If A and B are mutually exclusive, the probability that at least one happens is the sum of their probabilities
More “Rules”
■ p(A) + p(¬A) = 1
Something either happens, or it doesn't
(Can derive from previous rules)
■ p(AB) = p(A|B)p(B)
The probability of A and B is the same as the probability of B times the probability of A given B
Bayes’ Theorem
■ The most important result in the
theory of inference!
■ p(B|A)=p(A|B)p(B)/p(A)
The probability of B given A is the "share" of the probability of A taken up by the probability of A and B together
Why the Excitement?
■ Bayes’ Theorem allows us to derive
probabilities of hypotheses given
data from probabilities of data given
hypotheses
■ For hypothesis H and data D:
p(H|D) = p(D|H)p(H) / [p(D|H)p(H) + p(D|¬H)p(¬H)]
Interpreting the Theorem
■ Each component of Bayes' Theorem has a specific meaning:
p(H|D) = p(D|H)p(H) / [p(D|H)p(H) + p(D|¬H)p(¬H)]
p(D|H) is the probability of the data given H; p(H) is the probability of H before seeing the data
Example #1: Detecting a Cheat
■ Problem: p(Coin is two-headed)
Assume fair/2H are equiprobable
Three flips yield 3 heads in a row
■ Solution:
p(C|HHH) = p(HHH|C)p(C) / [p(HHH|C)p(C) + p(HHH|¬C)p(¬C)]
= (1 × 0.5) / (1 × 0.5 + 0.125 × 0.5)
≈ 88.9%
Example #2: Bankruptcy
■ Problem: p(Mr. X defaults)
Assume a base default rate of 0.01
Mr. X is observed to have a monthly obligation of $X
60% of defaulters have a monthly obligation of $X,
versus only 10% of non-defaulters
■ Solution:
p(D|O_X) = p(O_X|D)p(D) / [p(O_X|D)p(D) + p(O_X|¬D)p(¬D)]
= (0.6 × 0.01) / (0.6 × 0.01 + 0.1 × 0.99)
≈ 5.7%
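Both worked examples run through the same arithmetic, so a single helper covers them. A minimal sketch (function and variable names are our own):

```python
def posterior(p_d_given_h, p_h, p_d_given_not_h):
    """Bayes' Theorem: p(H|D) = p(D|H)p(H) / [p(D|H)p(H) + p(D|~H)p(~H)]."""
    numerator = p_d_given_h * p_h
    return numerator / (numerator + p_d_given_not_h * (1 - p_h))

# Example 1: three heads in a row from a suspected two-headed coin.
p_cheat = posterior(1.0, 0.5, 0.125)

# Example 2: Mr. X's default risk, given his monthly obligation.
p_default = posterior(0.60, 0.01, 0.10)

print(round(p_cheat, 3), round(p_default, 3))  # 0.889 0.057
```

Note how the low base rate in Example 2 keeps the posterior small even though defaulters are six times more likely to carry the obligation.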
Summary
■ Probability theory supplies a basis for rational
judgment
■ Bayes’ Theorem tells us how to combine past
experience (priors) and new evidence
(likelihood) to reach conclusions (posteriors)
Recite it to yourself every day; it builds character
Decision Making in Strategic
Contexts
Business Decisions
SOC 138/ECO 148
Life is But a Game....
■ Thus far, we have been concerned with choices
and judgments
“Decision” as central concept
■ “Games”: formal models of strategic situations
Basic Elements
➔
Players
➔
Actions
➔
Strategies
➔
Payoffs/outcomes
Complications
➔
“Chance” as a player
➔
Information sets
Game Theory and Strategic Decision Making
Simultaneous Games
■ Games in which players choose without observing others' choices
Not observing others doesn't mean ignoring them, however
Represented in matrix form:

                     Player I
                     Cheat      Not
Player II   Report   (-9, 1)    (-1, -1)
            Not      (9, 0)     (0, 0)
(Payoffs listed as Player I, Player II)

■ Various solution concepts
Dominant strategies
Nash equilibrium
Dominant Strategies
■ Strategies which are at least as good as
any other strategy, no matter what one’s
opponents do
Weakly dominant: at least as good
Strongly dominant: strictly better
■ Should always be used, where possible
Where available, can ignore opponent’s
choices
Dominant strategies do not always exist
Procedure: deletion of dominated strategies
Example – Overzealous Enforcement
                     Player I
                     Cheat      Not
Player II   Report   (-9, 5)    (-1, 1)
            Not      (9, 0)     (0, 0)
(Payoffs listed as Player I, Player II)
Best Responses
■ Given opponents’ strategy choices, the
best response is one which maximizes
expected payoff
Contingent on opponents’ strategies
Dominant strategies are best responses, if they exist
■ Best response may be a mixed strategy
Do one thing sometimes, sometimes another
Choose randomly, given optimal weights
“Clamor in the East, attack in the West” –Stratagem 6
Example – Adoption of New Technologies
                      DevelCo
                      C++        Java
SupportCo   C++       (1, 1)     (-1, -1)
            Java      (-1, -1)   (2, 2)
(Payoffs listed as DevelCo, SupportCo)
Equilibria
■ Multiple decision makers create the
problem of strategic reasoning
My behavior depends on your behavior, which depends on my behavior, which depends on your behavior...
➔
"In war, the will is directed at an animate object that reacts" -Clausewitz
Seek sets of strategies which are “stable”
➔
Infinite regress should not change behavior
➔
Rational players should not have incentives to
deviate
■ Strategy sets which are stable in this way
are called “equilibria”
Many kinds – we consider only two here
Equilibrium in Dominant Strategies
■ Where dominant strategies exist, they
should be used…
■ …so, if each player has a dominant
strategy, this provides a stable strategy set
■ Equilibrium in dominant strategies
Usually unique, but may not be (if multiple weakly
dominant strategies exist)
Not often found
Very stable – not easily perturbed
Always subgame perfect
Hypothetical Example - OPEC
■ Cartel problem
Increased production decreases price/barrel
Temptation to expand
■ Dominant strategies
Kuwait: produce more
SA: produce less
■ Result: Kuwait expands, SA holds back
Why? SA has larger share!

                          Kuwaiti Prod.
                          1 mB/d        2 mB/d
Saudi Prod.   8 mB/d      (3, 24)       (5, 20)
              9 mB/d      (2.5, 22.5)   (4.4, 19.8)
(Payoffs listed as Kuwait, SA; production in mB/d, payoffs in 10m$/d)
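The dominance argument can be checked mechanically. A sketch, using the payoffs as read off the slide's matrix (that reading is our reconstruction):

```python
# (Saudi production, Kuwaiti production) -> payoff, in $10m/day.
kuwait = {(8, 1): 3.0, (8, 2): 5.0, (9, 1): 2.5, (9, 2): 4.4}
saudi = {(8, 1): 24.0, (8, 2): 20.0, (9, 1): 22.5, (9, 2): 19.8}

def dominant_kuwait(payoffs):
    """Kuwait's strictly dominant production level, if one exists."""
    for k, other in ((1, 2), (2, 1)):
        if all(payoffs[(s, k)] > payoffs[(s, other)] for s in (8, 9)):
            return k
    return None

def dominant_saudi(payoffs):
    """Saudi Arabia's strictly dominant production level, if one exists."""
    for s, other in ((8, 9), (9, 8)):
        if all(payoffs[(s, k)] > payoffs[(other, k)] for k in (1, 2)):
            return s
    return None

print(dominant_kuwait(kuwait), dominant_saudi(saudi))  # 2 8
```

Each player's dominant strategy is independent of the other's choice, which is why the outcome (Kuwait expands, SA holds back) is so stable.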
Nash Equilibrium
■ Where equilibria in dominant strategies
fail, what other options are there?
Look for sets of mutual best responses
■ Nash equilibrium
Many may exist
May be in pure or mixed strategies
Can be unstable – problem of equilibrium
selection
Dominant and subgame perfect equilibria are
Nash, but the reverse is not always true
Hypothetical Example – Tobacco Advertising
■ Advertising problem
Advertising is expensive
Can draw customers from a competitor
■ Nash equilibria
One stands firm, one yields
Mixing it up
■ Result: unstable situation
Multiple equilibria
Expect frequent changes in strategy choice

                          Philip Morris Adv.
                          L             H
Liggett Adv.   L          (0.8, 1.2)    (0.7, 1.4)
               H          (1.0, 1.1)    (0.6, 0.9)
(Payoffs listed as Liggett, Philip Morris; in 1b$/y)
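Pure-strategy Nash equilibria can be found by brute force: keep every cell from which neither player wants to deviate. A sketch with the payoffs as we read them off the slide:

```python
# (Liggett ad level, Philip Morris ad level) -> (Liggett, PM) payoffs in $1b/y.
payoffs = {
    ("L", "L"): (0.8, 1.2), ("L", "H"): (0.7, 1.4),
    ("H", "L"): (1.0, 1.1), ("H", "H"): (0.6, 0.9),
}

def pure_nash(payoffs):
    """Strategy pairs where neither player gains by deviating unilaterally."""
    flip = {"L": "H", "H": "L"}
    equilibria = []
    for (lig, pm), (u_lig, u_pm) in payoffs.items():
        if (u_lig >= payoffs[(flip[lig], pm)][0]
                and u_pm >= payoffs[(lig, flip[pm])][1]):
            equilibria.append((lig, pm))
    return equilibria

print(pure_nash(payoffs))  # [('L', 'H'), ('H', 'L')]
```

Two equilibria, each with one firm yielding, give exactly the instability and equilibrium-selection problem the slide describes.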
A Heuristic Summary
■ Assess the game
Determine players, actions, payoffs
Determine form (sequential or simultaneous)
■ D&N’s Heuristics
“Look forward, reason backward”
“If you’ve got a dominant strategy, use it”
“Eliminate dominated strategies from consideration”
“Look for an equilibrium”
■ Final inspiration
“Study strategy over the years and achieve the spirit
of the warrior. Today is victory over yourself of
yesterday; tomorrow is your victory over lesser men.”
–Musashi
Bounded Rationality, Search,
and Satisficing
Business Decisions
SOC 138/ECO 148
The Frailty of the Flesh
■ Rational choice implies substantial cognitive
capacity
Must consider all possible options
Must consider all possible outcomes
Must weight probabilities/values perfectly for all of
the above
In strategic contexts, must model opponents’
behavior, too!
■ Perhaps we ask too much?
Limits on memory, cognitive processing capacity
Cannot conceive of all possibilities, much less compare them optimally
Bounded Rationality
■ An alternative to perfect rationality:
assume decision makers are rational
within specified limits
Primarily descriptive, although some
normative uses
■ Many kinds of limits can be considered
Memory
Forward-thinking (myopia)
Strategic thinking
Heuristics
■ Heuristic: a general guideline or “rule of
thumb” which usually works well (but
which is not optimal)
■ How well each rule works depends on the
environment
Particular heuristics are better for particular situations
Can be especially bad in the wrong context
Examples
■ Measuring length via thumb, forearm,
paces
■ Avoiding odd-looking people in dark
alleys
■ Buying from established firms (branding)
■ Hiring based on recommendation
■ “Folk” diversification rules (e.g., always
have 5 stocks, half stock/half bond mix)
■ Investing based on personal experience
Heuristic Search
■ A metaphor: “All the world’s
a supermarket”
Many options, with multiple
features of interest
➔
“Search space”
Can only see a few at a time
Must search through the
options for the one which is
most preferred
➔
“Search strategies”
■ Want to understand the
implications of different
search strategies for choice
The Conjunctive Rule
■ Procedure:
Choose a set of requirements each option must meet
Reject any option which fails one or more requirements
Pick (possibly randomly) from the remainder
■ Properties:
Close to optimal if filtering conditions are stringent
enough
Not very useful if conditions are too loose
Can get “stuck” if no option satisfies conditions
Related to the principle of satisficing (which we will see
presently)
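The conjunctive procedure is a filter followed by a pick. A minimal sketch with hypothetical shopping data:

```python
import random

def conjunctive_choice(options, requirements):
    """Keep options meeting ALL requirements; pick randomly from the rest."""
    survivors = [o for o in options if all(req(o) for req in requirements)]
    return random.choice(survivors) if survivors else None  # None: "stuck"

# Hypothetical laptops and requirements.
laptops = [
    {"name": "A", "price": 900, "battery_hours": 12},
    {"name": "B", "price": 1400, "battery_hours": 9},
    {"name": "C", "price": 700, "battery_hours": 6},
]
reqs = [lambda o: o["price"] <= 1000, lambda o: o["battery_hours"] >= 8]
print(conjunctive_choice(laptops, reqs))  # only "A" passes both filters
```

Returning None when nothing survives is the "stuck" case noted above; loosening a requirement is the usual escape.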
The Disjunctive Rule
■ Procedure:
Choose a set of requirements each option must
meet
Reject any option which fails all of the requirements
Pick (possibly randomly) from the remainder
■ Properties:
Close to optimal if options are similar except for one
feature (and filter is tough enough)
Often over-inclusive
Can easily lead to acceptance of skewed options
Lexicographic Search/Elimination by Aspects
■ Procedure:
Rate aspects by importance
Compare/eliminate options by most important
aspect (for EBA, pick aspect semi-randomly)
If still not through, go to the next aspect
■ Properties:
Close to optimal if one aspect is vastly more
important than all others
Can get into trouble if strong trade-offs exist
Example – Portfolio Selection
                   Portfolio A   Portfolio B   Portfolio C
Expected Return:   $1500         $900          $3500
Sharpe Ratio:      1.5           3             0.75
Social:            2 out of 5    4 out of 5    1 out of 5
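Lexicographic search can be applied directly to the portfolio numbers above; the aspect orderings tried below are illustrative assumptions, not part of the slide:

```python
# Portfolio data from the slide; higher is better on every aspect here.
portfolios = {
    "A": {"return": 1500, "sharpe": 1.5, "social": 2},
    "B": {"return": 900, "sharpe": 3.0, "social": 4},
    "C": {"return": 3500, "sharpe": 0.75, "social": 1},
}

def lexicographic(options, aspects):
    """Eliminate by the most important aspect; break ties with the next."""
    remaining = dict(options)
    for aspect in aspects:
        best = max(v[aspect] for v in remaining.values())
        remaining = {k: v for k, v in remaining.items() if v[aspect] == best}
        if len(remaining) == 1:
            break
    return sorted(remaining)

print(lexicographic(portfolios, ["return", "sharpe", "social"]))  # ['C']
print(lexicographic(portfolios, ["sharpe", "return", "social"]))  # ['B']
```

The winner flips with the aspect ranked first: exactly the trade-off trouble the heuristic can run into.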
Satisficing
■ Very general heuristic: search until you find
something which is “good enough,” then stop
■ Strength – relatively undemanding
Need definition of “good enough,” ability to check
options against the definition
■ Weakness – you may miss out on a good thing
■ Close to optimal when search costs are high,
and/or when most “acceptable” options are of
similar value
Example – Hiring by Interview
(Illustration: candidates 1-6 interviewed in sequence)
Win-Stay Lose-Shift
■ A simple satisficing heuristic for strategic play
When you are winning, keep the same strategy
When you are losing, try something else
■ Requires minimal processing power, works
moderately well in many situations
Don't tend to keep doing things which don't work; tend to spend most of your time with strategies which (more or less) work
In long run, will avoid dominated strategies, seek dominant ones
Not so good where "clever" behavior (like mixing) is required
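A minimal sketch of the heuristic; the aspiration level that defines "winning" is an assumed parameter, not something the slide specifies:

```python
import random

def wsls(prev_strategy, prev_payoff, strategies, aspiration=0):
    """Win-stay, lose-shift: keep the strategy if the last payoff met
    the aspiration level; otherwise switch to a random alternative."""
    if prev_payoff >= aspiration:
        return prev_strategy
    return random.choice([s for s in strategies if s != prev_strategy])

print(wsls("cooperate", 3, ["cooperate", "defect"]))  # cooperate (a win)
```

Against an opponent who should be answered with a mixed strategy, this rule is exploitable, which is the weakness noted above.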
Example – Good/Bad Traders
Summary
■ Bounded rationality: a compromise
between genius and stupidity
Heuristics instead of optimal procedures
Search metaphor
■ Different heuristics have different
strengths/weaknesses
Some aren’t bad, but they must be matched
to the environment
Can result in systematic distortion (e.g., eye-
level bias)
Loss Aversion and Framing
Effects
Business Decisions
SOC 138/ECO 148
A Less Rational Decision Theory
■ With bounded rationality, persons
were taken to be rational within
limits
■ The problem runs deeper, alas
Empirical studies by Allais, Ellsberg,
Kahneman, Tversky, and many others
suggested that people choose in
fundamentally irrational ways
■A new imperative: can we model this
behavior?
A Detour Through Psychophysics
■ Study of human response to
physical stimuli
Perception of color,
light/darkness, sound, etc.
■ Stylized fact – perceptions
occur in terms of changes,
rather than absolute values
■ Weber-Fechner law
The “just-noticeable”
difference in any stimulus is
proportional to the magnitude
of the stimulus
Example: Noise Levels
(Figure: the volume increase needed for the same perceived change grows with the base level)
Reference Values
■ Psychophysical principle for decision making:
values are assessed in terms of changes,
rather than absolute states of the world
■ Each person has a reference value for their
current state
Decisions are evaluated based on changes from the
reference point
Contrast with EUT, which focuses only on end
states
Results in irrational behavior, as we shall see
Diminishing Sensitivity to Gains/Losses
■ Application of Weber-Fechner law
Sensitivity to gains/losses diminishes as gain/loss grows
Requires "S-shaped" value curve (value plotted against payoff)
■ Implies proportional pricing effects
Example 1: would drive across town to save $20 on a vacuum cleaner, but not on a car
Example 2: difference between losing nothing and $10 seems greater than between losing $10,000 and $10,010
Gain/Loss Frames
■ A problem: we can change gains into losses by
changing the problem frame!
The same choice between states of the world can be
put in terms of what you could lose (assuming the
gains) or what you could gain (assuming the losses)
Violates the principle of frame invariance
■ S-curve implication: risk averse when framed as gain, risk-loving for loss
Sure gain of $240 vs 25% chance of gaining $1000
Sure loss of $750 vs 75% chance of losing $1000
46% chose A,B (about twice as many as the next most popular option); 59% inconsistent
Example – The Layoff Problem
"Due to a revenue shortfall, a CEO faces a dilemma. Firm analysts say that required plant closings will force the layoff of 600 workers for certain, unless one of two plans is chosen. If the CEO chooses the first plan, 200 workers' jobs can be saved [400 workers' jobs will be lost]. If the CEO chooses the second plan, there's a one-third chance that 600 workers' jobs will be saved [no jobs will be lost], and a two-thirds chance that none will be saved [600 workers' jobs will be lost]. Which plan do you think the CEO should choose?"
First (safe) plan: 84% [52%] Second (risky) plan: 16% [48%]
Implication: Honoring Sunk Costs
■ When we fail to update our reference values,
we may think of resources already spent as
still being “ours to lose”
Violates future orientation
■ “Completing Tennessee-Tombigbee is not a
waste of taxpayers’ dollars. Terminating the
project at this late stage of development would,
however, represent a serious waste of funds
already invested.” –Sen James Sasser, Nov.
1984
(Too bad the project was estimated to be worth less
than the cost of completion!)
Implication: The Endowment Effect
■ Once we are “endowed” with an object, we
update our reference value
Giving it up is now a loss – and the object is now
worth more than it was before!
Kahneman, Knetsch, and Thaler's magically more marvelous mugs
■ Evidence suggests that the effect grows with
time
No clear endpoint
May explain obsession with old trinkets
Example – Negotiations
■ Consider negotiations between Mr. X and Ms. Y
Mr. X has apples, Ms. Y has oranges
■ An obstacle to trade: the endowment effect
That which you trade away is a loss – and hence
more valuable
That which you receive is a gain – and hence
(relatively) less valuable
But this is true for both parties!
■ What can be done?
Distance both sides from their endowments; “put
everything on the table”
Discuss dividing the resources, not direct exchange
Frame the breakdown of talks as a loss
Tying it All Together: Prospect Theory
■ Descriptive model
Choice among prospects [(x1,p1),(x2,p2),...,(xn,pn)]
■ Three elements
Reference value
Loss-averse value function (v(x))
Probability weighting function (π(p))
■ Choose prospect which maximizes
∑_{i=1}^n v(x_i)·π(p_i)
(Figures: S-shaped value function over payoffs; inverse-S decision weight π(p) over probabilities)
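A sketch of the prospect-value calculation. The slide does not give functional forms, so the ones below (and the parameters alpha = 0.88, lambda = 2.25, gamma = 0.61) follow Tversky and Kahneman's 1992 estimates:

```python
def v(x, alpha=0.88, lam=2.25):
    """Loss-averse value function: concave for gains, steeper for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def pi(p, gamma=0.61):
    """Inverse-S probability weighting function."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def prospect_value(prospect):
    """Sum of v(x_i) * pi(p_i) over (outcome, probability) pairs."""
    return sum(v(x) * pi(p) for x, p in prospect)

# The sure $240 gain vs. the 25% shot at $1000 from the framing examples:
sure = prospect_value([(240, 1.0)])
gamble = prospect_value([(1000, 0.25), (0, 0.75)])
print(round(sure, 1), round(gamble, 1))
```

Under these parameters the two prospects score within a few points of each other, so modest parameter shifts can flip the predicted choice.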
Prospect Theory vs. EUT
■ EUT looks to the end state, PT thinks about the
present
Gains and losses may lead to the same place, but
PT cares about how we get there
■ EUT is frame-invariant, unlike PT
Can lead to transitivity, comparability violations
■ PT differentially weights probabilities
Can lead to independence violations
Overweights certain small probabilities,
underweights moderate ones
■ EUT has a normative foundation, PT is
intended to describe real behavior
Coming Soon…
■ More on the weighting function
Censoring effects
■ Forecasting and scenario thinking
■ Risk perception
■ And, finally, the dread exam
Advice from prospect theory: endow
yourself with a good grade, and then
consider the chance of losing it!
Censoring Effects,
Forecasting, and Scenario
Thinking
Business Decisions
SOC 138/ECO 148
Prospect Theory, Redux
■ Descriptive model
Choice among prospects [(x1,p1),(x2,p2),...,(xn,pn)]
■ Three elements
Reference value
Loss-averse value function (v(x))
Probability weighting function (π(p))
■ Choose prospect which maximizes
∑_{i=1}^n v(x_i)·π(p_i)
(Figures: S-shaped value function over payoffs; inverse-S decision weight π(p) over probabilities)
Prospect Theory and the Decision Weighting Function
■ In EUT, outcomes are weighted by p; in PT, π(p) fills that role
■ Some features:
Underweighting of moderate/large probabilities
Overweighting of small probabilities
Censoring of extremes
(Figure: decision weight π(p) vs. probability p, an inverse-S curve around the diagonal)
Censoring the Extremes
■ For probabilities very close to 0/1, weights are treated as 0/1
■ Intuition: we cannot conceive of extremes, "round" them to nearest whole number
■ Problem: where extremes matter, judgment is impaired
(Figure: close-up of the weighting function near p = 0, where weights drop to zero)
Conjunction Conundrum
■ The intersection (conjunction) of many highly
probable events is difficult to evaluate
For probabilities close to 1, PT rounds up
Even below this point, hard to calculate intuitively
■ Problem: many scenarios involve conjunctions
Plans with multiple elements
Strategic contingencies
Mechanisms with many parts
Example – Uptime Problem
"A machine is composed of seven parts. Suppose each part in the
machine works 90% of the time. If the machine works as long as
all seven parts are working, what percent of the time will the
machine be working?"
Correct answer: 0.9^7 ≈ 47.8%
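The conjunction can be computed directly:

```python
p_part = 0.90
p_machine = p_part ** 7  # all seven parts must work simultaneously
print(round(p_machine, 3))  # 0.478 -- far below the intuitive "about 90%"
```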
The Difficulties of Disjunction
■ The probability of an event happening at least
once (disjunction) is hard, too
Disjunction = conjunction of non-events
For probabilities close to 0, PT rounds down
Same problem of intuitive calculation
■ Problem: risk evaluation often involves
disjunction
Chance of event happening within time span
Chance of at least one failure occurring
Example – Bankruptcy Risk
"Assume that a firm, StableCo, faces a 0.4% chance of filing for
bankruptcy per month. Over the course of 25 years, what is the
probability that StableCo will file for bankruptcy?"
Correct answer:
p(BR in 25y) = 1 - p(¬BR in 25y)
= 1 - p(¬BR in 1mo)^(25×12)    (300 monthly chances?!?)
= 1 - (1 - p(BR in 1mo))^300
= 1 - (1 - 0.004)^300
= 1 - 0.996^300
≈ 70%
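The disjunction from the slide, in code:

```python
p_month = 0.004
months = 25 * 12  # 300 monthly "trials"
p_bankrupt = 1 - (1 - p_month) ** months
print(round(p_bankrupt, 2))  # 0.7
```

A 0.4% monthly risk compounds to roughly a 70% chance over 25 years, which is why small repeated risks are so easy to underestimate.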
Thinking with Stories
■ In considering complex situations, we fall
back on causal narratives (“stories”)
Reflect mental models of phenomenon at
hand
■ Elements of a story
Actors
Events
“Reasons”
■ Stories without these elements typically
seem less plausible
The “Plausibility” Bias
■ Flashback from probability: p(AB)≤p(A)
Thus, adding elements to a story cannot make it more
probable (conjunctive fallacy)
■ Thinking with “reasons” and “explanations”
More reasons/explanations provide a clearer picture
of how the story might come about
Conjunctive reasoning is difficult, makes complex
stories seem more probable
■ “Plausibility” bias
Stories with more elements (particularly elements
which are individually probable) are seen as more
probable
Example – PittSteel
"PittSteel is a rust-belt steel company which has fallen on hard times. 87%
of similar firms in the region have filed for bankruptcy in the last five years,
and two of the firm's five largest customers have recently shifted to
alternative suppliers. After a recent earnings report, PittSteel's credit was
downgraded, and its stock price has fallen by more than 40% over the past
three months."
• PittSteel will implement massive job cuts within the next four
quarters, leading temporarily to record profits, but will be forced
into Chapter 11 bankruptcy protection when revenues continue
to slump. [92%]
(p(profits)≥p(profits AND job cuts AND bankruptcy), but it doesn't
always seem that way....)
Further Problems from Hindsight
■ Coherent stories are easier to remember
than complex collections of events
An incoherent set of circumstances may be
forgotten; cases which fit our sense of
“plausibility” more likely to be remembered
“Profusion of special cases”
➔
Each previous account seems to be unique,
generalization difficult
■ “Creeping determinism”
Memory biased in direction of current knowledge
Past events seem more certain in retrospect
Implications for Forecasting
■ Systematic biases in scenario evaluation
Risks of event occurrence will be underestimated
Probability of plan success will be overestimated
Scenarios which make “good stories” will be seen
as more probable (even when they aren’t)
■ Difficulties in learning
Past uncertainties will seem reduced
Present knowledge will color recollection
The “this time will be different” effect
13
Putting it All Together
■ Reasoning about “chains” of events is hard
Censoring of extremes
Difficulty of intuiting pˣ
■ Instead, we think with stories
Elements linked by “explanations”
Plausibility related to the strength, number of causal
explanations
■ Produces problems for realistic forecasting
■ Next time: risk perception
14
Risk Perception
Business Decisions
SOC 138/ECO 148
1
What is “Risk”?
■ Conventionally: a chance of a negative
outcome
I am “at risk” if something bad might happen to me
■ More formally: uncertainty in payoffs
I am “at risk” if I’m not sure what will happen to me
■ Contrast the notions by considering sure
losses and unsure gains
Sure losses are “risky” by lay usage, and unsure
gains are not
Sure losses have no risk in the formal sense, and
unsure gains are risky
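A toy payoff comparison (numbers invented) makes the contrast concrete: a sure loss has zero variance, so it carries no risk in the formal sense, while an unsure gain does.

```python
from statistics import mean, pvariance

sure_loss = [-100, -100, -100, -100]   # lose $100 no matter what
unsure_gain = [0, 50, 100, 250]        # four equally likely positive outcomes

# Lay usage: the sure loss feels "risky" because the outcome is bad.
# Formal usage: risk is payoff uncertainty, measured here by variance.
print("sure loss:  ", mean(sure_loss), pvariance(sure_loss))      # variance is 0
print("unsure gain:", mean(unsure_gain), pvariance(unsure_gain))  # variance > 0
```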
2
Objective Factors
■ For a rational decision maker, risk is a function
of payoff uncertainty
■ While uncertainty is “subjective” in a strict
sense, it depends upon objective information
Set of potential outcomes
Base rates in the relevant population
Supplemental knowledge regarding problem
■ Rational perception of risk must also follow
rules of rational judgment, EUT
Update probabilities using Bayes’ Theorem
Weight outcomes by the axioms
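As a sketch of the Bayesian-updating step, consider a lender revising a borrower's default probability after observing a missed payment; every number here is hypothetical.

```python
# Bayes' Theorem: p(default | signal) =
#     p(signal | default) * p(default) / p(signal)
p_default = 0.05               # prior: base rate among similar borrowers
p_signal_given_default = 0.60  # missed payments are common before default
p_signal_given_ok = 0.10       # ...but also occur among healthy borrowers

p_signal = (p_signal_given_default * p_default
            + p_signal_given_ok * (1 - p_default))
posterior = p_signal_given_default * p_default / p_signal
print(f"posterior default probability: {posterior:.3f}")  # 0.240
```

A weak prior (5%) combined with a moderately diagnostic signal yields a posterior of 24%: well above the base rate, but far from certainty.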
3
Example – Portfolio Selection
■ Standard financial risk measure: standard
deviation
$$\mathrm{StdDev}(x) = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2}$$
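The same sample (n−1) formula can be written directly in code; the two return series below are made up for illustration.

```python
from math import sqrt

def std_dev(x):
    """Sample standard deviation: sqrt((1/(n-1)) * sum((x_i - mean)**2))."""
    n = len(x)
    m = sum(x) / n
    return sqrt(sum((xi - m) ** 2 for xi in x) / (n - 1))

# Hypothetical monthly returns for two portfolios
steady = [0.010, 0.012, 0.009, 0.011]
volatile = [0.10, -0.08, 0.15, -0.05]
print(std_dev(steady), std_dev(volatile))  # the volatile portfolio is "riskier"
```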
4
Risk Perception
■ Risk as a lay concept
Not simply variance in payoffs
■ Affected by cognitive, environmental
factors
Differential reporting of certain events
Differential saliency, ability to recall from
memory
Loss aversion
Decision weights and censoring
5
Example: Perceived Death Rates
■ Death rates provide a basic background set
of risks for a population
■ It is difficult to weight risks appropriately if
you do not even know the rates
■ Knowledge of death rates heavily skewed
Differential media coverage (“newsworthy” death)
Differential visibility (“silent” death)
Differential saliency (“representative” death)
➔
Coherent causal accounts, low variability in details
Confusion of morbidity with mortality (“non-death”)
6
The Slovic et al. Three Factor Model
■ An alternative to variance: descriptive model of
risk perception based on event features
Subjects rate risks
Ratings analyzed using factor analysis
■ Risk perception of non-experts employs three
distinct concepts
Not simply an approximation to variance
Combination of factor scores determines overall
perceived risk
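A minimal sketch of the factor-analytic idea, using principal components (eigendecomposition of the correlation matrix) as a stand-in for a full factor analysis; the hazards, attributes, and ratings are all invented and are not Slovic et al.'s data.

```python
import numpy as np

# Hypothetical mean ratings (rows: hazards, columns: rated attributes)
#                   dread  control  known  catastrophic
ratings = np.array([
    [6.5, 2.0, 5.0, 6.8],   # nuclear power
    [2.1, 6.0, 6.5, 2.0],   # tobacco
    [2.0, 5.5, 6.0, 1.5],   # x-rays
    [6.0, 2.5, 4.0, 6.0],   # terrorism
    [3.0, 5.0, 6.2, 2.5],   # car wrecks
])

# A few large eigenvalues of the attribute correlation matrix indicate
# that the ratings collapse onto a small number of latent dimensions.
corr = np.corrcoef(ratings, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()
print("share of variance per component:", np.round(explained, 2))
```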
9
Factor 1: “Dread”
■ Most powerful latent dimension: “dread risk”
“Perceived lack of control, dread, catastrophic
potential, fatal consequences, and the inequitable
distribution of risks and benefits” (Slovic)
■ Events which are dramatic, and which we
cannot control, are seen as riskier
Airplane crashes, terrorism, food poisoning
■ “Mundane” events are less likely to generate
dread, even if they kill more people
Tobacco, car wrecks, falls and other household
accidents
10
Factor 2: “Knowability”
■ Next latent dimension: “unknowable risk”
“Judged to be unobservable, unknown, new, and
delayed in their manifestation of harm” (Slovic)
■ Events which are thought to be poorly
understood are seen as riskier
Carcinogen scares, “psycho kids” and spree
shooters, genetic engineering
■ Events which are seen as well-known seem
safer
Coal plants, x-rays
11
Factor 3: “Scale”
■ Third dimension: scale or exposure
Total number of persons affected by or exposed to
the risk
■ Something which affects only a small minority
is perceived as less risky
AIDS during the early/mid 1980s
■ Threats which are omnipresent are seen as
more hazardous
Radon, Alar scares
12
Examples
■ Nuclear power: rated high dread,
unknowable/uncontrollable, potentially affects
many people
■ Tobacco: rated low dread, easily controllable,
potentially affects many people
Compare concern over second-hand smoke with
first-hand smoke
■ X-rays: rated low dread, unknown, affects few
people
13
More Examples
14
Interpreting Events – Preexisting Views and Risk Updating
15
Example
■ Plous showed various (real) near-misses to
subjects with established pro/anti military leanings
US nuclear missile launch initiated by accident
Russian missile accidentally re-targeted to Alaska
Phony US Coast Guard advisory claiming that WWIII had
been declared
Russian naval alert falsely claiming war with the US
■ Elicited interpretations, implications
■ Responses followed preexisting biases
Pro-military subjects felt reassured, saw cases as
proving that the system worked
Anti-military subjects felt alarmed, saw cases as proving
that the system is prone to failure
16
Summary
■ Normatively, risk is a shorthand for payoff uncertainty
■ Perceptions of risk do not follow normative models
We fear the catastrophic, the mysterious, and the omnipresent
■ Understanding of objective factors relating to risk is
also distorted
Biased information leads to biased estimation
■ Our interpretation of risk-related events depends on
our base risk perception
Tend to interpret events in line with initial beliefs
Circularly use interpretation as evidence
■ Speaking of risk…Exam 1 approaches! Don’t dread it!
17