
Decision-Making, Bias, and Influence

Clarissa Cortland, PhD


Some Review
1) T/F: On average, individual pay incentives
lead to decreases in productivity,
satisfaction, and motivation. F
2) Match the motivation theory to the
term/phrase that best describes it:

A. Expectancy Theory
B. Goal-Setting Theory
C. Equity Theory
D. Needs Theory
E. Two-Factor Theory

a. Perceptions of fairness
b. In the absence of something, we seek it out
c. The power of beliefs
d. What motivates us is not necessarily the same as what demotivates us
Some Review
3) I believe that my morning OB class is smarter
than my afternoon class. When grades are
finalized at the end of the term, I find that my
morning class on average performed better than
my afternoon class. What is this called?
a. overjustification effect
b. stereotype threat
c. expectation theory of confidence
d. self-fulfilling prophecy
e. overconfidence
Decision-Making, Bias, and Influence
Decision-Making & Biases
A Rational Analysis Model of Decision-Making

Identify criteria → Articulate options → Weight options by criteria → Pick highest-scoring option

Which Masters programme should I attend?

Decision weights: Ranking .3 · Cost .2 · Location .1 · Faculty .4

School      Ranking   Cost   Location   Faculty   Total
UCL SOM
School 2
School 3
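The weight-and-score step can be sketched as a small weighted decision matrix. The criterion weights below match the slide (.3/.2/.1/.4); the per-school scores are invented purely for illustration.

```python
# Weighted decision matrix: score each option on each criterion,
# multiply by the criterion weight, and pick the highest total.
# Weights match the slide; per-school scores are made up.

weights = {"ranking": 0.3, "cost": 0.2, "location": 0.1, "faculty": 0.4}

options = {
    "UCL SOM":  {"ranking": 9, "cost": 5, "location": 8, "faculty": 9},
    "School 2": {"ranking": 7, "cost": 8, "location": 6, "faculty": 6},
    "School 3": {"ranking": 6, "cost": 9, "location": 7, "faculty": 5},
}

def total_score(scores):
    """Weighted sum of criterion scores."""
    return sum(weights[c] * s for c, s in scores.items())

best = max(options, key=lambda name: total_score(options[name]))
for name, scores in options.items():
    print(f"{name}: {total_score(scores):.1f}")
print("Pick:", best)
```

With these invented scores, UCL SOM wins because the heavily weighted faculty criterion (.4) dominates the cheaper schools' advantages.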
Limits of the Rational Analysis Model

Identify criteria → Articulate options → Weight options by criteria → Pick highest-scoring option

Bounded Rationality Model of Decision-Making:

• Bounded by uncertainty and ambiguity


• Rational analysis model requires a lot of information, and
we often don’t have access to it all.
• Bounded by imperfections of the mind
• Limited by the systematic biases that taint the way we
gather, process, and evaluate information
• Instead of choosing the best option, we accept the
first alternative that meets a minimum threshold (“satisficing”).
Simon, 1957
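Simon's "first acceptable alternative" strategy, known as satisficing, can be sketched as a loop that stops at the first option clearing a threshold. The flats and budget below are invented for illustration.

```python
# Satisficing (Simon, 1957): rather than scoring every option and
# picking the best, accept the first option that clears a minimum
# threshold. Options and threshold below are invented.

def satisfice(options, is_good_enough):
    """Return the first option meeting the threshold, or None."""
    for option in options:
        if is_good_enough(option):
            return option
    return None

# Example: take the first flat within budget, in viewing order.
flats = [{"name": "Flat A", "rent": 1400},
         {"name": "Flat B", "rent": 1100},
         {"name": "Flat C", "rent": 950}]

choice = satisfice(flats, lambda f: f["rent"] <= 1200)
print(choice["name"])  # Flat B is accepted; cheaper Flat C is never viewed
```

Note how the search order matters: a rational-analysis chooser would pick Flat C, but the satisficer stops at Flat B.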
Decision-Making Biases
• A cognitive shortcut
• “Experts are usually correct”
• “Price signals quality”
• “More is better”

• Can be adaptive and functional


• Used to simplify complex
information

• But often over-applied


• This leads to decision-making
errors, or biases
Bias #1: Confirmation Bias
• Tendency to seek only confirming evidence,
while failing to seek disconfirming evidence.
• We see what we expect to see.

• Biases how people:


• search for information
• interpret information
• remember information

Kahneman, Tversky, & Slovic, 1982


Confirmation Bias: A Demo

• The series of numbers below follows a certain rule:

• Guess the rule: Give me a series of three numbers
and I’ll tell you if they follow the rule or not.
• You can guess the rule at any time.
Heath et al., 1998
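The slide's number sequence is not reproduced here, but the classic version of this demo is Wason's 2-4-6 task, whose hidden rule is "any strictly increasing sequence." A sketch assuming that rule shows why confirming tests fail:

```python
# Wason's 2-4-6 task (assumed classic rule: any strictly increasing
# triple). Confirmation bias: guessers who hypothesise "add 2 each
# time" tend to test only triples that fit their hypothesis, so
# they never discover the rule is far broader.

def follows_rule(triple):
    """True if the triple is strictly increasing."""
    a, b, c = triple
    return a < b < c

# Confirming tests only; all pass, so "add 2" looks right:
print(follows_rule((2, 4, 6)))     # True
print(follows_rule((10, 12, 14)))  # True

# A disconfirming-style test exposes the broader rule:
print(follows_rule((1, 7, 50)))    # True, so "add 2" was never the rule
```

The diagnostic move is to propose a triple your hypothesis says should *fail*; if it passes, your hypothesis is wrong.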
Confirmation Bias: Potential Solutions

• Seek disconfirming information.
• Bring in an outsider.
• Ask diagnostic rather than confirming questions.

Heath et al., 1998


Bias #2: Anchoring
• What percentage of countries in the UN are
African?
• Each participant spun a roulette wheel and
received either a 10 or a 65
• Is the % higher or lower than their roulette #?
• Mean estimated % of UN countries that are African:
• Roulette = 10 → 25%
• Roulette = 65 → 45%
• These arbitrary values substantially impacted estimates, even when people were paid to be more accurate!
Tversky & Kahneman, 1974
Bias #2: Anchoring
• People fixate on an initial anchor and then
adjust from it
• Estimates made in the presence of an anchor are
too close to that anchor
• Adjustments are often insufficient; we typically
consider reasons why the anchor is reasonable
• Effects often beyond conscious awareness
• Warning: expertise is no cure!
• Anchors sway decisions even when they are
completely irrelevant to them
Anchoring: Potential Defenses

• Awareness
• Collect many different
estimates
• Don’t let a single estimate
have an outsized impact on
your decision-making.
• Take a step back and
re-anchor
Bias #3: Overconfidence
• Unwarranted faith in our own perceptions and
judgments
• People consistently overrate their own abilities,
knowledge, and skill
• Better-Than-Average Effect
• Dunning-Kruger Effect: people who are the least
knowledgeable are the most overconfident
• Planning fallacy: people tend to underestimate how
long they will need to complete a task and how much
it will cost
Svenson, 1981; Buehler, Griffin, & Ross, 1994; Alicke & Govorun, 2005; Dunning, 2011; Zell et al., 2020
Overconfidence: Potential Solutions

• Awareness
• Keep objective records
• Recognize that more information is not
necessarily better information
• (confirmation bias, anyone?!)
• Seek disconfirming
information
Bias #4: Availability
• A tendency to base judgments on readily
available information
• People use ease of recall as a cue for likelihood
• Affected by vividness, primacy, or recency
• Frequent events are easier to
recall than rare events
• Easily recalled cases unduly
influence our judgments
• Awareness and training are key
Kahneman et al., 1982; Risen & Critcher, 2011
Bias #5: Escalation of Commitment
• Also known as the sunk cost fallacy
• Sunk costs often influence our decision to continue with a
failing course of action
• Escalation of commitment:
• Resources committed to initial course of action
• Does not produce desired return
• Commit more resources to “turn things around”
• Cost of failure increases
• People tend to allocate more money to failing
projects & divisions than successful ones – esp.
when they are personally responsible for the original
investment decision
Staw & Ross, 1989
Reducing Escalation of Commitment

• Awareness
• Focus on the future, not the past
• Separate initial decision-makers from
decision evaluators
• Hold people accountable for decision
processes, not outcomes
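"Focus on the future, not the past" amounts to dropping sunk costs from the calculation and comparing only future costs and payoffs. A toy expected-value sketch, with all figures invented:

```python
# Sunk costs are irrelevant to a forward-looking decision: compare
# only the *future* costs and payoffs of each option. All figures
# below are invented for illustration.

sunk = 500_000  # already spent on the failing project; it never enters the maths

def expected_future_value(extra_cost, payoff, p_success):
    """Expected future payoff minus further cost to be incurred."""
    return p_success * payoff - extra_cost

continue_project = expected_future_value(extra_cost=200_000,
                                         payoff=300_000,
                                         p_success=0.3)
abandon = expected_future_value(extra_cost=0, payoff=0, p_success=0.0)

# continue: 0.3 * 300k - 200k = -110k; abandon: 0
best = "abandon" if abandon > continue_project else "continue"
print(best)  # abandon, despite the 500k already sunk
```

Escalation of commitment is, in effect, letting `sunk` leak into the comparison.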
Bias #6: Loss Aversion
• Strong preference for avoiding losses over
acquiring gains
• Some studies suggest that losses are twice
as psychologically powerful as gains!
• e.g., losing £50 vs. gaining £50
• praise vs. criticism
• people overwhelmingly go with the default
• Because of loss aversion, people are more
risk-seeking when faced with a loss

Kahneman & Tversky, 1979
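The "losses are roughly twice as powerful" claim is often modelled with the prospect-theory value function. This sketch uses the loss-aversion coefficient λ = 2.25 and curvature α = 0.88 estimated in Tversky & Kahneman's later (1992) work; those parameter values are an assumption, not figures from this slide.

```python
# Prospect-theory value function: gains are valued concavely,
# losses convexly and scaled by a loss-aversion coefficient.
# alpha=0.88 and lam=2.25 are the commonly cited 1992 estimates,
# used here only to illustrate "losses loom larger than gains".

def value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

# Losing 50 pounds hurts more than gaining 50 pounds feels good:
print(value(50))    # ~31.3
print(value(-50))   # ~-70.3
print(abs(value(-50)) / value(50))  # ~2.25
```

The ratio of the loss's sting to the gain's pleasure is exactly λ here, which is where the "about twice as powerful" rule of thumb comes from.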


Loss Aversion Case Study: Stocks
• Do investors hold losing stocks longer than
winning stocks?
• Tracked 10k brokerage accounts from 1987-1993, incl.
162,948 trades
• In any one year...
• What share of losing stocks were sold? → 9.8%
• What share of gaining stocks were sold? → 14.8%
• Average 252-day return after...
• winners sold → +2.35% (better than market)
• losing stocks held → −1.06% (worse than market)

Odean, 1998
Endowment Effect
• People place substantially more value on
things they own
• The loss of an object is perceived as more
negative than the gain of the same object is
positive
• Acquiring something is seen as a gain, giving
it up as a loss
• Reframing the same option as a loss
changes our decisions!

Carmon & Ariely, 2000


Solutions to Loss Aversion

• Awareness

• Try reframing to highlight any potential gain
of a transaction
• Put loss into perspective
• Ask what the worst possible outcome could be
Case Study: Gourmet Food
Marketing consultant William Chan’s new client is a gourmet food company that
has a long history in the community and a widely recognized local brand. Its
products are available at area delis, higher-end groceries, and some restaurants.

Before hiring William’s firm, the company had introduced a new line of frozen
foods that wasn’t selling well. William was convinced that marketing could help,
and managed to persuade the client to let him try.

First, William tried a series of magazine ads. When this approach failed and the
money was spent and gone, he decided to negotiate with grocery stores for more
visible product placement. By offering stores monetary incentives, he was
eventually successful – but this strategy also failed to boost sales. The client asked
for a formal recommendation about how to proceed.

Instead of reassessing his original plan, William’s report proposed a series of
expensive TV ads. But the client had had enough, and ended the relationship.
Decision-Making Checklist

• Be aware that biases exist.


• Allow enough time to evaluate information
thoroughly.
• Focus on situations and contexts (not just
individuals).
• Focus on broad samples of data (not just
single experiences).
• Present information in multiple ways –
avoid framing effects.
• Minimize situations in which people’s egos
are at stake.
Influence and Persuasion
Influence & Persuasion in Organisations

• Unavoidable in the
workplace
• Sometimes leads to
discomfort, but...
• Influence can be used for
good (e.g., nudging
healthier behaviours)
• It’s important to avoid
being influenced for bad
(e.g., cults)
Two Routes to Persuasion
• Central Route = logical, rational, conscious
• Used for rational influence
• Better arguments are always more persuasive
• Strategies: present both sides of issue; demonstrate lack of
self-interest; aim for incremental – not absolute – change in
belief; beware of forewarning your audience

• Peripheral Route = emotional, indirect, normative, usually not conscious
• Used for boosting your message
• Six widely known tactics/strategies (Cialdini)
Petty & Cacioppo, 1984
Persuasion Tactic #1: Reciprocity

• We comply more after we’ve received something;
we repay the “giver.”
• Why it works: universal norm of turn-taking and
exchange
• Can be resources, services, or even concessions
• Beware: could be perceived
as brown-nosing or bribing
• Defense: consider refusing
initial gift; reframe to see gift
as a sales ploy

Cialdini et al., 1975; Strohmetz et al., 2002


Persuasion Tactic #2: Liking

• We comply more with requests from people we like
(and who seem to like us).
• Why it works: it’s beneficial to trust the people we
like
• Compliments, ingratiation, similarity, attractiveness
• Beware: similar to reciprocity,
could be seen as brown-nosing
• Defense: separate the person
from the request

Cialdini, 1975
Persuasion Tactic #3: Commitment &
Consistency

• We comply more with positions that seem consistent
with our prior commitments/behaviours.
• Why it works: norms of consistency; easier to fall
back on previous positions
• Commitments need to be public,
effortful, and internally motivated
• Beware: moral licensing
• Previous good/moral behaviour makes people more likely
to engage in bad/immoral behaviour
• Defense: recognize sunk costs; consistency for
consistency’s sake isn’t always wise
Freedman & Fraser, 1966; Cialdini, 1975; Monin & Miller, 2001
Persuasion Tactic #4: Authority

• We comply with people who appear to be authorities
or have expertise.
• Why it works: authorities are often right; facts or
expertise often indicate knowledge
• Signals of authority = titles, clothing, cars
• Beware: dark side of authority
• can get high compliance for
unethical behavior
• Defense: ask yourself whether
the person is a real expert
(Cartoon: “Trust me, I’m a DOCTOR”)

Lefkowitz et al., 1955; Milgram, 1963; Doob & Gross, 1968; Deaux, 1971; Cialdini, 1975
Persuasion Tactic #5: Social Proof

• People often do what others are doing.


• Why it works: the widespread thing is often the right
thing – norms, information, etc.
• Similarity to reference group is key; more
compliance when uncertainty is high
• Beware: people don’t like to feel
manipulated
• Defense: probe for validity; even if
evidence is valid, not necessary to
comply with the crowd

Cialdini, 1975; Goldstein et al., 2008


Persuasion Tactic #6: Scarcity

• We tend to prefer something that is limited,
disappearing, or unavailable (in high demand).
• Why it works: lack of availability signals higher
quality; people don’t like having their choice limited
• Along with actual goods, even information about
scarcity works
• Beware: don’t want people
to feel too helpless
• Defense: use panic as a cue
to reconsider why you need
what’s offered
Cialdini, 1975; Knishinsky, 1982
Ethical Issues in Influence & Persuasion

• Ethically defensible
• Using peripheral route techniques to supplement a strong
argument that the central route would also support

• Ethically dubious
• Using peripheral route techniques to try to distract the central
route from a weak argument it wouldn’t support
• Also unlikely to work in the long run

• Test: Would your audience be upset if they knew about
your use of peripheral-route persuasion techniques?
Hiring Exercise
An activity where you will work to make a
hiring decision in a group

• You will be randomly assigned to a role:


• Senior VP of Marketing
• Senior VP of Human Resources
• Senior VP of Operations
• Senior VP of Sales
• Senior VP & General Counsel
Common Information Effect
Teams tend to discuss information that all
group members already know, and to ignore
information that is unique to individual team
members.
Why Do Group Members
Fail to Share Information?

1. Differences in underlying goals
2. Assumption that others have the same information
3. Feeling uncomfortable about speaking up
4. Uncovering common information confirms its importance and relevance
Hiring Exercise Takeaways
• Information is often not shared because
groups decide on the first acceptable
alternative, instead of looking for
the best alternative.
• People will often distort
information to fit with their
preferences.
• The structure of group decision-
making processes can prevent
groups from considering all the
information that is available.
Hiring Exercise Takeaways, cont.
• Suggestions for reducing the withholding of
information:
• Encourage each person on team/group to give all
info. before anyone votes or indicates preference.
• Focus team goal on making a decision that’s best
for the organisation, rather than on having their
own candidate “win.”
• Set a norm that team members are open to other
points of view.
• Provide reinforcement of this norm when unique info. is
shared.
• Have a “last chance” meeting – is this really the
best decision?
