The 30 Minute Expert Series: Thinking, Fast and Slow in 30 Minutes (The Expert Guide)


Copyright © 2013 by Garamond Press. Garamond Press is an imprint of Callisto Media, Inc.

A NOTE TO THE READER: You should purchase and read the book that has been reviewed. This
book is meant to accompany a reading of the reviewed book and is neither intended nor offered as
a substitute for the reviewed book.

This review is unofficial and unauthorized. This book is not authorized, approved, licensed, or
endorsed by Daniel Kahneman or Farrar, Straus and Giroux.

Garamond Press and the Garamond Press logo are trademarks of Callisto Media, Inc. and may
not be used without written permission. All other trademarks are the property of their respective
owners. Unless explicitly stated otherwise, Garamond Press is not associated with any individual,
organization, or product mentioned in this book.

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any
form or by any means, except as permitted in accordance with the 1976 United States Copyright
Act, without the prior written permission of the publisher. Requests to the publisher for
permission should be sent to permissions@callistomedia.com.

For general information on our other products and services or to obtain technical support, please
contact our Customer Care Department at info@callistomedia.com.

The publisher makes no representations or warranties with respect to the accuracy or
completeness of the contents of this work and specifically disclaims all warranties, including,
without limitation, warranties of fitness for a particular purpose. No warranty may be created or
extended by sales or promotional materials. The advice and strategies contained herein may not
be suitable for every situation. This work is sold with the understanding that the publisher is not
engaged in rendering medical, legal, or other professional advice or services. If professional
assistance is required, the services of a competent professional should be sought. The publisher
shall not be liable for damages arising herefrom.

The fact that an individual, organization, or website is referred to in this work in a citation and/or
as a potential source of further information does not mean that the publisher endorses the
information the individual, organization, or website may provide or recommendations they/it may
make. Further, readers should be aware that websites listed in this work may have changed or
disappeared between when this work was written and when it is read.

ISBN: 978-1-62315-131-7 Print | 978-1-62315-132-4 eBook


Contents

Introduction: At a Glance

Understanding Thinking, Fast and Slow


About the Book
About the Author
Critical Reception
Synopsis

Key Concepts of Thinking, Fast and Slow


I. System 1 and System 2
II. Humans and Econs
III. The Remembering Self and the Experiencing Self

Conclusion: A Final Word

Key Terms

Recommended Reading

Bibliography
INTRODUCTION

At a Glance

This book is an extended review of Thinking, Fast and Slow, by Nobel Prize
laureate Daniel Kahneman. This internationally best-selling book is the
culmination of five decades of work by a man whom many reviewers
have called the leading psychologist in the world today. Part intellectual
memoir and much more, the book is intended for a mainstream audience
and aims to provide a vocabulary for talking about judgment and
decision making.
This review begins with a brief presentation of the book and its
author. You’ll learn about the origins of Thinking, Fast and Slow, and you’ll
come away with an initial impression of how critics and readers have
responded to the book. You’ll also learn about Daniel Kahneman’s life
and work.
Next comes a short digest of readers’ responses to the book—the
good and the not so good, from professional reviewers as well as from
other interested readers.
The next two sections of the review offer a synopsis of Kahneman’s
book and a detailed discussion of its key concepts. Here you’ll find not
just examples of the key concepts in practice but also ideas for applying
them to your own life and experience. Finally, the main points of this
review are briefly restated, and you’ll be well prepared to get your own
copy of the complete book. Also included is a list of important terms
used in Thinking, Fast and Slow, and recommendations for further reading
about how humans think.
Understanding
Thinking, Fast and Slow

ABOUT THE BOOK


Daniel Kahneman wrote Thinking, Fast and Slow to bring the study of
decision making into common language and to provide the general
reader a vocabulary for diagnosing predictable errors in judgments.
Kahneman traces the origins of Thinking, Fast and Slow to his work
with his friend, the late psychologist Amos Tversky (1937–1996), with
whom he collaborated on work that led to Kahneman’s 2002 Nobel Prize
in Economic Sciences. Kahneman and Tversky met in 1969 and soon
discovered their shared interest in the relationship between intuition and
statistics. In 1974, they jointly published a still widely cited article,
“Judgment Under Uncertainty: Heuristics and Biases,” in the journal
Science. That article demonstrated systematic errors caused by humans’
“machinery of cognition,” which contrasted with the then widely accepted
notion that humans are rational beings whose emotions intrude on
rational thinking. They went on to publish “Prospect Theory: An Analysis
of Decision Under Risk” in 1979 in Econometrica, a study that
became one of the foundations of behavioral economics.
Kahneman remains interested in refining human understanding of
judgment and decision making. In Thinking, Fast and Slow, he looks more
closely at the two “systems” that he claims underlie human thought:
System 1, which is fast, intuitive, and automatic; and System 2, which is
slow, deliberate, and effortful. Published in 2011 (Farrar, Straus and
Giroux), Thinking, Fast and Slow has sold more than one million copies
and has been praised by the Wall Street Journal (“[a] tour de force of
psychological insight”), Bloomberg/Businessweek (“a monumental
achievement”), and the Financial Times (a “masterpiece . . . one of the
greatest and most engaging collections of insights into the human mind I
have ever read”). Thinking, Fast and Slow was selected as one of the best
books of 2011 by the New York Times Book Review, the Globe and Mail,
and the Economist. It was one of the Wall Street Journal’s Best Nonfiction
Books of 2011 and won the Los Angeles Times Book Prize for Current
Interests.

ABOUT THE AUTHOR


Daniel Kahneman is Professor of Psychology and Public Affairs Emeritus
and Senior Scholar at Princeton University’s Woodrow Wilson School. He
is also the Eugene Higgins Professor of Psychology Emeritus at
Princeton, and a fellow of the Center for the Study of Rationality at the
Hebrew University of Jerusalem.
He has held teaching positions in psychology at the University of British
Columbia and at the University of California, Berkeley.
Kahneman was born on March 5, 1934, in Tel Aviv and spent his
childhood in Paris, including during the Nazi occupation. Following his
father’s death in 1948, the family moved to British Mandatory
Palestine just prior to Israel’s independence. He received a BS in
psychology and mathematics from the Hebrew University of Jerusalem in
1954. He then served in the Israeli Army and went on to obtain a PhD in
psychology from the University of California, Berkeley.
In addition to the 2002 Nobel Memorial Prize in Economic Sciences,
Kahneman has been awarded numerous prizes (e.g., the Talcott Parsons
Prize, Thomas Schelling Award for intellectual contribution to public
policy) and honorary degrees from many top centers of learning,
including the University of Michigan, University of Rome, Université de
Paris, Harvard, University of East Anglia, Ben-Gurion University, The New
School, and University of Pennsylvania. In 2011, Foreign Policy magazine
included him on its list of top global thinkers.
Kahneman has also written Attention and Effort (Prentice Hall, 1973),
has edited several volumes of essays, and has published hundreds of
essays and articles in prestigious journals, including Harvard Business
Review, Foreign Policy, American Economic Review, American Psychologist,
and Stanford Law Review.

CRITICAL RECEPTION

The Upside
Thinking, Fast and Slow received glowing reviews in major international
media outlets, including the New York Times, the Financial Times, the
Globe and Mail, the Guardian, the Wall Street Journal, the Washington Post,
the Economist, the Atlantic, Publishers Weekly, and the Chronicle Review.
Several reviewers declared it a landmark book: David Brooks called it “a
major intellectual event” in the New York Times; William Easterly in the
Financial Times called it a “masterpiece” and “one of the greatest and
most engaging collections of insights into the human mind I have read”;
and in the Globe and Mail, Janice Gross Stein wrote, “It is impossible to
exaggerate the importance of Daniel Kahneman’s contribution to the
understanding of the way we think and choose. He stands among the
giants, a weaver of the threads of Charles Darwin, Adam Smith and
Sigmund Freud.”
Michael Lewis, in his lively Vanity Fair feature, called Kahneman “the
world’s most distinguished living psychologist.” It turns out that
Kahneman’s ideas informed Lewis’s book Moneyball (later made into a
film, starring Brad Pitt), without Lewis even realizing it. In Moneyball,
Lewis had written about the Oakland A’s assistant general manager Paul
DePodesta, who had studied behavioral economics at Harvard and,
under General Manager Billy Beane, applied Kahneman’s concepts to
revolutionize scouting in baseball. Lewis came to realize: “When you
wander into the work of Kahneman and Tversky far enough, you come to
find their fingerprints in places you never imagined even existed. . . . It
didn’t take me long to figure out that, in a not so roundabout way,
Kahneman and Tversky had made my baseball story possible.”
On Goodreads.com, 93 percent of those who read the book liked it.
Many of these readers found it “richly rewarding” and appreciated how
Kahneman explains complex ideas in a way the lay reader can
understand. Several noted that his book has forever changed how
they think about thinking.

The Downside
In the Huffington Post, David K. Levine called Thinking, Fast and Slow
“tedious” and “remarkably smug,” and found some historical oversights
(according to Levine, Nobel Prize–winning economist Maurice Allais
discovered utility theory was wrong in 1953, not Kahneman and Tversky in
1979). Levine also faulted Kahneman’s ideas about reference points and
the endowment effect:

“Unfortunately to build a theory based on the reference point is to
build on foundations of sand. . . . [I]f you give people a coffee mug
and ask them how much they will sell it back to you for, they state a
high price, while if you keep the mug and ask how much they will pay
for it, they state a low price. That’s not so surprising: we all know to
buy low and sell high. But researchers are cleverer than that: the
bidding rules are constructed so that you do best by stating your true
value in both cases—hence the conclusion that people like the mug
more if they have it than if they don’t. But do people really have an
‘endowment effect’? Or do they give a confused answer to a
confusing question?”

Some Goodreads.com readers called the book a “monstrous chore,”
“plodding,” and “clunky.” Some had difficulty with Kahneman’s
explanations of mathematical concepts, and detected flaws and
inconsistencies in his experiments. Even Michael Lewis finds
Kahneman’s terminology tough going, but he recognizes this as a trait of
psychologists in general: “The moment the psychologists uncover some
new kink in the human mind, they bestow a strange and forbidding name
on it (‘the availability heuristic’).”
In the New York Review of Books, Freeman Dyson’s mostly favorable
review lamented that Kahneman doesn’t discuss Sigmund Freud (a
lament the Economist shares), philosopher William James, or the role of religion in
understanding human behavior. This is because Dyson holds Kahneman
on par with Freud and James:

“Admirers of Freud and James may hope that the time may come
when they will stand together with Kahneman as three great explorers
of the human psyche, Freud and James as explorers of our deeper
emotions, Kahneman as the explorer of our more humdrum cognitive
processes. But that time has not yet come. Meanwhile, we must be
grateful to Kahneman for giving us in this book a joyful understanding
of the practical side of our personalities.”

Some reviewers questioned the feasibility of applying Kahneman’s
work in society. William Easterly in the Financial Times contended that
Kahneman’s take on “libertarian paternalism” overlooks the fact that the
experts overseeing things like mandatory retirement savings plans “are as
prone to cognitive biases as the rest of us. Those at the top will be overly
confident in their ability to predict the system-wide effects of paternalistic
policy-making.” And Janice Gross Stein in the Globe and Mail questions
the logic of urging “people to take the emotion out of their decision-
making” since emotion is itself part of the experiencing self.
The Economist elegantly notes the flaws and oversights of Thinking,
Fast and Slow within the broader context of Kahneman’s overall
contribution to human thinking:

“Mr. Kahneman does not dwell on the possible evolutionary origins of
our cognitive biases, nor does he devote much time to considering
why some people seem naturally better at avoiding error than others.
Still this book . . . is a profound one. As Copernicus removed the
Earth from the centre of the universe and Darwin knocked humans off
their biological perch, Mr. Kahneman has shown that we are not the
paragons of reason we assume ourselves to be.”
SYNOPSIS
In Thinking, Fast and Slow, Daniel Kahneman looks at the thinking that
shapes judgment and decision making. He hopes to enrich the everyday
language about thinking so that errors of judgment can be discussed,
diagnosed, and reduced more accurately.
Thought, Kahneman explains, has two distinct systems: the fast and
intuitive System 1, and the slow and effortful System 2. Intuitive decision
making can be effective; however, Kahneman highlights situations in
which errors of judgment will occur unless System 2 is called in. Intuitive
reasoning is least reliable with decisions that require predicting the future
and assessing risks.
Decisions involving risk are at the mercy of System 1, which has a
high aversion to loss. Loss aversion is so powerful that it even trumps one’s
desire to gain and can lead to costly decisions, as Kahneman and
Tversky’s prospect theory shows. Prospect theory was an important
discovery not only in psychology, but also in economics, which had
always assumed people were rational and acted in their own self-interest.
Kahneman dubs these fictional personae Econs; he and Tversky believed
humans were much different from Econs, and their transformative
development of prospect theory engendered the field of behavioral
economics.
Beyond economic theory, Kahneman believes decision-making
research has practical applications that can directly impact people’s lives
—such as their sense of well-being. Thinking about well-being needs to
take into account our two selves—the experiencing self and the
remembering self. He developed the U-index to measure experienced
well-being (much more elusive than remembered well-being) in order to
enrich discussions about public policy, medicine, and welfare. Citizens
might be guided toward better decision making and more accurate
judgments, Kahneman suggests, if behavioral economics were applied to
policy making. And, he says, if countries included a measure of well-being
in national statistics, it could help reduce human suffering globally.
On an individual level, Kahneman believes the ability to diagnose
flaws in decision making has immediate personal value. Using an
enriched vocabulary for talking about decision making could help people
identify the patterns in judgment and choice that Kahneman has observed
over the course of his career. For example, calling the positive view audiences have
of handsome, confident speakers the “halo effect” helps people recognize
this intuitive bias when they see it again.
Kahneman’s ideal forum for such talk about decision making and
judgment, which he hopes will catch on, is the office water cooler; he
hopes first to improve how people gossip about others’ decision making,
because it’s easier to recognize others’ errors than one’s own. Ultimately,
he hopes everyone gains a better understanding of errors of judgment
and choice.
Key Concepts of
Thinking, Fast and Slow

Kahneman aligns humans’ two distinct modes of thinking—fast and
slow—with two “agents,” which he calls System 1 and System 2. We
constantly switch between the two when making decisions and
judgments, frequently falling prey to the biases and illusions that define
the difference between humans and Econs—the unified, rational agents
of traditional economics, who consistently act in their own best interest.
Kahneman also identifies two selves—the remembering self and the
experiencing self—when people talk about their lives.

I. SYSTEM 1 AND SYSTEM 2


Two agents handle human thinking: System 1 and System 2. Busy System
1 is fast and intuitive and cannot be turned off; it engages the automatic
mental activities of perception and memory. Sluggish System 2 handles
slow, effortful, deliberate thinking and is lazy. When System 1 presents a
plausible story, System 2 will often pass it through uncritically. One may
believe a rational choice has been made, when in fact it was not.
Kahneman looks mostly at System 1 and explores how it relates and
interacts with System 2.
Although most people self-identify with System 2, which concentrates,
reasons, performs complex mental tasks, makes choices, and is in charge
of self-control, most beliefs and choices actually begin with the automatic
impressions and beliefs of System 1, “the hero of the book.”
System 1 is constantly sorting through feelings and memories to
make suggestions to System 2, the decision maker. Usually, this process
serves one well. However, System 1 tends to have biases, and relies on
the most readily available answers, which can cause judgment errors
System 2 can’t detect. System 2 is too slow to sort through every
decision, and so the two end up compromising. In fact, System 2 can
seem reluctant to exert more effort than necessary to complete tasks. But
there are some tasks that only System 2, “the working mind,” can
perform, such as collecting and analyzing data to make rational decisions.
When System 2 is at work, it shows: the pupils dilate, the heart rate
increases, and attention narrows until one can see little beyond the
required task. These are the physical signs of cognitive strain, a mental
state that makes one literally stop and think, reallocating energy from
other tasks while System 2 performs the work involved in making
choices.
Most of the time System 1 interprets one’s everyday perception of
reality. System 1 builds a model for what is normal in one’s world by
making automatic associations among situations, occurrences, and
outcomes. Over time, these associative links get stronger and help one
evaluate the present and the future. Because of shared references, people
are able to communicate with one another. Surprise occurs when an
event does not fit one’s model’s norms. Reluctant to turn to System 2 for
help, System 1 automatically tries to make the event fit, assigning a
plausible cause to the aberrant item rather than admitting uncertainty
and doubt or consulting statistics—functions only System 2 can handle. However,
when busy or tired, lazy System 2 will not question data that aligns with
prevailing beliefs.
Quick to jump to conclusions, System 1 often falls prey to the halo
effect—accepting a favorable impression and automatically adding more
for cohesion (e.g., System 1 might automatically assume a smiling
cowboy in a white hat is a good guy, despite evidence he’s a maniacal
murderer). It is System 2’s job to curb the halo effect, but because
System 1 can’t determine what data is necessary, it draws conclusions
based on the nearest associative data and readily leaps to a conclusion
that favors its first bias. Kahneman calls this What You See Is All There Is,
or WYSIATI—a pervasive System 1 tendency that induces cognitive ease.
If System 2 is not jolted out of cognitive ease, it will likely endorse the
false conclusion. So thinking fast makes one overconfident in quick
decisions that ignore critical data for the sake of reinforcing a first
impression.
Another System 1 tendency is avoidance. To escape complex
questions (e.g., “How happy are you with your life?”), System 1
substitutes simpler, heuristic questions and answers those instead (e.g.,
“What is my mood right now?”). The heuristic question is adequate,
though imperfect, and so is its answer. It often satisfies System 2 without
further scrutiny. To avoid grappling with numbers, System 1 will seek the
comfort and coherence of the anchoring effect. When one hears a number
or price (e.g., a listing price on a house), it can skew one’s subsequent
estimate. A low anchor will prime us for low estimates and a high anchor
for high estimates, even with experienced estimators (e.g., real estate
agents). System 1 quickly creates a context in which the anchor value is
the correct number, when in fact it was simply the most available.
Because System 2 is lazy, people tend not to invest the effort and
cognitive strain required to analyze data and produce truly rational
decisions. They become susceptible to cognitive illusions. Two pervasive
types of cognitive illusion are the illusion of validity and the illusion of
skill. In the world of finance, traders feel confident in their skills and
abilities, and the financial industry supports this belief. The more
coherent a story, the more confidence people have that it’s valid, even if it
isn’t true. But statistics show that traders’ success at picking stocks is as
random as rolling dice. Even when Kahneman showed the directors of an
investment firm that their traders’ year-to-year results did not correlate—
evidence of luck rather than skill—the firm continued rewarding traders
as if skill were at work. With System 1 processes operating constantly,
automatically, and unconsciously, intuitive decisions are routinely
mistaken for rational ones. A high level of confidence in someone making
forecasts is never a reliable measure of accuracy; in fact, low confidence
could be better, because it might prompt System 2 to step up to the plate.

Examples from Thinking, Fast and Slow


• System 1’s preference for what seems true rather than what is more
likely can fool even those with advanced training in statistics and
probability. Kahneman and his research partner invented a powerful
and controversial illustration of this fact in a fictional character named
Linda. Thirty-one, single, outspoken, and bright, Linda majored in
philosophy, cares about social justice, and participated in antinuclear
demonstrations at college. Kahneman’s research team asked participants
—including students and some professors of probability—whether
Linda was more likely to have become (a) a bank teller, or (b) a bank
teller and feminist. Most picked (b). They were intuitively swayed by the
specific details about Linda and bypassed a basic rule of the logic of
probability: adding specifics can only narrow probability. They committed
the conjunction fallacy, judging the conjunction of two events to be more
probable than one of those events alone. “Feminist bank teller”
seemed so plausible to System 1 that System 2’s capacity for judging
probability wasn’t engaged. (A short numeric sketch after this list makes
the rule concrete.)
• System 1 is constantly making associations, rapidly linking one idea
with another in plausible ways that instruct body and mind throughout
the waking day. If one hears the word eat and someone asks her to fill in
the missing letters in S _ _ P, she is more likely to spell soup than soap.
This is called priming. Eat primes the brain to think about food. Priming
can be a powerful tool; for example, it can affect voting: if a polling
station is in a school, voters are more likely to support increasing
school funding on a ballot. Priming can affect how one behaves toward
others: thoughts of money can make people behave more selfishly. And
it can affect how people react to authority: when reminded of mortality,
people are more accepting of authoritarian ideas, which can seem
reassuring. These behaviors are unconscious. System 2 believes it is an
autonomous agent, but the mind hides its inner workings from itself.
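The conjunction rule behind the Linda problem can be checked with a few
lines of arithmetic. Here is a minimal sketch, not from the book, with
probabilities invented purely for illustration; the closing inequality holds
no matter which numbers one picks.

```python
# A minimal sketch of the conjunction rule in the Linda problem.
# The probabilities below are invented for illustration only.
p_teller = 0.05                  # hypothetical P(Linda is a bank teller)
p_feminist_given_teller = 0.60   # hypothetical P(feminist | bank teller)

# Product rule: P(A and B) = P(A) * P(B given A)
p_both = p_teller * p_feminist_given_teller

print(f"P(bank teller)              = {p_teller:.3f}")   # 0.050
print(f"P(bank teller AND feminist) = {p_both:.3f}")     # 0.030
assert p_both <= p_teller  # true for any choice of probabilities
```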

Applying the Concept


• Stop and think. The next time you’re tempted to send a text message
while driving, think twice. Kahneman’s research shows that it’s possible
to carry out two simple activities at the same time under cognitive ease,
such as driving and humming a tune. However, performing a series of
more complicated tasks—such as picking up a phone, opening an app,
and typing a message—engenders cognitive strain, forcing one to
literally stop and think as System 2 channels energy toward completing
each task.
• Math is not hard. Engaging System 2 is draining. In fact, if a task
becomes too difficult, one’s pupils contract, and System 2 seems to
give up, like an overloaded circuit breaker—or like a kid struggling with
math. You can avoid this overload by breaking down tasks, or math
problems, into multiple easy steps. If your child is struggling with a
problem, simplify it to the point that it would be impossible not to
understand, and then praise her when she gets the right answer. You’ll
help her overcome System 2’s lazy tendencies, and you’ll build
confidence in her abilities.
• Buy what you need. Grocery stores know about anchoring. In-store
promotions are designed to get you to buy more than you might need.
The next time you see “three lemons for one dollar” at the store, go
against the anchor and buy only one, if that’s all you need. You’ll pay
only thirty-three cents for it.

II. HUMANS AND ECONS


Standard economics presupposes that people are rational beings—whom
Kahneman calls Econs—but this does not reflect how humans
actually decide things. Emotions play a much more significant role in
decision making than most people realize, and they hold a particularly
powerful influence over financial choices, especially when it comes to
avoiding loss.
Almost three hundred years ago, scientist Daniel Bernoulli theorized
that people don’t make money decisions based on numeric values, but
rather on the psychological values that result—their utilities. But
Bernoulli’s utility theory, according to Kahneman, is flawed because it
fails to consider a reference point from which to weigh options before
choosing. For example, persons A and B each have five million dollars
today. Yesterday, A had nine million dollars and B had one million dollars.
According to utility theory, they should be equally happy. However, in this
case, obviously A would be miserable to have lost four million dollars in
one day. Person A has a different reference point from person B.
Prospect theory addresses blind spots of utility theory, such as
reference points, but has blind spots of its own: disappointment and
regret. Sometimes money choices are based on whether or not a desired
item is already in one’s possession. If Professor R bought a rare bottle of
wine for thirty-five dollars and Professor T asks to buy it for one hundred
dollars, Professor R has to consider the pain of letting it go—this is called
the endowment effect. If Professor T does not yet own the wine and
wants it, then he has to consider the pleasure of getting it. As illogical as
it may be, people do not perceive the two points equally, but instead, tend
to assign the pain of loss greater weight.
Loss aversion can cause people to expend more effort to avoid loss
than to achieve gains, even when loss is unlikely. Similarly, people often
overweight the probability of rare events in decision making. Perceiving
an unlikely outcome, such as a terrorist attack, with vivid detail can
disrupt one’s ability to calculate its true risk. One then becomes prone to
overestimating its probability. Overweighting probability and
overestimating risk both stem from System 1 thinking. The more vivid a
possible outcome in one’s mind, the more weight the mind applies in
that outcome’s favor, and the less capable System 2 will be of making an
accurate assessment.
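The asymmetry described above, with losses weighing far more than
equivalent gains, can be made concrete with the value function from
Tversky and Kahneman’s later, cumulative formulation of prospect theory.
The sketch below is an illustration rather than code from the book; the
parameter values are the median estimates Tversky and Kahneman
published in 1992.

```python
# Illustrative prospect-theory value function (Tversky & Kahneman, 1992).
# Parameters are their published median estimates, used here as an example.
ALPHA = 0.88   # curvature for gains (diminishing sensitivity)
BETA = 0.88    # curvature for losses
LAMBDA = 2.25  # loss aversion: losses loom roughly twice as large as gains

def value(x: float) -> float:
    """Subjective value of a gain or loss x relative to the reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

print(round(value(100), 1))   # 57.5   -- the pleasure of gaining $100
print(round(value(-100), 1))  # -129.5 -- the greater pain of losing $100
```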
Worries about loss of wealth are not just economic; they are also
emotional. People keep mental accounts of all kinds, which is a form of
narrow framing that can lead to irrational money decisions. For example,
when one sells a stock to free up some cash, he most likely chooses one
that is up over the purchase price. This is the disposition effect. In fact, it
would likely be better to sell a losing stock since the winner might
continue to climb, and the loser is unlikely to turn around. Continuing to
invest in a losing proposition is called the sunk-cost fallacy—not a wise
choice. Sometimes one makes mistakes like these to avoid punishing
oneself with regret later on. Kahneman says a better way to alleviate
regret is to explicitly anticipate it. If the outcome is bad, remembering
that one considered the possibility of regret beforehand will decrease its
effect.
Life usually presents a narrow frame that allows people only single
evaluation choices. But Kahneman says people make more rational
choices when provided a broader, more inclusive frame. For example,
when asked separately to donate to efforts to (a) save endangered
dolphins, or (b) prevent skin cancer in farm workers, people often donate
more to dolphins. However, when presented these choices together, as a
joint evaluation, most people change their minds (making a preference
reversal) and donate more to farm workers. Comparative judgments
require System 2 and are likely to be more stable than single evaluations,
which are susceptible to System 1’s emotions.
System 1 is highly susceptible to emotional framing. If offered two
choices, (a) a 10 percent chance of winning ninety-five dollars and a 90
percent chance of losing five dollars, or (b) pay five dollars for a lottery
ticket that gives 10 percent odds of winning one hundred dollars and a 90
percent chance of winning nothing, most people would choose (b) even
though System 2 should tell them it’s exactly the same as (a). System 2 is
lazy, and without even realizing it, most people can be steered to make
decisions, from moral to financial, according to how they are framed.
People rarely have the chance to discover which of their preferences are
frame bound rather than reality bound. Only by engaging System 2 in
decision making can one escape these emotional biases.
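A quick arithmetic check, a sketch using the dollar amounts from the
passage above, confirms that the two offers are the same gamble in
different frames:

```python
# (a) 10% chance to win $95, 90% chance to lose $5
ev_a = 0.10 * 95 + 0.90 * (-5)

# (b) pay $5 for a ticket with a 10% chance to win $100, else win nothing
ev_b = 0.10 * (100 - 5) + 0.90 * (0 - 5)

print(ev_a, ev_b)  # 5.0 5.0 -- identical net outcomes, identical expected value
```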

Examples from Thinking, Fast and Slow


• Loss aversion can lead to poor decisions. Optimists, on the other hand,
can be too willing to gamble. When individuals and organizations face
decisions involving financial risks, it’s advisable to find the middle
ground between these two extremes. Kahneman recommends
combining the outside view with a risk policy (e.g., never buying
extended warranties). The outside view shifts one’s focus from the
immediate situation to the statistical outcomes of similar situations.
Adopting a risk policy helps the overly cautious through broad framing
—a general strategy that accepts occasional losses alongside
occasional gains. Adjusting one’s stock portfolio quarterly instead of
tinkering with it every day is another example of broad framing.
• Kahneman discusses two common features of intuitive decision
making: the possibility effect—weighting highly unlikely outcomes more
than they deserve, and the certainty effect—giving near-certain
outcomes less weight than their probability justifies. These effects are
features of the fourfold pattern (a numeric sketch of both effects follows
this list):
1. When a gain is probable, the certainty effect makes most people risk
averse. They’ll pay a premium for certainty (e.g., they’ll accept a
settlement in a lawsuit they were likely to win).
2. When a loss is probable and a bad outcome almost certain, the
certainty effect makes people willing to take a risk to avoid it (e.g.,
they’ll refuse to pay a settlement in a lawsuit they’re likely to lose).
3. When a gain is unlikely, the possibility effect kicks in, and most
people are willing to take a risk (e.g., buy a lottery ticket).
4. When a loss is unlikely, the possibility effect now makes people risk
averse, and most will pay to avoid it (e.g., they’ll buy insurance).
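Both effects can be illustrated with the probability weighting function
from Tversky and Kahneman’s cumulative formulation of prospect theory.
The sketch below is illustrative rather than from the book, and the
curvature parameter is their published median estimate for gains.

```python
# Illustrative probability weighting function (Tversky & Kahneman, 1992).
# GAMMA = 0.61 is their median estimate for gains, used here as an example.
GAMMA = 0.61

def weight(p: float) -> float:
    """Decision weight the mind assigns to a stated probability p."""
    return p ** GAMMA / (p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA)

print(round(weight(0.01), 3))  # ~0.055: a 1% chance feels like 5.5% (possibility effect)
print(round(weight(0.99), 3))  # ~0.912: a 99% chance feels like 91% (certainty effect)
```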

Applying the Concept


• Hold the flame retardant. You’ve probably noticed all kinds of
businesses attempting to upsell you, be it a restaurant—Do you want
fries with that?—or a mattress store. The latter often uses narrow
framing to try to convince you to pay for a flame retardant spray. Don’t
fall for it. If you broaden the frame, you immediately realize that in the
unlikely event of a fire in your home, even if the spray worked, the
mattress would be so infused with smoke that you’d never want to
sleep on it again. Skip the spray and buy a carbon monoxide detector
instead.
• You’re getting a 10 percent raise! And then you find out your coworker,
with the same experience and responsibilities as you, already makes 20
percent more than you. Your coworker’s salary has become your
reference point, and now you’re not nearly as happy about your raise.
III. THE REMEMBERING SELF AND THE
EXPERIENCING SELF
Kahneman believes people possess two selves: the self that experiences
and the self that reports about it. These two selves are remarkably
distinct. What one actually experiences and what one ultimately
remembers are two very different things—especially when it comes to
happiness.
The experiencing self cannot speak, so the remembering self tells
stories about what happened—stories that typically neglect duration in
favor of the emotional peaks that stand out: the brief argument that
spoiled a couple’s otherwise long and pleasant evening dining out with
lifelong friends. Endings, in particular, get a lot of emphasis from the
remembering self: a delicious dessert that makes up for an otherwise
mediocre meal. Ultimately, one’s future preferences are shaped by
distorted accounts of one’s experiencing self: the couple prefer to spend
future evenings dining with less combative friends but decide to give the
mediocre restaurant another try. Memory is not the ideal resource to
consult to review the past for decisions about the future.
Most people remember emotional highs and lows that don’t reflect
their actual experience over the course of the event. In the preceding
examples, the couple enjoyed most of the long evening before the
argument, and the diner endured a mediocre meal before the delicious
dessert; in both cases the remembering self lets a brief peak or ending
outweigh the much longer experience. Ideally, one would like to extend
pleasures and limit pains. However, System 1’s weak grasp of duration
makes it difficult to identify long pleasures and short pains.
The experiencing self navigates one’s life. Meanwhile, the
remembering self constructs stories to look back on. In these stories and
in studies that reflect people’s evaluations of their entire lives, peaks and
ends count much more than duration. Whether a life is relatively long or
short, most people believe what is important is how one feels at the end.
Would a vacation seem worthwhile if all one’s memories of it were wiped
out at the end? Or would one willingly undergo surgery without
anesthetic as long as an amnesia-inducing drug were given at the end?
People are largely indifferent to the feelings of the experiencing self. The
remembering self is who one is.
According to Kahneman, happiness studies have traditionally focused
on life satisfaction, a function of the remembering self. To measure the
well-being of the experiencing self, or one’s experienced well-being,
Kahneman developed the U-index. The U-index measures the percentage
of time one spends per day in an unpleasant state: four hours of a
sixteen-hour day in an unpleasant state gives a U-index of 25 percent. If
two of those hours are spent commuting to and from work, one could
move closer to the office and cut that U-index in half. But happiness is
complicated. Focusing on experienced well-being to measure happiness
ignores the importance of life satisfaction to the remembering self. And
focusing on one’s overall life satisfaction ignores the importance of
experienced well-being.
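Because the U-index is simply unpleasant time divided by waking time,
the chapter’s numbers can be verified in a couple of lines. This is a
minimal sketch; the sixteen-hour waking day is the book’s own example.

```python
def u_index(unpleasant_hours: float, waking_hours: float = 16.0) -> float:
    """Percentage of waking time spent in an unpleasant state."""
    return 100.0 * unpleasant_hours / waking_hours

print(u_index(4))  # 25.0 -- four unpleasant hours in a sixteen-hour day
print(u_index(2))  # 12.5 -- eliminate a two-hour commute, halve the index
```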
Sometimes one thinks buying something, like a new car, will bring
lasting happiness; Kahneman calls such predictions of future feelings
affective forecasting and points out that the thrill fades with time. In
another example, maybe moving to a warmer climate will make one
happy. However, this is a focusing illusion that overweights climate and
underweights other determinants of well-being.
As for whether or not money buys happiness, Kahneman has found that
beyond about seventy-five thousand dollars, household income had no
effect on experienced well-being. He believes this is because wealth
actually reduces the impact of life’s smaller pleasures. Higher income
earners do report greater life satisfaction; however, Kahneman believes
life satisfaction and experienced well-being are two different things.
To Kahneman, scientific advances in understanding happiness have
made it only more puzzling. Therefore, he has come to accept a hybrid
view of happiness that considers the well-being of both the experiencing
and remembering selves.

Examples from Thinking, Fast and Slow


• The peak rule and duration neglect. A man enjoyed listening to a long
symphony that had a cacophonous ending due to a bad recording.
Later, he judges the whole experience negatively, but he is confusing the
jarring conclusion with the actual experience. The most intense
moment—the emotional peak—was negative, and it has distorted his
memory. This is the peak rule. Letting those few jarring seconds outweigh
the long stretch of enjoyment, so that the duration of the pleasure barely
counts in his evaluation, is duration neglect.
• U-indexing the world. Knowing one’s U-index could help one decide to
spend more time doing enjoyable things. Ideally, this means reducing
the duration of unpleasant activities, such as commuting to work
(which many people report makes them unhappy), as well as of passive
pleasure, such as TV watching, and increasing active leisure, such as
socializing and exercise. Improving the U-index for society as a whole
would mean providing good public transportation, child care, and more
socializing for the elderly. A worldwide U-index reduction of 1 percent
would mean millions of hours less of global suffering.

Applying the Concept


• Reading the peaks. You have hit your sales targets for the eighth week
in a row, but your boss just told you overall sales are way down this
week. Wait until overall sales improve before asking about your bonus.
• Improve your U-index. You feel happier after some exercise, which you
enjoy even though you never find the time for it. Set your alarm a half
hour earlier in the morning and walk partway to work. You’ll improve
your U-index and your health.

Key Takeaways
• Kahneman articulates a dual-process model of human thinking: automatic,
intuitive, always-on System 1 makes most everyday decisions and
choices. But it is not good at solving problems because it overly
approximates and is prone to illusions, biases, and mistakes. Slow,
deliberate, analytical System 2 is the better problem solver. But it is lazy
and reluctant to question System 1’s conclusions, especially when they
seem plausible, cohesive, and convincing. Developing awareness of
these two systems and noticing them at work in others can help correct
and prevent mistakes.
• With these two systems in mind, it becomes readily apparent that
humans are not Econs. Humans are not always rational: System 1’s
intuitions and emotions play a huge part in decision making and risk
assessment, leading to mistakes from overconfidence, overweighting
probability, and overestimating risk, even among experts. By simply
slowing down before making important decisions and seeking
assistance from System 2, one can broaden the frame for a problem,
gather pertinent data, arrive at a decision, reflect upon it, and review it
for accuracy.
• Not only do we have two systems, we also have two selves—the
experiencing self and the remembering self—and they can give
surprisingly different reports on our sense of well-being. Aligned with
System 1, the remembering self reports inaccurately, according
disproportionate weight to emotional peaks and ignoring significantly
longer periods of time. It is our experiencing self, operating at an ever-
increasing distance from our remembering self, that knows the score.
One’s account of his or her happiness can be unreliable, unless it
accounts for both selves.
CONCLUSION

A Final Word

Kahneman believes decision-making research has practical applications.


The concept of considering the well-being of the two selves—the
experiencing self and the remembering self—for example, contributes to
richer discussions about public policy, medicine, and welfare. And if
countries included a measure of well-being in national statistics,
Kahneman suggests, it could help reduce human suffering globally.
Citizens might be guided toward better decision making and more
accurate judgments if behavioral economics were applied to national policy
making. This is happening already, in the United States and abroad.
On an individual level, the ability to diagnose flaws in decision
making has immediate personal value. For the same reasons that
organizations tend to make better decisions than individuals (they move
more slowly and employ useful checklists), anyone can prevent errors in
their own judgment. By
simply slowing down before making important decisions and seeking
assistance from System 2, one can broaden the frame for a problem,
gather pertinent data, arrive at a decision, reflect upon it, and review it for
accuracy.
You have now glimpsed the culmination of Nobel Prize–winner
Daniel Kahneman’s many years of work on human thinking. You have
learned the key concepts of Thinking, Fast and Slow, but you haven’t
experienced Kahneman’s humor or read his many telling anecdotes
firsthand. Now that this review has broken down some of the complex
ideas in Kahneman’s book, you will get even more out of reading the
complete book. Your System 2 will thank you.
Key Terms

anchoring the skewing of a person’s estimate by a previously stated
value. When a particular value is attached to a quantity
or object before one has the chance to make an estimate (e.g., at a fine-
art auction or a used-car lot), one’s subsequent estimate will hew close to
the original value, anchoring the perceived value. This occurs because
lazy System 2 too readily accepts data from System 1, which is
susceptible to the powerful bias that anchors produce. Even experienced
real estate agents, when asked to estimate a house price after hearing a
high anchor, produce inflated estimates. Kahneman suggests training
System 2 to adjust away from the anchor. Studies have shown that simply
shaking one’s head upon hearing an anchor can help mitigate its effect.
cognitive ease a relaxed and gullible state in which one can become
susceptible to illusions. Good moods reign, one’s guard is down, and
trusting, positive, superficial thoughts rush in, including a propensity for
liking what one sees and hears. Repetition becomes appealing, even
though it produces illusions of familiarity (e.g., a name seen in print
multiple times begins to seem familiar). System 1 predominates.
cognitive illusions the errors of intuitive thought produced by
quick-thinking System 1. There are many types of cognitive illusions:
biases, priming, the anchoring effect, the illusion of validity, the illusion
of skill, the conjunction fallacy, the disposition effect, etc. Kahneman
explains all of these and more in great detail. The best way to counteract
and avoid these mistakes in System 1 thinking is to actively engage
System 2; but since constant vigilance over one’s every waking thought is
impossible, the best one can do is become aware of the most likely
situations where cognitive illusions can cause errors of judgment.
cognitive strain a state in which one is vigilant, critical, and analytical,
and which leads to rational thinking. When unhappiness or cognitive strain triggers
System 2 into action, vigilance takes the place of ease, and there is an
effort to ward off a perceived threat. One is more likely to reject System
1’s easy answers, and System 2 shifts from intuitive to analytic thinking.
The accuracy of one’s thoughts depends on cognitive ease or cognitive
strain.
duration neglect giving the length of an experience too little weight in
one’s memory of it, so that future decisions rest on a distorted account of
the experience. An idyllic monthlong boat cruise that ended with the ship
losing power during the final two days will be remembered badly. Most of
the cruise was pleasurable, but if the calamitous last two days cause a
person to decide never to go on a cruise vacation again, he or she is
committing duration neglect.
Econs Kahneman’s term for standard economists’ portrayal of human
beings as rational agents who act in their own best interest. Kahneman’s
two-system model presents a deep challenge to this rationality
assumption. System 1 features, such as biases, illusions, and framing
effects, ensure that when making choices, humans deviate from rules of
rationality.
experiencing self the self that navigates one’s life. Kahneman believes
people possess two selves: the self that experiences and the self that
reports about it. The experiencing self cannot speak, so the remembering
self tells stories about what happened—stories that typically neglect
duration in favor of the emotional peaks that stand out. Ultimately, one’s
preferences are shaped by distorted accounts of one’s experiencing self.
framing the deliberate and selective presentation of information meant
to elicit a specific emotional response. For example, describing sausages
as 90 percent fat-free makes them seem more healthful than saying they
contain 10 percent fat; both convey the same facts, but the distinct
framing induces much different results. Kahneman discusses different
types of framing: narrow framing, wherein one focuses one’s attention on
simple decisions and risks losing sight of the big picture; broad framing,
which pulls back to consider the bigger picture; and emotional framing,
which can be an effective way to manipulate decisions in everything from
health care to real estate to online dating. People rarely have the chance
to discover which of their preferences are frame-bound rather than reality-
bound.
halo effect the tendency to associate only positive qualities with
someone who makes a positive first impression—even if one knows very
little about the person—and vice versa. Kahneman noticed he graded his
psychology students uniformly over the course of a term; e.g., “A”
students tended to continue to get As on subsequent papers. When he
read their next papers without looking at their names first, the trend
changed. He realized the initial grades had produced a halo effect.
Moreover, even when the facts are in, the halo effect persists. When
Kahneman realized the halo effect had influenced his grading, he still had
the strong urge to hew to the initial grade (but resisted it).
heuristics mental shortcuts and rules of thumb that one tends to rely on
when making decisions and judgments. For example, in search of
coherence and plausibility, System 1 frequently relies on the most readily
available information, rather than the best or most ample information,
when making a decision, prediction, or estimate. This is the availability
heuristic.
peak rule an extremely positive or negative emotional peak that shapes
one’s memory of an experience; for instance, the burst balloon that
frightens a child and ruins her birthday party, or the delicious
complimentary dessert that makes up for an overcooked steak at a
restaurant.
prospect theory a psychological theory developed by Amos Tversky and
Daniel Kahneman concerned with how people make decisions about
simple gambles. Prospect theory is one of the foundations of behavioral
economics, and Kahneman was awarded the Nobel Prize for his
contribution to this field’s development. Unlike utility theory, prospect
theory takes the reference state into consideration. It defines outcomes
as gains and losses rather than as states, and reveals how losses have a
bigger subjective impact than do gains.
reference point the value one already has in mind, relative to which one
evaluates losses or gains. One will be happier about a
raise if it exceeds the amount one expected to get, and vice versa.
Kahneman explains how utility theory, a prevailing theory in standard
economics, fails to take into account reference points.
remembering self the self that tells the story of one’s experiences and, as
Kahneman puts it, is who one is. Aligned with System 1, the
remembering self reports inaccurately, according disproportionate weight
to emotional peaks and ignoring significantly longer periods of time.
Kahneman believes people possess two selves: the experiencing self and
the self that reports about it.
System 1 the fast and intuitive system of thought. Fast thinking engages
the automatic mental activities of perception and memory. System 1
tends to have biases and relies on the most readily available answers,
which can cause judgment errors System 2 can’t detect. Psychologists
Keith Stanovich and Richard West originated the terms System 1 and
System 2 to describe the brain’s two-system thought process. Kahneman
personifies System 1 and System 2 as agents. According to Kahneman,
most beliefs
and choices begin with the automatic impressions and beliefs of System
1, “the hero of the book.”
System 2 the slow and deliberate system of thought. Kahneman believes
most people self-identify with System 2, which concentrates, reasons,
performs complex mental tasks, makes choices, and is in charge of self-
control. System 2 is too slow and effortful to sort through every decision,
and so System 1 and System 2 end up compromising.
U-index a measure Kahneman created to gauge the percentage of time a
person (or a society) spends per day in an unpleasant state: four hours of a
sixteen-hour day in an unpleasant state gives a U-index of 25 percent.
Ideally, one’s U-index is low. Knowing one’s U-index
could help one decide to spend more time doing enjoyable things. Ideally,
this means reducing the duration of unpleasant activities (e.g.,
commuting) as well as of passive pleasure (e.g., TV watching), and
increasing active leisure (e.g., socializing and exercise). U-indexes can be
used to consider ways of reducing suffering on a societal level—for
example, improving public transportation, paving roads, and providing
affordable health care and education can improve the U-index of a society
that suffers from the lack of such public goods.
utility theory a theory developed in 1738 by Swiss scientist Daniel
Bernoulli, who observed that people make financial choices based not on
particular dollar values, but on the psychological values of outcomes,
their utilities. This theory assumes that the utility of one’s wealth is what
makes one more or less happy. But, as Kahneman explains in Thinking,
Fast and Slow, utility theory looks at only the state of wealth to determine
its utility and fails to account for reference points.
WYSIATI Kahneman’s acronym for What You See Is All There Is. System
1 is not good at dealing with the unknown; it prefers to create plausible
stories out of accessible data, which, as long as it coheres, System 2
accepts as true. If information is scarce or of poor quality, System 1
simply jumps to conclusions on the basis of the most readily available
data. In this way, what is most apparent and accessible becomes all that
we see. WYSIATI engenders cognitive ease; it enables us to think fast and
to process complex information quickly. WYSIATI is also at the root of the
many cognitive illusions Kahneman identifies, including overconfidence,
framing, and base-rate neglect.
Recommended Reading

In addition to Thinking, Fast and Slow (Farrar, Straus and Giroux, 2011),
the following books—some mentioned by Kahneman—touch on similar
themes and subjects, and are well worth exploring:

Dan Ariely, Predictably Irrational: The Hidden Forces That Shape Our
Decisions (HarperCollins, 2008)
Psychologist and behavioral economist Dan Ariely aims to reveal the
systematic mistakes people make when they believe they are deciding
rationally, in the hope that readers will learn to avoid them.

Christopher Chabris and Daniel Simons, The Invisible Gorilla: How Our
Intuitions Deceive Us (Crown Publishers, 2010)
Psychologists Christopher Chabris and Daniel Simons reveal that people
may think they know how their minds work, but they are often surprised
to discover this isn’t the case. They invented the now famous Invisible
Gorilla Test, which reveals that when people focus on one thing, they
frequently overlook something else that is happening at the same time—
such as a gorilla walking across the frame of a video of people playing
basketball.

Atul Gawande, The Checklist Manifesto: How to Get Things Right
(Metropolitan Books, 2009)
Surgeon and journalist Atul Gawande’s The Checklist Manifesto reveals
the importance of organization and preplanning in everything from
medicine to disaster recovery to business.
James Surowiecki, The Wisdom of Crowds: Why the Many Are Smarter
than the Few and How Collective Wisdom Shapes Business, Economies,
Societies, and Nations (Doubleday, 2004)
Touching on economics and psychology, New Yorker columnist James
Surowiecki’s The Wisdom of Crowds argues that decisions based on
information aggregated across a group often produce better results than
individual decision making.

Nassim Nicholas Taleb, The Black Swan: The Impact of the Highly
Improbable (Random House, 2007)
Epistemologist Nassim Nicholas Taleb focuses on rare, unpredictable,
high-impact events, which he calls black swans, and on the human
tendency to construct overly simplistic explanations for them after the
fact.

Richard H. Thaler and Cass R. Sunstein, Nudge: Improving Decisions
About Health, Wealth, and Happiness (Penguin, 2008)
Behavioral economists Richard Thaler and Cass Sunstein’s Nudge
focuses on choices, looking at how people make them and how they can
improve them. It has influenced national policies that “nudge” people
toward good decisions without compromising their sense of freedom
(e.g., employee pension plans with opt-out, rather than opt-in, boxes).

Timothy D. Wilson, Strangers to Ourselves: Discovering the Adaptive
Unconscious (Harvard University Press, 2002)
Psychologist Timothy D. Wilson reveals the hidden mental processes that
people use to evaluate the world, make goals, and take action. Wilson
demonstrates that people’s actions and others’ impressions of them can
reveal more about who they are than they realize.
Bibliography

David Brooks, “Who You Are,” New York Times, October 20, 2011.
http://www.nytimes.com/2011/10/21/opinion/brooks-who-you-are.html?_r=0

Christopher F. Chabris, “Why the Grass Seems Greener,” Wall Street Journal, October 22, 2011.
http://online.wsj.com/article/SB10001424052970204479504576639032103005502.html

Freeman Dyson, “How to Dispel Your Illusions,” New York Review of Books, December 22, 2011.
http://www.nybooks.com/articles/archives/2011/dec/22/how-dispel-your-illusions/?pagination=false

William Easterly, “‘Thinking, Fast and Slow’: Why Even Experts Must Rely on Intuition and Often Get It Wrong,” Financial Times, November 5, 2011.
http://www.ft.com/intl/cms/s/2/15bb6522-04ac-11e1-91d9-00144feabdc0.html#axzz2Q5Y6diDv

Evan R. Goldstein, “The Anatomy of Influence,” Chronicle Review, November 8, 2011.
http://chronicle.com/article/the-anatomy-of-influence/129688/

Jim Holt, “Two Brains Running,” New York Times Book Review, November 25, 2011.
http://www.nytimes.com/2011/11/27/books/review/thinking-fast-and-slow-by-daniel-kahneman-book-review.html?pagewanted=all

Daniel Kahneman and Amos Tversky, “Judgment under Uncertainty: Heuristics and Biases,” Science 185, no. 4157 (September 1974): 1124–31.

Daniel Kahneman and Amos Tversky, “Prospect Theory: An Analysis of Decision Under Risk,” Econometrica 47, no. 2 (March 1979): 263–92.

David K. Levine, “Thinking Fast and Slow and Poorly and Well,” Huffington Post, September 22, 2012.
http://www.huffingtonpost.com/david-k-levine/thinking-fast-and-slow-an_b_1906061.html

Michael Lewis, “The King of Human Error,” Vanity Fair, December 2011.
http://www.vanityfair.com/culture/features/2011/12/michael-lewis-201112

“Not So Smart Now: The Father of Behavioural Economics Considers the Feeble Human Brain,” Economist, October 29, 2011.
http://www.economist.com/node/21534752

Maria Popova, “The Anti-Gladwell: Kahneman’s New Way to Think about Thinking,” Atlantic, November 1, 2011.
http://www.theatlantic.com/health/archive/2011/11/the-anti-gladwell-kahnemans-new-way-to-think-about-thinking/247407/

Christopher Shea, “‘Thinking, Fast and Slow,’ by Daniel Kahneman,” Washington Post, December 16, 2011.
http://articles.washingtonpost.com/2011-12-16/entertainment/35285045_1_daniel-kahneman-israeli-army-amos-tversky

Janice Gross Stein, “‘Thinking, Fast and Slow,’ by Daniel Kahneman,” Globe and Mail, November 25, 2011.

Galen Strawson, “‘Thinking, Fast and Slow’ by Daniel Kahneman—Review,” Guardian, December 13, 2011.
http://www.guardian.co.uk/books/2011/dec/13/thinking-fast-slow-daniel-kahneman

“Thinking Fast and Slow,” Publishers Weekly, October 3, 2011.
http://www.publishersweekly.com/978-0-374-27563-1
