Mental Models List


General Thinking Concepts (11)

1) Inversion and The Power of Avoiding Stupidity


“All I want to know is where I'm going to die, so I'll never go there.” That thinking was inspired by the
German mathematician Carl Gustav Jacob Jacobi, famous for some work on elliptic functions. Jacobi often
solved difficult problems by following a simple strategy: “man muss immer umkehren” (or loosely
translated, “invert, always invert.”)
“[Jacobi] knew that it is in the nature of things that many hard problems are best solved when they are
addressed backward,” Munger counsels.
While Jacobi applied inversion mostly to mathematics, the model is one of the most powerful mental
models in our toolkit.
It is not enough to think about difficult problems one way. You need to think about them forwards and
backward. Inversion often forces you to uncover hidden beliefs about the problem you are trying to solve.
“Indeed,” says Munger, “many problems can't be solved forward.”
Let's take a look at some examples. Say you want to improve innovation in your organization. Thinking
forward, you'd think about all of the things you could do to foster innovation. If you look at the problem by
inversion, however, you'd think about all the things you could do that would discourage innovation. Ideally,
you'd avoid those things. Sounds simple, right? I bet your organization does some of those ‘stupid’ things today. Another example: rather than think about what makes a good life, you can think about what prescriptions would ensure misery. Avoiding stupidity is easier than seeking brilliance.
While both thinking forward and thinking backward result in some action, you can think of them as additive
vs. subtractive.
Despite our best intentions, thinking forward increases the odds of causing harm (iatrogenics).
Thinking backward, call it subtractive avoidance or inversion, is less likely to cause harm.
Inverting the problem won't always solve it, but it will help you avoid trouble. You can think of it as the
avoiding stupidity filter. It's not sexy but it's a very easy way to improve.
So what does this mean in practice?
Spending time thinking about the opposite of what you want doesn't come naturally to most people. And yet many of the smartest people in history have done this naturally.
Inversion helps improve understanding of the problem. By making you do the work necessary to have an opinion, it forces you to consider different perspectives.
If you're to take anything away from inversion let it be this: Spend less time trying to be brilliant and more
time trying to avoid obvious stupidity. The kicker? Avoiding stupidity is easier than seeking brilliance.

2) Falsification: How to Destroy Incorrect Ideas


“The human mind is a lot like the human egg, and the human egg has a shut-off device. When one sperm
gets in, it shuts down so the next one can’t get in.” — Charlie Munger
Sir Karl Popper wrote that the nature of scientific thought is that we can never be sure of anything. The only way to test the validity of any theory is to try to prove it wrong, a process he labeled falsification. And it
turns out we're quite bad at falsification. When it comes to testing a theory we don't instinctively try to find
evidence we're wrong. It's much easier and more mentally satisfying to find information that proves our
intuition. This is known as the confirmation bias.
In his book How Children Succeed: Grit, Curiosity, and the Hidden Power of Character, Paul Tough tells the story of English psychologist Peter Cathcart Wason, who came up with an “ingenious experiment to demonstrate our natural tendency to confirm rather than disprove our own ideas.”
Subjects were told that they would be given a series of three numbers that followed a certain rule known
only to the experimenter. Their assignment was to figure out what the rule was, which they could do by
offering the experimenter other strings of three numbers and asking him whether or not these new strings
met the rule.
The string of numbers the subjects were given was quite simple: 2-4-6
Try it: What’s your first instinct about the rule governing these numbers? And what’s another string you
might test with the experimenter in order to find out if your guess is right? If you’re like most people, your
first instinct is that the rule is “ascending even numbers” or “numbers increasing by two.” And so you guess
something like: 8-10-12
And the experimenter says, “Yes! That string of numbers also meets the rule.” And your confidence rises.
To confirm your brilliance, you test one more possibility, just as due diligence, something like: 20-22-24
“Yes!” says the experimenter. Another surge of dopamine. And you proudly make your guess: “The rule is:
even numbers, ascending in twos.” “No!” says the experimenter. It turns out that the rule is “any ascending
numbers.” So 8-10-12 does fit the rule, it’s true, but so does 1-2-3. Or 4-23-512. The only way to win the
game is to guess strings of numbers that would prove your beloved hypothesis wrong—and that is
something each of us is constitutionally driven to avoid.
In the study, only one in five people was able to guess the correct rule.
And the reason we’re all so bad at games like this is the tendency toward confirmation bias: It feels much
better to find evidence that confirms what you believe to be true than to find evidence that falsifies what you
believe to be true. Why go out in search of disappointment?
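To make the asymmetry concrete, here is a minimal sketch of the 2-4-6 game in Python. The function names and test strings are my own illustrative choices, not part of Wason's experiment; the point is that a tester who only proposes strings their hypothesis predicts will pass can never discover the hypothesis is wrong, while a tester who also proposes strings it predicts will fail can.

```python
# The experimenter's hidden rule: any ascending numbers.
def hidden_rule(triple):
    a, b, c = triple
    return a < b < c

# The subject's hypothesis: even numbers, ascending in twos.
def hypothesis(triple):
    a, b, c = triple
    return a % 2 == 0 and b == a + 2 and c == b + 2

# Confirming strategy: only test strings the hypothesis says should fit.
confirming_tests = [(8, 10, 12), (20, 22, 24)]

# Falsifying strategy: also test strings the hypothesis says should NOT fit.
falsifying_tests = [(8, 10, 12), (1, 2, 3), (4, 23, 512), (6, 4, 2)]

for name, tests in [("Confirming", confirming_tests), ("Falsifying", falsifying_tests)]:
    print(f"--- {name} tester ---")
    for t in tests:
        predicted = hypothesis(t)
        actual = hidden_rule(t)
        note = "  <-- hypothesis falsified!" if predicted != actual else ""
        print(f"{t}: hypothesis predicts {predicted}, experimenter says {actual}{note}")
```

Run it and the confirming tester never sees a disagreement, so their confidence grows while their rule stays wrong. The falsifying tester sees 1-2-3 pass despite the hypothesis predicting it should fail, which is exactly the disconfirming evidence needed to win the game.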

2b) Confirmation Bias: Why You Should Seek Out Disconfirming Evidence
The Basics
Confirmation bias is our tendency to cherry-pick information that confirms our existing beliefs or ideas. This is also known as myside bias or confirmatory bias. Two people with opposing views on a topic can see the same evidence and come away feeling validated by it. Confirmation bias is pronounced in the case of ingrained, ideological, or emotionally charged views.
Failing to interpret information in an unbiased way can lead to serious misjudgments. By understanding this,
we can learn to identify it in ourselves and others. We can be cautious of data that seems to immediately
support our views. “What the human being is best at doing is interpreting all new information so that
their prior conclusions remain intact.” — Warren Buffett
When we feel as if others “cannot see sense,” a grasp of how confirmation bias works can enable us to understand why. Willard V. Quine and J.S. Ullian described this bias in The Web of Belief as follows:
The desire to be right and the desire to have been right are two desires, and the sooner we separate them the
better off we are. The desire to be right is the thirst for truth. On all counts, both practical and theoretical,
there is nothing but good to be said for it. The desire to have been right, on the other hand, is the pride that
goeth before a fall. It stands in the way of our seeing we were wrong, and thus blocks the progress of our
knowledge.
Experimentation beginning in the 1960s revealed our tendency to confirm existing beliefs rather than question them or seek new ones. Other research has revealed our single-minded drive to reinforce the ideas we already hold.
Like many mental models, confirmation bias was first identified by the ancient Greeks. In The History of the Peloponnesian War, Thucydides described the tendency as follows:
For it is a habit of humanity to entrust to careless hope what they long for, and to use sovereign reason to
thrust aside what they do not fancy.
Our use of this cognitive shortcut is understandable. Evaluating evidence (especially when it is complicated
or unclear) requires a great deal of mental energy. Our brains prefer to take shortcuts. This saves the time
needed to make decisions, especially when we're under pressure. As many evolutionary scientists have
pointed out, our minds are unequipped to handle the modern world. For most of human history, people
experienced very little new information during their lifetimes. Decisions tended to be survival based. Now,
we are constantly receiving new information and have to make numerous complex choices each day. To
stave off overwhelm, we have a natural tendency to take shortcuts.
In “The Case for Motivated Reasoning,” Ziva Kunda wrote, “we give special weight to information that
allows us to come to the conclusion we want to reach.” Accepting information that confirms our beliefs is
easy and requires little mental energy. Contradicting information causes us to shy away, grasping for a
reason to discard it. In The Little Book of Stupidity, Sia Mohajer wrote:
The confirmation bias is so fundamental to your development and your reality that you might not even
realize it is happening. We look for evidence that supports our beliefs and opinions about the world but
excludes those that run contrary to our own… In an attempt to simplify the world and make it conform to
our expectations, we have been blessed with the gift of cognitive biases.
“The human understanding when it has once adopted an opinion draws all things else to support and
agree with it. And though there be a greater number and weight of instances to be found on the other
side, yet these it either neglects and despises, or else by some distinction sets aside and rejects.”
— Francis Bacon
How Confirmation Bias Clouds Our Judgment
The complexity of confirmation bias arises partly from the fact that it is impossible to overcome without an awareness of the concept. Even when shown evidence that contradicts a biased view, we may still interpret it in a manner that reinforces our current perspective.
In one Stanford study, half of the participants were in favor of capital punishment, and the other half were
opposed to it. Both groups read details of the same two fictional studies. Half of the participants were told
that one study supported the deterrent effect of capital punishment and the other study opposed it. The other
participants read the inverse information. At the conclusion of the study, the majority of participants stuck to
their original views, pointing to the data that supported it and discarding that which did not.
Confirmation bias clouds our judgment. It gives us a skewed view of information, even when it consists only
of numerical figures. Understanding this cannot fail to transform a person’s worldview — or rather, our
perspective on it. Lewis Carroll stated, “we are what we believe we are,” but it seems that the world is also
what we believe it to be.
A poem by Shannon L. Alder illustrates this concept:
Read it with sorrow and you will feel hate.
Read it with anger and you will feel vengeful.
Read it with paranoia and you will feel confusion.
Read it with empathy and you will feel compassion.
Read it with love and you will feel flattery.
Read it with hope and you will feel positive.
Read it with humor and you will feel joy.
Read it without bias and you will feel peace.
Do not read it at all and you will not feel a thing.
Confirmation bias is somewhat linked to our memories (similar to availability bias). We have a penchant for
recalling evidence that backs up our beliefs. However neutral the original information was, we fall prey to
selective recall. As Leo Tolstoy wrote:
The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of
them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly
persuaded that he knows already, without a shadow of doubt, what is laid before him.
“Beliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by
evidence that most uncommitted observers would agree logically demands some weakening of such
beliefs. They can even survive the destruction of their original evidential bases.”
— Lee Ross and Craig Anderson
Why We Ignore Contradicting Evidence
Why is it that we struggle to even acknowledge information that contradicts our views? When first learning about the existence of confirmation bias, many people deny that they are affected. After all, most of us see ourselves as intelligent, rational people. So, how can our beliefs persevere even in the face of clear empirical evidence? Even when something is proven untrue, many entirely sane people continue to find ways to mitigate the subsequent cognitive dissonance.
Much of this is the result of our need for cognitive consistency. We are bombarded by information. It comes
from other people, the media, our experience, and various other sources. Our minds must find means of
encoding, storing, and retrieving the data we are exposed to. One way we do this is by developing cognitive
shortcuts and models. These can be either useful or unhelpful.
Confirmation bias is one of the less helpful heuristics that exist as a result. The information we interpret is influenced by our existing beliefs, meaning we are more likely to recall it. As a consequence, we tend to see more evidence that reinforces our worldview. Confirmatory data is taken seriously, while
disconfirming data is treated with skepticism. Our general assimilation of information is subject to deep
bias. To constantly evaluate our worldview would be exhausting, so we prefer to strengthen it. It can also be
difficult to consider multiple ideas at once, making it simpler to focus on just one.
We ignore contradictory evidence because it is so unpalatable for our brains. According to research by
Jennifer Lerner and Philip Tetlock, we are motivated to think in a critical manner only when held
accountable by others. If we are expected to justify our beliefs, feelings, and behaviors to others, we are less
likely to be biased towards confirmatory evidence. This is less out of a desire to be accurate, and more the
result of wanting to avoid negative consequences or derision for being illogical. Ignoring evidence can be
beneficial, such as when we side with the beliefs of others to avoid social alienation.
Examples of Confirmation Bias in Action
Creationists vs. Evolutionary Biologists
A prime example of confirmation bias can be seen in the clashes between creationists and evolutionary biologists. The latter use
scientific evidence and experimentation to reveal the process of biological evolution over millions of years.
The former see the Bible as being true in the literal sense and think the world is only a few thousand years
old. Creationists are skilled at mitigating the cognitive dissonance caused by factual evidence that disproves
their ideas. Many consider the non-empirical “evidence” for their beliefs (such as spiritual experiences and
the existence of scripture) to be of greater value than the empirical evidence for evolution.
Evolutionary biologists have used fossil records to prove that the process of evolution has occurred over
millions of years. Meanwhile, some creationists view the same fossils as planted by a god to test our beliefs.
Others claim that fossils are proof of the global flood described in the Bible. They ignore the evidence that contradicts these conspiratorial ideas, or else reinterpret it to confirm what they already think.
Doomsayers
Take a walk through London on a busy day and you are pretty much guaranteed to see a doomsayer on a street corner, ranting about the upcoming apocalypse. Return a while later and you will find them still there, announcing that the end has been postponed.
In When Prophecy Fails, Leon Festinger explained the phenomenon this way:
Suppose an individual believes something with his whole heart; suppose further that he has a commitment to
this belief, that he has taken irrevocable actions because of it; finally, suppose that he is presented with
evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual
will frequently emerge, not only unshaken but even more convinced of the truth of his beliefs than ever
before. Indeed, he may even show a new fervor about convincing and converting people to his view.
Music
Confirmation bias in music is interesting because it is actually part of why we enjoy it so much. According to Daniel Levitin, author of This Is Your Brain on Music:
As music unfolds, the brain constantly updates its estimates of when new beats will occur, and takes
satisfaction in matching a mental beat with a real-in-the-world one. Witness the way a group of teenagers
will act when someone puts on “Wonderwall” by Oasis or “Creep” by Radiohead. Or how their parents react
to “Starman” by Bowie or “Alone” by Heart. Or even their grandparents to “The Way You Look Tonight”
by Sinatra or “Non, Je ne Regrette Rien” by Edith Piaf. The ability to predict each successive beat or
syllable is intrinsically pleasurable. This is a case of confirmation bias serving us well. We learn to
understand musical patterns and conventions, enjoying seeing them play out.
Homeopathy
The multibillion-dollar homeopathy industry is an example of mass confirmation bias. Homeopathy was invented in the late 18th century by the German physician Samuel Hahnemann; its modern claim to a mechanism traces to Jacques Benveniste, a French researcher studying histamines. Benveniste became convinced that as a solution of histamines was diluted, its effectiveness increased, due to what he termed “water memories.” His tests were performed without blinding, leaving the results open to the placebo effect. Benveniste was so certain of his hypothesis that he found data to confirm it and ignored that which did not. Other researchers repeated his experiments with appropriate blinding and proved Benveniste’s results to have been false. Many of the people who worked with him withdrew from science as a result.
Yet homeopathy’s supporters have only grown in number, clinging to any evidence that supports it while ignoring evidence that does not.
Scientific Experiments
“One of the biggest problems with the world today is that we have large groups of people who will accept whatever they hear on the grapevine, just because it suits their worldview—not because it is actually true or because they have evidence to support it. The striking thing is that it would not take much effort to establish validity in most of these cases… but people prefer reassurance to research.” — Neil deGrasse Tyson
In good scientific experiments, researchers should seek to falsify their hypotheses, not to confirm them.
Unfortunately, this is not always the case (as shown by homeopathy). There are many cases of scientists
interpreting data in a biased manner, or repeating experiments until they achieve the desired result.
Confirmation bias also comes into play when scientists peer-review studies. They tend to give positive reviews to studies that confirm their own views and to studies already accepted by the scientific community.
This is problematic. Inadequate research programs can continue past the point where evidence points to a
false hypothesis. Confirmation bias wastes a huge amount of time and funding. We must not take science at
face value and must be aware of the role of biased reporting.
“The eye sees only what the mind is prepared to comprehend.” — Robertson Davies
Conclusion
This article can provide an opportunity for you to assess how confirmation bias affects you. Consider looking back over the previous paragraphs and asking:
• Which parts did I automatically agree with?
• Which parts did I ignore or skim over without realizing?
• How did I react to the points I agreed or disagreed with?
• Did this post confirm any ideas I already had? Why?
• What if I thought the opposite of those ideas?
Being cognizant of confirmation bias is not easy, but with practice, it is possible to recognize the role it plays in the way we interpret information. You need to search out disconfirming evidence.
As Rebecca Goldstein wrote in Incompleteness: The Proof and Paradox of Kurt Gödel:
All truths — even those that had seemed so certain as to be immune to the very possibility of revision — are
essentially manufactured. Indeed, the very notion of the objectively true is a socially constructed myth. Our
knowing minds are not embedded in truth. Rather, the entire notion of truth is embedded in our minds,
which are themselves the unwitting lackeys of organizational forms of influence.
To learn more about confirmation bias, read The Little Book of Stupidity or The Black Swan.

3) Understanding your Circle of Competence: How Warren Buffett Avoids Problems


“I’m no genius. I’m smart in spots—but I stay around those spots.” — Tom Watson Sr., Founder of IBM
The concept of the Circle of Competence has been used over the years by Warren Buffett as a way to focus investors on operating only in areas they know best. The bones of the concept appear in his 1996 Shareholder Letter:
What an investor needs is the ability to correctly evaluate selected businesses. Note that word “selected”:
You don’t have to be an expert on every company, or even many. You only have to be able to evaluate
companies within your circle of competence. The size of that circle is not very important; knowing its
boundaries, however, is vital.
Circle of Competence
The Circle of Competence is simple: each of us, through experience or study, has built up useful knowledge of certain areas of the world. Some of these areas are understood by most of us, while others require a lot more specialty to evaluate.
For example, most of us have a basic understanding of the economics of a restaurant: You rent or buy space,
spend money to outfit the place and then hire employees to seat, serve, cook, and clean. (And, if you don’t
want to do it yourself, manage.)
From there it’s a matter of generating enough traffic and setting the appropriate prices to generate a profit on
the food and drinks you serve—after all of your operating expenses have been paid. Though the cuisine,
atmosphere, and price points will vary by restaurant, they all have to follow the same economic formula.
That basic knowledge, along with some understanding of accounting and a little bit of study, would enable one to evaluate and invest in any number of restaurants and restaurant chains, public or private. It’s not all that complicated.
However, can most of us say we understand the workings of a microchip company or a biotech drug
company at the same level? Perhaps not.
But as Buffett so eloquently put it, we do not necessarily need to understand these more esoteric areas to
invest capital. Far more important is to honestly define what we do know and stick to those areas. Our circle
of competence can be widened, but only slowly and over time. Mistakes are most often made when straying
from this discipline.
The Circle of Competence applies outside of investing, too.
Buffett describes the circle of competence of one of his business managers, a Russian immigrant with poor
English who built the largest furniture store in Nebraska:
I couldn’t have given her $200 million worth of Berkshire Hathaway stock when I bought the business
because she doesn’t understand stock. She understands cash. She understands furniture. She understands real
estate. She doesn’t understand stocks, so she doesn’t have anything to do with them. If you deal with Mrs. B
in what I would call her circle of competence… She is going to buy 5,000 end tables this afternoon (if the
price is right). She is going to buy 20 different carpets in odd lots, and everything else like that [snaps
fingers] because she understands carpet. She wouldn’t buy 100 shares of General Motors if it was at 50 cents
a share. It did not hurt Mrs. B to have such a narrow area of competence. In fact, one could argue the
opposite. Her rigid devotion to that area allowed her to focus. Only with that focus could she have overcome
her handicaps to achieve such extreme success.
In fact, Charlie Munger takes this concept outside of business altogether and into the realm of life in general.
The essential question he sought to answer: Where should we devote our limited time in life, in order to
achieve the most success? Charlie’s simple prescription:
You have to figure out what your own aptitudes are. If you play games where other people have the
aptitudes and you don't, you're going to lose. And that's as close to certain as any prediction that you can
make. You have to figure out where you've got an edge. And you've got to play within your own circle of
competence.
If you want to be the best tennis player in the world, you may start out trying and soon find out that it's
hopeless—that other people blow right by you. However, if you want to become the best plumbing
contractor in Bemidji, that is probably doable by two-thirds of you. It takes a will. It takes the intelligence.
But after a while, you'd gradually know all about the plumbing business in Bemidji and master the art. That
is an attainable objective, given enough discipline. And people who could never win a chess tournament or
stand in center court in a respectable tennis tournament can rise quite high in life by slowly developing a
circle of competence—which results partly from what they were born with and partly from what they slowly
develop through work.
So, the simple takeaway here is clear. If you want to improve your odds of success in life and business, define the perimeter of your circle of competence and operate inside it. Over time, work to expand that circle, but never fool yourself about where it stands today, and never be afraid to say “I don’t know.”

4) The Principle of Parsimony (Occam’s Razor)


Occam’s razor (also known as the ‘law of parsimony’) is a problem-solving principle which serves as a
useful mental model. A philosophical razor is a tool used to eliminate improbable options in a given
situation, of which Occam’s is the best-known example.
Occam’s razor can be summarized as such:
Among competing hypotheses, the one with the fewest assumptions should be selected.
In simpler language: among explanations that fit the facts equally well, the simplest is usually the best. Another good explanation of Occam’s razor comes from the paranormal writer William J. Hall: ‘Occam’s razor is summarized for our purposes in this way: Extraordinary claims demand extraordinary proof.’
In other words, we should avoid looking for excessively complex solutions to a problem and focus on what
works, given the circumstances. Occam’s razor is used in a wide range of situations, as a means of making
rapid decisions and establishing truths without empirical evidence. It works best as a mental model for
making initial conclusions before adequate information can be obtained.
A further literary summary comes from one of the best-loved fictional characters, Arthur Conan Doyle’s
Sherlock Holmes. His classic aphorism is an expression of Occam’s razor: “If you eliminate the impossible,
whatever remains, however improbable, must be the truth.”
A number of mathematical and scientific studies have backed up its validity and lasting relevance. In
particular, the principle of minimum energy supports Occam’s razor. This facet of the second law of
thermodynamics states that, wherever possible, the use of energy is minimized. In general, the universe
tends towards simplicity. Physicists use Occam’s razor, in the knowledge that they can rely on everything to
use the minimum energy necessary to function. A ball at the top of a hill will roll down in order to be at the
point of minimum potential energy. The same principle is present in biology. For example, if a person
repeats the same action on a regular basis in response to the same cue and reward, it will become a habit as
the corresponding neural pathway is formed. From then on, their brain will use less energy to complete the
same action.
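As a toy illustration of settling into a minimum, here is a short Python sketch; the potential-energy curve, starting point, and step size are invented for illustration, not drawn from the article. A “ball” repeatedly moves a little way downhill until it comes to rest at the bottom of the valley:

```python
# A minimal sketch of "systems settle at minimum energy", using gradient
# descent on an invented potential-energy curve U(x) = (x - 2)**2 + 1.

def U(x):
    return (x - 2) ** 2 + 1  # a "hill" whose valley floor sits at x = 2

def dU(x):
    return 2 * (x - 2)       # the slope of the hill at position x

x = 5.0                      # start the ball partway up the hill
for _ in range(100):
    x -= 0.1 * dU(x)         # roll a little way downhill each step

print(f"The ball settles near x = {x:.3f}, where U(x) = {U(x):.3f} is minimal.")
```

After enough steps the ball sits at x = 2, the point of minimum potential energy, without ever being told where that point is.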
The concept of Occam’s razor is credited to William of Ockham, a friar, philosopher, and theologian who lived in the 13th and 14th centuries. While he did not coin the term, his characteristic way of making deductions inspired other writers to develop the heuristic. Indeed, the concept is an ancient one, first stated by Aristotle, who wrote that “we may assume the superiority, other things being equal, of the demonstration which derives from fewer postulates or hypotheses.”
Robert Grosseteste expanded on Aristotle's writing in the 1200s, declaring that:
That is better and more valuable which requires fewer, other circumstances being equal… For if one thing
were demonstrated from many and another thing from fewer equally known premises, clearly that is better
which is from fewer because it makes us know quickly, just as a universal demonstration is better than
particular because it produces knowledge from fewer premises. Similarly, in natural science, in moral
science, and in metaphysics the best is that which needs no premises and the better that which needs the
fewer, other circumstances being equal.
Early writings such as this are believed to have led to the eventual (and ironic) simplification of the concept.
Nowadays, Occam’s razor is an established mental model which can form a useful part of a latticework of
knowledge.
Examples of the Use of Occam’s Razor
In theology, Occam’s razor has been used to argue both for and against the existence of God. William of Ockham, being a Christian friar, used his theory to defend religion. He regarded scripture as true in the literal sense and therefore saw it as simple proof: to him, the Bible was synonymous with reality, and to contradict it would conflict with established fact. Many religious people regard the existence of God as the simplest possible explanation for the creation of the universe.
In contrast, Thomas Aquinas engaged with the parsimony argument in his 13th-century work, the Summa Theologica. There he stated it as an objection to God’s existence that had to be answered: ‘it is superfluous to suppose that what can be accounted for by a few principles has been produced by many.’ On this view, the existence of God is a hypothesis that makes a huge number of assumptions compared with naturalistic alternatives. Many modern atheists press the same point, considering the existence of God to be unnecessarily complex, in particular due to the lack of empirical evidence.
Taoist thinkers take Occam’s razor one step further, by simplifying everything in existence to the most basic form. In Taoism, everything is an expression of a single ultimate reality (known as the Tao). This school of religious and philosophical thought believes that the most plausible explanation for the universe is the simplest: everything is both created and controlled by a single force. This can be seen as a profound example of the use of Occam’s razor within theology.
The Development of Scientific Theories
Occam’s razor is frequently used by scientists, in particular for theoretical matters. The simpler a hypothesis
is, the more easily it can be proved or falsified. A complex explanation for a phenomenon involves many
factors which can be difficult to test or lead to issues with the repeatability of an experiment. As a
consequence, the simplest solution which is consistent with the existing data is preferred. However, it is
common for new data to allow hypotheses to become more complex over time. Scientists opt for the simplest solution the current data permits, while remaining open to the possibility of future research allowing for greater complexity.
Failing to observe Occam’s razor is usually a sign of bad science, or of an attempt to cover up a poor explanation.
The version used by scientists can best be summarized as: ‘when you have two competing theories that
make exactly the same predictions, the simpler one is the better.’
Obtaining funding for simpler hypotheses tends to be easier, as they are often cheaper to prove. As a
consequence, the use of Occam’s razor in science is a matter of practicality.
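To see the scientists’ version in miniature, here is a Python sketch; the data points, candidate hypotheses, and parameter counts are invented for illustration. Both candidates make exactly the same predictions on the observed data, so the razor selects the one carrying fewer free parameters:

```python
# The scientists' version of Occam's razor: among hypotheses that fit
# the data equally well, prefer the one with the fewest assumptions
# (here, free parameters). All values below are illustrative inventions.

observations = [(0, 1), (1, 3), (2, 5), (3, 7)]  # (x, y) pairs

hypotheses = {
    # name: (prediction function, number of free parameters)
    "linear: y = 2x + 1": (lambda x: 2 * x + 1, 2),
    "cubic: y = 0x^3 + 0x^2 + 2x + 1": (lambda x: 0 * x**3 + 0 * x**2 + 2 * x + 1, 4),
}

def fits(predict):
    """True if the hypothesis reproduces every observation exactly."""
    return all(predict(x) == y for x, y in observations)

# Keep only hypotheses consistent with the data, then razor away complexity.
consistent = {name: params for name, (predict, params) in hypotheses.items() if fits(predict)}
best = min(consistent, key=consistent.get)

print(f"Hypotheses that fit the data: {sorted(consistent)}")
print(f"Occam's razor prefers: {best}")
```

Nothing here proves the simpler hypothesis true; as the exceptions discussed later show, the razor only decides which consistent explanation to prefer until new data says otherwise.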
Albert Einstein referred to Occam’s razor when developing his theory of special relativity. He formulated
his own version: ‘it can scarcely be denied that the supreme goal of all theory is to make the irreducible
basic elements as simple and as few as possible without having to surrender the adequate representation of a
single datum of experience.’ Or: “everything should be made as simple as possible, but not simpler.” This preference for simplicity can be seen in one of the most famous equations ever devised: E = mc². Rather than a lengthy equation requiring pages of writing, Einstein reduced the necessary factors to the bare minimum. The result is usable and perfectly parsimonious.
The physicist Stephen Hawking advocated for Occam’s razor in A Brief History of Time:
We could still imagine that there is a set of laws that determines events completely for some supernatural
being, who could observe the present state of the universe without disturbing it. However, such models of
the universe are not of much interest to us mortals. It seems better to employ the principle known as
Occam's razor and cut out all the features of the theory that cannot be observed.
Isaac Newton used Occam’s razor too when developing his theories. Newton stated: “we are to admit no
more causes of natural things than such as are both true and sufficient to explain their appearances.” As a
result, he sought to make his theories (including the three laws of motion) as simple as possible, with the
fewest underlying assumptions necessary.
Medicine
Modern doctors use a version of Occam’s razor, holding that they should look for the fewest possible causes to explain a patient’s multiple symptoms, with a preference for the most likely causes. A doctor I know often repeats, “common things are common.” Interns are instructed, “when you hear hoofbeats, think horses, not zebras.” For example, a person displaying influenza-like symptoms during an epidemic is more likely to be suffering from influenza than from an alternative, rarer disease. Making minimal diagnoses reduces the risk of overtreating a patient, or of causing dangerous interactions between different treatments.
This is of particular importance within the current medical model, where patients are likely to see numerous
different health specialists and communication between them can be poor.
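The “horses, not zebras” instruction is really an argument about base rates, and a few lines of arithmetic make it concrete. The prevalences and symptom probabilities below are invented for illustration, not clinical data; the sketch applies Bayes’ rule to show why the common diagnosis wins even when the rare disease produces the symptoms more reliably.

```python
# Base-rate arithmetic behind "think horses, not zebras".
# All numbers are invented for illustration, not clinical data.

p_flu = 0.10                  # prevalence of influenza during an epidemic
p_rare = 0.001                # prevalence of a rare disease with similar symptoms
p_symptoms_given_flu = 0.80   # chance flu produces these symptoms
p_symptoms_given_rare = 0.95  # the rare disease produces them even more reliably

# Unnormalized posteriors via Bayes' rule:
# P(disease | symptoms) is proportional to P(symptoms | disease) * P(disease)
flu_score = p_symptoms_given_flu * p_flu      # 0.080
rare_score = p_symptoms_given_rare * p_rare   # 0.00095

print(f"Flu is ~{flu_score / rare_score:.0f}x more likely than the rare disease.")
```

Even though the rare disease explains the symptoms slightly better, its tiny base rate makes influenza roughly eighty times more probable, so it is the hypothesis to test first.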
Prison Abolition and Fair Punishment
Occam’s razor has long played a role in attitudes towards the punishment of crimes. In this context, it refers
to the idea that people should be given the least punishment necessary for their crimes.
This is to avoid the excessive penal practices that were popular in the past (for example, a Victorian could receive five years of hard labour for stealing a piece of food). The concept of penal parsimony was
pioneered by Jeremy Bentham, the founder of utilitarianism. He stated that punishments should not cause
more pain than they prevent. Life imprisonment for murder could be seen as justified in that it may prevent a
great deal of potential pain, should the perpetrator offend again. On the other hand, long-term imprisonment
of an impoverished person for stealing food causes substantial suffering without preventing any.
Bentham’s writings on the application of Occam’s razor to punishment helped give rise to the prison abolition movement and to our modern ideas of rehabilitation.
Crime Solving and Forensic Work
When it comes to solving a crime, Occam’s razor is used in conjunction with experience and statistical knowledge. A woman is statistically more likely to be killed by a male partner than by anyone else. Should a woman be found murdered in her locked home, the first person police interview is any male partner. The possibility of a stranger entering can be considered, but the simplest solution, making the fewest assumptions, is that the crime was perpetrated by her male partner.
By using Occam’s razor, police officers can solve crimes faster and at lower cost.
Exceptions and Issues
It is important to note that, like any mental model, Occam’s razor is not foolproof and should be used with care, lest you cut yourself. This is especially crucial when it comes to important or risky decisions. There are exceptions to any rule, and we should never blindly follow a mental model that logic, experience, or empirical evidence contradicts. The smartest people are those who know the rules but also know when to ignore them. When you hear hoofbeats behind you, in most cases you should think horses, not zebras (unless you are out on the African savannah).
Simplicity is also subjective. In the example of the NASA moon landing conspiracy theory, some people consider it simpler for the landings to have been faked; others, for them to have been real. When using Occam’s razor to make deductions, we must avoid falling prey to confirmation bias and merely using the razor to back up preexisting notions. The same goes for the theology example mentioned previously: some people consider the existence of God to be the simplest option, while others consider the inverse to be true. Surface simplicity must not be given undue weight when selecting the solution that Occam’s razor points to; a hypothesis can sound simple yet involve more assumptions than a verbose alternative.
Occam’s razor should not be used in place of logic, scientific methods, and personal insight. In the long term, a judgment must be backed by empirical evidence, not just by its simplicity. Lisa Randall best expressed the issues with Occam’s razor in her book, Dark Matter and the Dinosaurs: The Astounding
the issues with Occam’s razor in her book, Dark Matter and the Dinosaurs: The Astounding
Interconnectedness of the Universe:
My second concern about Occam’s Razor is just a matter of fact. The world is more complicated than any of
us would have been likely to conceive. Some particles and properties don’t seem necessary to any physical
processes that matter—at least according to what we’ve deduced so far. Yet they exist. Sometimes the
simplest model just isn’t the correct one.
Harlan Coben has disputed many criticisms of Occam’s razor by stating that people fail to understand its
exact purpose:
Most people oversimplify Occam’s razor to mean the simplest answer is usually correct. But the real
meaning, what the Franciscan friar William of Ockham really wanted to emphasize, is that you shouldn’t
complicate, that you shouldn’t “stack” a theory if a simpler explanation was at the ready. Pare it down.
Prune the excess.
I once again leave you with Einstein: “Everything should be made as simple as possible, but not
simpler.”
Occam’s razor is complemented by other mental models, including the fundamental attribution error, Hanlon’s razor, confirmation bias, the availability heuristic, and hindsight bias. The nature of mental models is that they tend to interlock and work best in conjunction.
