
THE CRITICAL THINKING EFFECT

UNCOVER THE SECRETS OF THINKING CRITICALLY AND TELLING FACT FROM


FICTION

THINKNETIC
Did You Know That 93% Of CEOs Agree That This Skill Is More Important Than Your College Degree?

Here's just a fraction of what you'll discover inside:

How to shortcut the famous Malcolm Gladwell "10,000 Hours Rule" to become an expert critical thinker, fast
What a WW2 pilot and the people of Romania can teach you about critical thinking - this is the KEY to not making huge mistakes
Actionable, easy exercises to drill home every point covered. You won't "read and forget" this book
Our educational system simply doesn't teach us how to think...
...and it's unlikely this is information you've ever learned anywhere else - until now.
A glimpse into what you'll discover inside:

If your thinking is flawed and what it takes to fix it (the solutions are included)
Tried and true hacks to elevate your rationality and change your life for the better
Enlightening principles to guide your thoughts and actions (gathered from the wisest men of all time)



CONTENTS

Introduction

1. What It Means To Be A Critical Thinker In This Day And Age


2. What Keeps Us From Getting To The Truth?
3. Why Having A Scientifically Skeptical Mind Helps You Discover The Truth
4. Why The Media Can Make Or Break Our Thinking
5. Everyday Lies And Deception
6. Pseudoscience Versus Science

Afterword
One Final Word From Us
Continuing Your Journey
References
Disclaimer
INTRODUCTION

Have you ever wondered why there seems to be so much misinformation out there? With the most viral fake news stories reaching up to a hundred times more users on Twitter than true stories, perhaps it is no wonder we have a problem
separating the facts from nonsense. That leads us to another question: who comes up with the facts, and how do
they know what is real? Why do some people believe official or scientific explanations, whereas others prefer to
believe alternatives?
Is deception an unavoidable part of human existence? Can we develop techniques to cope with this influx of
potential deception and get to the truth?
Many of us encountered critical thinking at school or college, but how often do we apply it in everyday life? Even more
importantly, how often do we interact with others who fail to apply a critical thinking approach to their work and
personal lives? Many people happily believe in nonsensical ideas, whether these be innocent misconceptions or
(in some cases) dangerous misinformation. By developing your critical thinking skills even further, you can learn
to deal with these difficult people and present a coherent case for why they should perhaps research their views
further.
A critical thinking mindset also helps us be more efficient and effective at work, enabling us to focus on relevant
information and discard unconnected or inaccurate information. This leaves more time for the things and people
we enjoy. If you strive for balance, it is worth taking the time to hone these skills and learn to apply them in a
broader range of situations.
Many of us are perfectionists, and there is nothing wrong with that. Perfectionists often deliver strong results thanks to their high standards, but why not streamline your perfectionism and achieve the same or better with less effort? That is where the techniques in this book come in.
This book walks you through the details of using critical thinking to separate truth from falsehood. It contains
specific examples, illustrative stories, thorough explanations, and practice exercises; all backed up by a range of
scientific studies conducted by experts in their fields.
Each chapter is self-contained and logically organized, meaning you do not have to read it all in one sitting to
make sense of the contents. You will understand how to discern the truth in a variety of contexts, from online
news to scientific claims, to personal interactions. You will learn more about the scientific method, including
scientific skepticism and how to tell science from non-science.
You will also find out how your mind might block you from figuring out the truth with a discussion of common
biases, heuristics, and fallacies, including ways to overcome them. Knowing more about these will also help you
deal more effectively with difficult people, including those trying to sell you phony ideas or dodgy products.
By learning more about how to sort the truth from the lies, you can reach better conclusions and make better
decisions. You can also help others who do not yet have this knowledge to become more confident in their own judgment.
This book’s author is Camilla J. Croucher. She graduated with a Ph.D. in cognitive science from the University of
Cambridge in 2007 and then completed a postdoctoral fellowship at City, University of London. Her academic
specialisms include emotional memory, visual processing, and judgment and decision-making. She is outstanding
at statistics but does not want to bore you with that. She has also worked in retail, clinical research management,
e-commerce, learning support, and (of course) freelance writing. Her interest in the illusions created by the eye
and mind remained constant across each of these roles, and she very much enjoyed composing this text for
you.
The book is not a definitive guide to thinking critically, and it is not a philosophy text. Instead, it summarizes
reliable research findings and experts’ opinions and suggests techniques and practice exercises that you can use in
your daily life. So, are you ready to explore what it means to be a critical thinker?
1

WHAT IT MEANS TO BE A CRITICAL THINKER IN THIS DAY AND AGE

Chancellor Sham Anderson took a deep breath as she hung up the phone. Marvin Keller, the Chair of the University Finance Committee, was not answering her calls. That morning's big news was the resignation of the Director of Loofberger Inc., sparking a sudden plummet in the company's stock.
Naturally, Anderson now had serious concerns about the University's multi-million dollar investment in that very company. The decision to invest in a new corporation touting a completely novel product was risky, of course. However, the University had held onto the funds despite huge pressure from the science and technology faculty to spend them.
Building a new space research center would allow the University to build probes to look for life on various moons
within our solar system. This would not be possible without a vast amount of money.
The company performed excellently from the outset, suggesting the risk was low. The Finance Committee quickly decided the kudos was worth it. They would invest in Loofberger.
Soon, the first hints emerged that Loofberger was in trouble. The product was not selling as expected. The share
price slowly declined. Nevertheless, Keller assured the Committee that this was a blip and that things would pick
up soon.
Sham startled as her cell phone rang: an unknown number. A somber voice confirmed that Loofberger had gone
bust: the University lost its entire investment.
Hindsight revealed several clues.
Marvin Keller had failed to disclose that his brother-in-law was also Loofberger's Director. Because Keller was a lifelong friend as well as a colleague, Anderson had not thought to question his recommendation. A few months after finalizing the deal, Keller took an extended sabbatical to work on special projects at his lake house.
What about Loofberger's initial promising market data? It turned out to be too good to be true—a fairly clever
fake.
The University never got its space center, and many of those staff relocated to other, less embarrassing
organizations. Anderson’s peers advised her to take early retirement, which she did. The student body voted to
paint the canteen ceiling black in memory of the loss, and little paint chips flaked off, making it look like the night
sky.
The finance expert in this scenario had questionable motives, but nobody had thought to question them. Expert
advice is often trustworthy, but we should not take everything at face value. Ask how you know a source is an expert and how they know what they know.
Making complex decisions is difficult when we have limited information, but it is still important to investigate its
source and content by gathering evidence to support our decisions and conclusions.
Critical thinking is a powerful approach that can help you to make better decisions. Critical thinkers do not
passively receive information. Instead, they apply the rules of logic and their own experience to properly interpret the messages they hear and see [1].

Getting Started With Critical Thinking


Our intuitive reasoning processes work well for everyday purposes, but they may lead us in the wrong direction
when applying them in more complex situations. This is because our brains love a good shortcut. In contrast,
critical thinking reduces shortcuts and automatic processing: it is a self-disciplined process.
That is not to say that critical thinking is completely different from intuitive thinking and reasoning; it may be
more effortful, but anybody can think critically. The principles are fairly simple, but we know that people do not
apply them consistently.
Faulty reasoning is commonplace. People smoke, despite the known health dangers. They gamble to excess,
selling property and losing loved ones to feed their addiction.
Critical thinking is vital to avoid faulty reasoning pitfalls, as it helps us become better at solving problems.
Unfortunately, faulty reasoning can become habitual. For instance, if somebody reasons to themselves that
‘astrology is just a bit of fun, we do not need to prove it like a science,’ that is fine, but what if they apply the
same reasoning to the next pseudoscientific idea? This ‘slippery slope’ can make us vulnerable to more unproven
ideas and even manipulation [2].
Even though critical thinking is composed of simple principles, you must practice to get better. The best critical thinkers share several common traits [3]:
They ask questions: They identify relevant information and pair it with abstract ideas. They draw valid conclusions and then test them.
They never assume that they got it right the first time: They try different approaches and ways of framing the problem.
They conclude by reasoning, as opposed to rationalizing: Reasoning means using logic to reach a conclusion, whereas rationalizing means finding a logic that fits the conclusion [1]. In other words, rationalizing is reasoning done backward.
They are superb at connecting with others: This is not only about sharing their ideas clearly; it is about listening and cooperation. You can even enhance your critical thinking by working with people with highly developed critical thinking skills [1].
So why is critical thinking important? For one thing, it helps us to distinguish between facts, opinions, claims, and
evidence. We must delineate these four closely connected concepts because others may use them to persuade,
misinform, or even manipulate us.

Facts
What exactly is a fact? More importantly, how do you know something is a fact? What if it is merely an opinion
or a claim?
A fact is a piece of information that we can verify. We can observe it, or we can find out it is true from a reliable
source. For example, the atomic number of carbon is six; the US Civil War took place between 1861 and 1865; Neil Armstrong was the first person to walk on the Moon. These are all facts.
Without diving deep into philosophy, we can note here that ‘truth’ is an abstract idea. Nobody can say for certain
what is real (or fake, come to that). Later, when we look at scientific skepticism, we will examine this in more
detail. For now, let’s assume that truth can exist and that critical thinking at least brings us closer to it.
As a critical thinker, you should inspect any so-called fact you encounter. How do you know it is a fact and not an
assumption? Try to verify the ‘fact’ yourself; it may be an assumption if it cannot be verified. On investigating,
you may find the assumption is incorrect.
In some cases, assumptions are the best we can do. For example, if you were designing a novel product, at first,
you might assume that it would appeal to customers who buy related products. Later, you could gain evidence
using market research.
When you investigate a given fact, you might find that it is outdated or even a total misconception. Scrutinize
facts, and learn to recognize the good facts and reject the bad ones.

Opinions
Opinions may resemble facts, but they are subjective judgments. People often misrepresent opinions as facts,
perhaps because strongly held opinions may even feel factual. Opinions are always evaluative or comparative, even when they take the same form as a fact by stating that something 'is' something. A statement that something is 'the best' must, therefore, be an opinion.
Take this statement:
“Joseph Bloggs is the best downhill skier because they have won the most gold medals.”
This sentence is an opinion based on a fact. You can verify or falsify the fact that they won the most medals by reading medals tables. The opinion that such a fact makes Joseph Bloggs the best downhill skier cannot be verified: it is somebody's perspective.
A new skier may be the best, even though they have not won anything yet. They might be able to beat Joseph Bloggs in every race, but if medal count is your measure of skiing ability, the new skier still cannot be called the best. Which measure counts is itself a matter of opinion.
Our motivations, attitudes, and emotional states have huge effects on our opinions [2,4]. This renders opinions
vulnerable to all sorts of biases; not surprisingly, two people with identical information can very easily hold
opposite opinions. Of course, opinions can change completely over time and need not be based on facts at all.

Claims
Like opinions, claims are often wrongly presented as facts. Claims may turn out to be factual, but a claim is, by definition, an assertion presented without proof. Distinguishing claims from facts is therefore easy; you just need to check whether the source supplies any evidence for the claim.
Claims can be implied rather than stated. ‘Before and after’ photos in beauty adverts are a good example. The
adverts may or may not overtly claim that the treatment improves the skin, but the skin certainly looks healthier in
the ‘after’ photo.
Companies produce adverts to make viewers spend money, not to show them the truth, so advertisers often end up presenting claims as facts. But claims crop up in the wild, too.
Conspiracists claim that mankind did not land on the Moon in 1969 and that NASA faked the mission using camera tricks in a television studio. We can say this is a claim because there is no evidence of the proposed fakery.
A fake Moon landing would entail faking a lot of evidence. Fake technical data and fake filmed footage are only
the beginning. NASA would have had to persuade their entire staff to give fake testimony, not to mention produce fake paperwork.

Evidence
It is not just conspiracy nuts who persist even when faced with overwhelming evidence against their beliefs [2]. At times, we are all guilty of ignoring or misunderstanding evidence. This leads us to an important
question: what exactly is evidence, and how should we use it?
Evidence is an everyday term, but as critical thinkers, we need a more technical definition. Evidence refers to a
body of information that supports a given position.
We typically use evidence to prove or disprove a belief or decide whether a judgment or opinion is valid. Of
course, you need evidence from different sources.
A good body of evidence comes from multiple reliable sources. Imagine overhearing a conversation at a party.
Somebody claims that ‘investments are a great way to make money.’ A successful investor is listening; he nods
enthusiastically and starts bragging about the huge profits he has made. Wouldn’t you want to hear the other side?
The more evidence supports a conclusion, the more likely that conclusion is to be true. You might collect evidence
from pre-existing sources or decide to gather your own.
Picture a range of experts who are interested in why people fall into problem gambling. The medic does not agree
that sociology surveys are the best way to research this, but the sociology professor thinks they are the only way
that makes sense.
In practice, however, the two researchers would simply examine different aspects of addiction. The medic in this example decides to
look at physical differences in the bodies and brains of addicts and non-addicts; perhaps pre-existing variation
predicts who can gamble casually without becoming addicted. In contrast, the sociologist wants to look at
socioeconomic factors like gamblers’ family situations, housing issues, and poverty.
The gambling study could involve neuroscience, interviews with gamblers, big data science, and more, in addition
to surveys and clinical studies. All these approaches are helpful because they look at the problem at different
levels. The resulting body of evidence, taken together and processed according to good logic, could generate more
robust data than the medic or the professor alone. The group can investigate all potential causes of gambling and
compare how well all the different factors predict who becomes a problem gambler.
In conclusion, uncertainty is a good thing because it drives us to examine problems in more depth. You can never
gather all the facts or examine all the evidence. The best you can do is test your ideas and beliefs and improve
them as you go along, based on a wide range of evidence.

Facts Versus Non-Facts


Critical thinkers evaluate everyday information rather than simply absorbing it. You must be clear about the
division between facts and opinions, fiction, and emotions.

Facts Versus Opinions


Firstly, discerning facts from opinions is vital. Remember, we can prove facts, whereas we cannot prove opinions
(subjective points of view). Alleged facts bombard us daily, but these are often opinions in disguise.
Business documents, and even scientific reports, sometimes report opinions as facts. Authors may commit this
error while chasing positive results to persuade others to agree with them. Sometimes, they may misunderstand
how to verbalize the information. For example, it is incorrect to present opinions based on data as though the
opinions were the data.
The same facts presented differently can lead to opposite conclusions. The danger is that readers then treat the
opinions as facts.

Facts Versus Fiction


Secondly, we need to separate fact from fiction. This is more difficult than it sounds. Labeled fiction is not a
concern, although fantasists may claim that it is somehow real. The internet is full of people who believe in
unlikely things.
One such unlikely belief is that aliens have left Stargates all around the universe, including on Earth. These
Stargates allow beings to travel instantly from one planet to another. You might recall a long-running TV drama
on this theme.
There are many more examples of people taking science fiction as fact. However, subtler categories of fiction do
exist. Conspiracy theories are good examples of probable fictions presented as facts, whether due to error, lies,
wishful thinking, or confusion [1,2].

Facts Versus Emotions


It is also vital to distinguish facts from emotions. Just as we can feel so confident in opinions that we report them as factual, our emotions can seduce us into treating them, too, as though they were facts [5]. We also tend to believe that arguments supporting our pre-existing attitudes are stronger [4].
Telling facts and emotions apart may seem easy, especially if you regard yourself as a rational thinker, but it is not
always obvious.
Let’s consider a married couple having a heated argument. One yells at the other:
“I hate you! You’re always telling me what to do. I wish we’d never met!”
These ideas feel vivid and extremely real at the time, but once the speaker has cooled off and reconciled with their
partner, they will regret saying these words. They realize that the ideas were expressions of emotions rather than
the truth about their relationship.
Attraction forms the basis of many relationships, and attraction is another emotion that colors our perceptions.
Earlier in the relationship just mentioned, the partners might have said things like:
“You are the most beautiful person I have ever seen. You’re not like other people; you’re better.”
Again, these ideas feel concrete and factual at the time, but they are expressions of emotion, just like the angry words in the argument. Positive emotion may be more enjoyable, but it is no more factual than negative emotion.
So, how can we tell the difference between facts and emotions?
First, let’s draw a line between emotional quality and emotional state [6] .
Emotional quality is the emotion that somebody wants to convey in a picture, article, advert, or other messages.
Usually, charity adverts show people or animals in a pitiful state, perhaps crying children, to make the viewer feel
sad and guilty. In this case, sadness is the emotional quality of the advert.
Your emotional state is how you feel right now: calm, excited, wistful, or nervous.
You may feel the intended emotional quality in response to the charity ad or not. That is irrelevant. To critically
assess the message, you only need to appraise what emotion you are supposed to feel and be aware of it as you
process the message.
When assessing a claim or message, note whether the author is mistaking their own emotions for facts. People
often feel strongly about causes, and we know that emotions drive us to justify our decisions and actions.
We are all affected by our emotional states and past experiences. The tricky thing is that emotions feel urgent at
the time. Realizing that emotions affect your cognition is only the first step.
To compensate for this, we should try to step back from our own core affect (how we feel when we take in information) and view the information objectively. Pause and recognize the emotion for what it is, a
separate entity from the information you are processing. Similarly, a message’s emotional quality is separate from
its meaning.
We need to compensate for emotions like this because our emotional states affect our reasoning processes
significantly. Moreover, we use our emotions to rationalize our decisions and behavior.
Shopping is a good example. Consumers often buy things they do not want or need, which is not a problem itself,
as it supports a healthy economy. Excessive spending can become an issue, though: somebody who cannot resist spending $300 on a new pair of sneakers when the rent is due may justify the purchase simply because they really want them.
Research suggests that clinical staff, such as nurses and doctors, regularly make decisions biased by emotions [7].
Ideally, medical reasons, not emotional ones, should inform choices like whether to discontinue treatment.
Therefore, medics educate their trainees to be as objective as possible, even in the face of highly emotive
situations.
Emotional intelligence is hugely important for these clinicians. They must not let their emotional responses drive
their decisions, but at the same time, they must be empathic and kind to patients and their families. Nevertheless,
the study shows that decisions remain biased even under these conditions.
Whether you are a doctor or not, you probably make important decisions every day, so be mindful of your
emotional state when reading or receiving a message. If you agree or disagree very strongly, your emotional brain might bias you towards noticing only consistent information. We are more liable to agree with arguments that support what we already feel [4]. This emotional bias means you could miss important details and facts.
In contrast to our innate emotions, we can deliberately learn critical thinking and logic [1,2] . As intelligent
animals, we have the power to figure out new ways of thinking, develop these throughout civilization, and pass
the techniques down to future generations.
Compare this to learning a language. We are born with the capacity for it but need to acquire the pieces (letter
sounds, words, and so on) and put them together into something meaningful and useful.
To assess information rationally and avoid mingling facts, opinions, fictions, and emotions, we need to practice
and use our critical thinking skills. Then, we can make informed judgments and decisions that are more likely to
be effective.

The Message Behind The Message


Before we can make sense of the information we receive, we must fully understand the entire message. We cannot
evaluate the information successfully if we do not investigate who made it and why: the content is only part of the
story.

Source Of Message
Firstly, find out about the source. Sources are individuals or organizations, and the following advice applies to
both.
A source may be an expert on some topics and naive about others. Sources may be biased, have special interests
in certain topics, or pet theories. They may be more or less reliable, more or less trustworthy. Think about the
following aspects of the source:
Is it an academic or government publication? These are generally more trustworthy than individual commentators because their vested interest lies in providing accurate information to the population, whereas commentators' motivations are more variable.
Is the source paid (or rewarded in some other way) for conveying the message? Publishers can and do pay experts
to communicate specific information.
Where do they get their information? Is it a primary or secondary source? Secondary sources can misquote
primary sources. They might even treat other secondary sources as if they were primary sources. This magnifies
errors and misconceptions. Find the original information if you want to assess it fairly.
What does your own experience tell you? If the source is somebody you know, perhaps you know that they make outlandish claims quite often. This could factor into your assessment.
When analyzing messages, especially from people you know, remember that people’s reasoning skills vary. The
source may not be aware of all the aspects just described, and they may feel that they have made a very good case.
Perhaps with a good debate, you can help them to improve.
At times, we all forget our deep-seated assumptions and motivations. Do not forget that critical thinking takes
practice.

Purpose Of Message
Next, examine why the source composed the message. Knowing a message’s purpose may alert you to possible
distortions and half-truths. What was their real motivation?
Here, you need to view the message’s fine details. If it is on a website, what kind of website? For example,
somebody’s private blog has a different purpose from a government website. See whether they have declared any
interest in the products or topics they discuss; like influencers, blog writers are often given ‘freebies’ in exchange
for promoting the product.
A message might not be an obvious advert, but still be a promotional text. For example, companies often feature
blogs about their products and services; you should not take these texts at face value. Instead, think
about what interest the company might have in the topic: web traffic, affiliate links, direct purchases, or simply keeping you on their pages for longer.
People make persuasive messages for many reasons, and they can be subtle. Analyze the language to detect
whether the message might be covert persuasion rather than unbiased information. Persuasive texts may feature
many adjectives and adverbs, chatty language, and high-school rhetorical devices like alliteration and the ‘rule of
three.’
Word choices also reveal the author or speaker’s biases and opinions. Say you are reading reviews of a scientific
book about climate change. One reviewer refers to the ‘climate scare,’ whereas the other calls it the ‘climate
emergency.’ They have a different opinion, but in the context, both phrases mean the same thing .
Another aspect of purpose is that the source may prefer one conclusion or decision from the outset. They might
then filter out and distort the evidence to support the position they have already chosen. You can tackle this issue
by using alternative sources to research the topic and filling in those gaps yourself.

Field Of Work
As well as the source’s motivation and the message’s purpose, you must understand at least something about their
field of work. This is even more important if it is not your specialty. You need to get to grips with the basics.
Firstly, what are the fundamental goals? Imagine a hospital where radiographers and nurses work together to
produce and analyze magnetic resonance images. Radiographers aim to produce the best images possible, whereas
nurses aim to keep patients comfortable and well-informed about the procedure. Sometimes these goals might
clash since the scanning procedure is uncomfortable and noisy. Specialist staff at all workplaces need to work
together in this way to be effective.
Similarly, to assess the truth or falsehood of a message, you must understand the sphere the source of the message
works in. This contextual information enables you to judge the message on its own merits. After all, there is no point judging the quality of a radiographer's work in the same way you would judge nursing care.
Secondly, what basic concepts or assumptions does the source employ? Individuals may not even be aware of
their basic assumptions, but you, as a critical thinker, should be able to discern them.
In everyday life, a basic assumption might be that when you enter a table service restaurant, you wait in line, and
then somebody shows you to a table. You do not have to ask somebody what to do; you just know. Similarly,
physicists assume that light’s speed is a universal constant; they do not attempt to measure it in every experiment.
Finally, what kinds of data do they use to expand their knowledge and inform their decisions? Whether you agree
with the specific methods or not, try to assess them fairly rather than from a prejudiced position. Be flexible yet
rigorous, like a scientist. Research the message behind the message you receive, and put your critical thinking
skills to good use.

Reliability And Reputation Of Source


Critical thinking is an extremely powerful approach, which most people do not use most of the time. One way to dissect claims more effectively is to be aware of our psychological tendencies. A further strategy is gathering evidence from multiple sources and assessing those sources' reliability and reputation. When a piece of information seems factual, the next step is to analyze its reliability.
Look at the source of information: Is it primary or secondary? Primary sources are the originators of the
information, whereas secondary sources are based on primary sources.
If secondary, check whether they report accurately and completely: Sometimes, secondary sources can leave
things out or report information selectively to support their own argument, which may differ from the primary
source’s argument. Find the primary source and see how different it is. You may observe that the secondary source
reports their opinions about the facts.
Now, look for evidence of reputation. If the source claims they are a world authority on kidney tumors, find out
more about them.
Where do they work? University, hospital, or private company? You can then assess their employer’s reputation as
well.
What do other experts (or world authorities) say about them? Are they mainstream or fringe? Accepted authority
or controversial?
Have they won prestigious awards, grants, or contracts? These all indicate that somebody is recognized and
successful in their field.
What organizations do they belong to? This could include overarching professional bodies (e.g., a therapist
belonging to the American Psychological Association) or more niche bodies (e.g., the Society for Neuroscience).
What else have they written about? If somebody has written a huge amount on diverse topics, they may be
journalists or interested amateurs, rather than an expert on the particular topic.
By assessing the reliability of information sources, we can exclude those with weak foundations. An accepted
authority on a subject is likely to be a more reliable source than a relatively inexperienced person who is new to
the field.
Similarly, a person or organization with a sharp focus on the subject at hand is likely to be more reliable than a
generalist since their experience and knowledge run so much deeper. They are also more likely to base their
information on primary sources rather than secondary or tertiary sources.
Once you have determined the source’s reliability and reputation, you can put this together with the other factors
we have discussed. This framework enables you to interpret messages more actively, empowering you to make
better decisions and arrive at more accurate conclusions.
Action Steps
Now that we have explored the features of critical thinking and how to interpret messages better, it is time to put
some of these ideas into action. Try these suggested exercises.
1. The Fact Check
Identify a purported fact, either from your work or the media. This can be anything, as long as you can read it in
context and research it to analyze it.
Suggestions:

There is life on other planets in the universe.


Humans only use 10% of their brains.
People with a college degree earn more money than people without.

Use critical thinking to evaluate your chosen fact systematically. Use these questions as guidance:

Where does the fact come from?


Is it a reliable source?
How do you know?
What is the author's motivation for presenting this fact in this context?
What is the evidence for this fact?
Is the evidence presented objectively, or is there evidence of bias?
Is it a fact, or is it an opinion?
Has emotion influenced it?
Is the source using emotion to try to influence you (the reader)?

Feel free to ask any other relevant questions you can think of, based on what we have looked at in this chapter.
Now that you have done this once, you have a framework for assessing messages you receive using critical
thinking.

2. Observational Study
Firstly, visualize a person you think has good critical thinking skills. Write a few notes about them using the
questions below, or make a mind map.
What kind of person are they? What have they said or done that makes you think they are great at critical
thinking? What outcomes do they produce?
Examine the evidence you have written down, and conclude whether this person is a good critical thinker. Perhaps
bring this exercise to mind next time you speak to them or witness their critical thinking, and make a few more
observations.
Now repeat the same process for somebody you think has poor critical thinking skills, including what makes you
think they are bad at critical thinking. Put your notes or mind maps side by side and compare them.
This exercise will help you focus on the good (or bad) critical thinkers’ traits and behaviors. It also starts you
thinking about the real-world applications of critical thinking.

Summary
In the story at the beginning, the University relied on its staff to disclose conflicts of interest, and they trusted the
market data that the company reported. However, multiple factors, including misplaced trust in Keller, led them to
invest in a failing company.
A poor decision cost the University more than just money. Could this have been prevented if the Finance Committee had applied what we have learned in this chapter? Perhaps.
Emotions played a role in the investment: the desire for success, and trust in Keller. The Committee appraised the company's success incorrectly due to inadequate evidence (they relied on the company's own market data). Keller, the source of the investment recommendation, turned out to be unreliable because of his personal interest in the company.
In the story, the University did not have all the information needed to make the correct decision. No doubt, you
will have been in similar situations yourself. Hopefully, the techniques covered so far have equipped you with
more tools to deal with information you encounter in the future.
Apart from features of the information we receive, what else keeps us from getting to the truth? The answer is
complex, and we will delve into it in the next chapter.

Takeaways
1. Critical thinkers must distinguish between facts, opinions, claims, and evidence.
2. You should be realistic and even humble about your knowledge. However, pairing logic with your own
experience is a key part of thinking critically.
3. Remember to assess the author and their motivation, as well as the message.
4. Use multiple reliable sources, including other people, to help you reason towards better conclusions and
decisions.
2

WHAT KEEPS US FROM GETTING TO THE TRUTH?

"Mom! Dad! I need to speak to you!" the kid yelled. He had just got back from his first day at grade school, and he had serious beef with his parents.
“What is it?” asked the concerned parents.
“The other kids all laughed at me.”
A sad tale of juvenile bullying, you might think. Yes, but there was more to it. The kid had started school with
something fairly crucial missing from his social life.
His parents were overjoyed when he was born. As high achievers themselves, they wanted their children to do
well in life.
The kid’s father had heard about an interesting research study. He spoke with his spouse, and they both agreed it
could not harm their child.
The study in question concerned the famous Mozart Effect. First published in the early 1990s, the experiment indicated that students who listened to Mozart did better on certain cognitive tests than those who did not [8]. The students performed as though their IQs were 8-9 points higher than those of students who listened to a relaxation tape or silence. Furthermore, a prestigious scientific journal published the study.
This got parents, as well as scientists, very excited. Everybody wanted to grab those extra IQ points for their
child. There may even have been a boom in baby headphones’ sales and Best Of Mozart CDs (this was the 1990s,
remember).
The family in our story took this to an extreme, however. The kid had passed unnoticed through kindergarten, but by grade
school, his deficit was apparent. Shockingly, he had never listened to anything other than Mozart.
What is more, his test scores were average at best, and he was the victim of several bullying incidents within the first few
weeks of school.
That was when his Mom decided to investigate further.
Scientists found the Mozart Effect very hard to replicate, but they kept trying. More often than not, Mozart
listeners performed about as well as those who listened to different music or silence [9,10].
The kid's Mom also found out that the cognitive enhancement effect was small and probably only lasted a while
after the music finished — anything mildly stimulating made people do a bit better on the tests.
What she regretted, though, was naming her son Wolfgang.
With the Mozart Effect, one experimental study became so well-known that people did not even notice the subsequent studies. The later studies were less dramatic and therefore did not grab parents' attention.
Is Mozart special? In a musical sense, of course. But there is probably not a special Mozart module in the brain
that switches on superior learning processes.
The failure to replicate the Mozart Effect suggests that the original effect was due to general characteristics of the
music, like complexity or interestingness. Aspects of the experimental situation might also have led to these
seemingly impressive results [9,10].
Recent analysis suggests that scientists published more ‘Mozart-positive’ results due to publication bias. This is
similar to confirmation bias, which we will look at in detail in this chapter.
Our brains construct our perceptions [1] and memories, so we need to constantly evaluate and question our ideas.
Our brains construct an impression of a three-dimensional world based on a two-dimensional projection on the
retina. We perceive three dimensions even in two-dimensional drawings in a way that feels automatic [11].
Similarly, the first idea that comes to mind from memory could easily result from our cognitive processes rather
than being a true reflection or record of reality. Therefore, we must strive to become more aware of how our
minds can distort reality.
Thinking critically (or using sound reasoning) is not as simple as it seems. All of us harbor deeply ingrained
habits that influence our judgment of people, events, situations, and issues.

How Our Brains Short-Circuit Our Logic


Thinking logically, like any variety of thinking, uses our everyday cognitive processes and systems. In brief, these
include attention, memory, judgment, as well as decision-making and problem-solving. Beliefs, emotions,
fallacies, biases, and heuristics all affect our cognitive processes.
We have to be realistic about this, but not unduly pessimistic. Our judgment and decisions will tend to be pushed
in different directions by distorted perceptions and memories.
Fortunately, we can reduce these tendencies by knowing what the distortions are and how they work. We can
compensate for them, but firstly, we should define and explain each of the key terms.

Beliefs
Beliefs are an important part of human life. We all hold prior beliefs about things, people, and ideas, and one
generation passes them on to the next via social learning. Sometimes, we believe what we want to believe despite
evidence against it; we can refer to this as wishful thinking [12].
So, where do erroneous beliefs come from? Our brains do not intend to deceive us, but knowing the truth is not
always their main concern. Erroneous beliefs are a byproduct of the psychologically adaptive process of social
learning [2] . Social learning supports many useful tasks, such as learning our native language. As social creatures,
we need social cohesion and shared experiences, and we start paying attention to other humans (and potentially
learning from them) as infants [13]. So, it is only natural that we are so open to acquiring ideas directly from
others, especially those we trust.
Second-hand information has great potential to lead to false or distorted beliefs. Humans love to tell good stories,
and the storyteller may highlight certain aspects and ignore others, either to make the story more entertaining or to
emphasize certain parts of it [2].
In turn, prior beliefs can lead to biased perceptions of people, objects, and events, thereby affecting future
perceptions and experiences. People can then pass these biased beliefs onto others. This may remind you of the
children’s game Telephone or Chinese Whispers, in which one person whispers a verbal message to the next along
a long line. The original message disappears by the end of the game.
Another aspect of our beliefs is that we tend to believe what we want to believe [2] , and this includes our beliefs
about ourselves. We may adopt socially acceptable beliefs to avoid being rejected by others [1]. Like many of our
psychological tendencies, there is nothing wrong with this, but at times it could obstruct our critical thinking.

Emotions
Social emotions such as trust and the desire for acceptance can affect what we believe, but emotions have huge
effects on cognition. Psychologists have documented mood-congruent effects in memory and attention [14,15].
This means that people tend to notice and remember information that fits with their current mood; you may
observe this phenomenon casually in everyday life now that you are looking for it. For example, when somebody
feels joyful, they might notice beautiful scenery or enjoy a good meal more than when they are in a neutral mood.
Our emotions, therefore, influence not only what information goes in but also how our minds process it.
In controlled experiments, a scared or sad person is more likely to perceive others’ faces as threatening or
negative. Someone experiencing a happy, exuberant mood is more likely to label faces as friendly. The first
person might be more likely to recall unpleasant events from their own life, whereas the second would recall more
happy and joyful experiences [14,15].
This example illustrates that memory retrieval is an active process; your memory is not like a library issuing you
the same memory every time. Instead, the cognitive system reconstructs the memory each time [1] .

Fallacies
The term fallacy often refers to commonly held false beliefs, including some examples of folk wisdom. For example, many people believe that more babies are born during the full moon [2]. In fact (verifiable, reliable fact, that is!), no more babies are born during the full moon than during any other phase of the moon.
False belief fallacies can affect our reasoning processes if we assume that pieces of received wisdom are true
without examining them in more detail.
The term also covers logical fallacies: errors of reasoning, commonly known as non sequiturs, in which the conclusion does not follow from the premises. To reason properly, we must make sure that our conclusions follow logically from our arguments' premises. The study of logical fallacies has a lengthy history, and there are many of them [1].

Heuristics And Biases


Biases are another important feature of the cognitive system that affects how our brains absorb and process
information. Attentional biases send our attention towards or away from certain things. We also experience
unintentional biases in decision-making and judgment.
An interesting bias to note is egocentric bias, in which 'egocentric' means 'towards the self.' People consistently rate their abilities as above average [16]. Can you see what is wrong here? By definition, not everybody can be above average: at most half of any group can sit above the middle of the distribution. Scientists have observed this effect in all sorts of situations, from self-assessed leadership qualities to the likelihood of getting cancer, and in all sorts of people, from students to professors [16].
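To see the arithmetic, here is a minimal sketch in Python. The 90% survey figure is hypothetical, chosen only for illustration; it is not taken from the studies cited above.

```python
# Hypothetical survey: 90% of drivers rate themselves "above average".
share_claiming_above_average = 0.90

# By definition, at most half of any group can sit above the middle
# (the median) of the distribution.
max_share_actually_above = 0.50

# So at least this fraction of respondents must be overrating themselves.
overconfident = share_claiming_above_average - max_share_actually_above
print(f"At least {overconfident:.0%} must be overrating themselves")  # 40%
```

However many people claim the top half, the arithmetic caps the honest answers at 50%; the remainder is pure egocentric bias.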
Biases are important to understand here because they can lead directly to fallacies, and they may also support
erroneous beliefs.
Heuristics are mental short-cuts [1,17]. One example is the availability heuristic, in which we use the most easily available information to solve a problem. We could also call heuristics rules of thumb: quick methods for solving problems without sitting down and doing a lot of math or logic. Heuristics are useful, but they produce approximate answers and can lead us to get things wrong [17].
In some situations, heuristics can lead to systematic biases. This is a serious issue for critical thinking because
people then ignore other relevant information.
So why do we have heuristics? During evolution, our ancestors needed fast, efficient solutions to problems, and those solutions rarely had to be precisely correct. They succeeded often enough to survive; otherwise, they would not be our ancestors. As a result, our brains happily hold on to these short-cuts, as long as what we know works well enough in practice, even when our daily experience sometimes contradicts the heuristic.

Examples: Why They Are A Problem


Emotions can cause problems when they interfere with logical reasoning. Think about aphorisms that we use in
speech every day. For example, nobody wants to ‘let their heart rule their head,’ but perhaps they want to follow
their ‘gut instinct.’ Which is right?
The answer is not straightforward. Our emotional state can make a huge difference in how we perceive and
interpret incoming information. Moods are transient, so we regard decisions and conclusions made under highly
emotional conditions as unreliable. However, emotions are far more vivid to us than cold reasoning processes [18]. Accounting for their influence is, therefore, a difficult task.
Fallacies of logic can occur without awareness, or they can be used deliberately as a manipulative tool. Both can
get in the way of us knowing the truth. Fallacies are a particular problem because we do not reason things out in
isolation. Often, the outcome of one reasoning process feeds into the next.
For example, a CEO might use critical thinking to work out a plan to expand into a new business area, beginning
by figuring out which products would work best, then moving on to selecting a team to lead the new venture, then
on to planning the expansion in more detail. If they fall into logical fallacies in the project’s initial stages, the
decisions may not be optimal, putting later stages at risk.
Cognitive biases and heuristics can make us believe the impossible. For example, given the probabilities of two events, logic enables us to work out the probability of both happening together. If event A is 50% likely to happen, and event B is only 10% likely to happen, you would logically expect people to realize that the two together cannot be more likely than event B alone.
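To make the arithmetic concrete, here is a minimal sketch in Python, using the probabilities from the example and assuming, purely for illustration, that the two events are independent:

```python
# Probabilities from the example above.
p_a = 0.50  # event A is 50% likely
p_b = 0.10  # event B is 10% likely

# If A and B are independent, the chance of both occurring is the product.
p_both = p_a * p_b
print(f"P(A and B) = {p_both:.0%}")  # prints: P(A and B) = 5%

# Whatever the relationship between A and B, their conjunction can never
# be more probable than the less likely event on its own.
assert p_both <= min(p_a, p_b)
```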
However, this is not what happens. Even medical professionals judging the likelihood of symptoms make this
mistake [19]. This example shows how pervasive our cognitive biases are and that they sometimes happen without
awareness [11] . Biases like this can easily affect people’s thought processes without them realizing it, leading to
unfortunate consequences like bad investments and misdiagnoses.

Top Ten Brain Twisters


In our business of separating sense from nonsense, certain fallacies are particularly relevant. We need to watch out for these and, where possible, be careful not to commit them ourselves in our everyday debates and discussions. Some of these fallacies are quite subtle; others are obvious once you know what to look for.

1. Ad Hominem Fallacy
Ad hominem means "against the person": attacking the person rather than their point or conclusion [1,20]. You might witness this fallacy in a political debate.
For example, one politician argues passionately against a new shopping mall in the town, but their opponent
points out that they live in that town and the new mall would bring a lot of extra noise and traffic to the area. The
opponent argues that the first politician is therefore concerned for themselves, not necessarily for the residents.
Here, the first politician described a concept, but the other proceeded to attack the first as a person, ignoring the
debate’s topic. Attacking the opponent is not an effective way to argue against their idea, so we describe ad
hominem as a fallacy. Like the other factors described here, this fallacy can lead to divergence from important
topics. People sometimes use it deliberately to divert attention and discussion away from certain topics.
There are two types of ad hominem [21] . The circumstantial variety is when a source is speaking hypocritically,
and somebody else points it out. This type of ad hominem may constitute a legitimate argument, but it is still a
logical fallacy. The second variety is abusive ad hominem, where somebody uses another’s personal traits to
attack their idea, where the traits are unrelated to the idea.
In practice, ad hominem rebuttals are not always irrelevant. Let us think about a political debate. One politician
attacks the other’s personality or life choices. But what if these are relevant to the argument?
This example illustrates circumstantial ad hominem: the opponent points out the first politician’s hypocrisy.
Suppose the first politician had no obvious self-interest in canceling the new mall. In that case, the opponent could
still attack them to convince the populace that they were not trustworthy and discredit their opinion. This is
abusive ad hominem, a fallacy we should certainly try to avoid.

2. Hasty Generalization
Hasty generalization is another important fallacy that we need to understand. It means jumping to a conclusion
based on too little evidence. A more technical definition is generalizing from a sample or single instance to a
whole population. However, the sample may be too small or not representative of the general case.
Imagine a friend saying:
"My Grandpa lived to be ninety-six years old, and he drank a bottle of whisky every day of his life!"
Unfortunately, Grandpa does not prove that alcohol is a recipe for a long and healthy life. This anecdote, a single
example, does not outweigh decades of medical evidence.
Generations of thinkers have described this fallacy. Aristotle discussed it first, followed by many more scientists
and philosophers. Alternative names for hasty generalization include faulty generalization, the fallacy of accident,
the fallacy of neglecting qualifications, and many others [22] .
Hasty generalization is easy to commit. People under pressure in busy jobs, seen as authorities on the topic at hand, might mistakenly draw conclusions too early. Hasty generalization can also lead to wrongly assuming that every instance is the same, based on one or two examples, and to ignoring situations where the conclusion is false. In the example of Grandpa and his whisky, the speaker focuses on the single example at the general case's expense.
You can see how hasty generalization could become a serious problem and prevent us from getting to the truth.

3. Bandwagon Fallacy
The bandwagon fallacy means falling into the trap of thinking that the majority is always right. People commit
this fallacy when they agree with the majority without seeking further information [23].
A classic psychological study revealed that many people would agree with the majority opinion even when they
can see that the majority is wrong [24]. The experiment's task was shockingly simple: participants had to pick the longest of a few lines of clearly different lengths. The experimenters placed individual participants in groups of fake participants, all of whom chose a line other than the longest.
Asch’s study showed that many people agreed with the majority but then expressed concern and confusion
because the majority gave the wrong answer. The experiment put people into an unnatural situation, but we can
also see the bandwagon effect in real-life scenarios.
In real life, the majority opinion is often fine, and we can choose to follow it without dire consequences [25]. For
example, most people would agree that dogs make good pets and rhinoceroses do not. Choosing a pet is a relatively
benign decision, though.
In contrast, turbulent environments lead to more copying; the correct path is harder to discern in more ambiguous
situations [27]. Think about how this relates to a high-pressure business environment, where the situation may be highly complex to begin with and may change rapidly. In these situations, organizations follow each other's business decisions more than in a calm and stable business environment [26].
People and organizations jump on bandwagons for many reasons. They may genuinely believe it is the best option,
or they may see others they admire jumping on the same bandwagon, which gives that choice more credence [26].
However, the bandwagon effect is a failure to apply logic and our own experience. Information about the majority's opinions and choices is easy to obtain and quick to process, but the majority is not always right. Even the majority opinion of a group of experts is not always correct.

4. Straw Man Fallacy


People use the straw man fallacy to influence others. It involves changing somebody's point or argument to set up
an easy target, then knocking it down using your own arguments [1] . It is the logical equivalent of slapping the
person standing next to your opponent.
Straw man arguments are extremely common. Here is an example. Two politicians hold a public debate a couple
of weeks before a local election. Sam McAdams makes an announcement:
"We will not invest in waste disposal in the city. Instead, we will reorganize our facilities and raise efficiency by
200% in my first two months in office."
The crowd cheers, but the opponent has something to say.
"I cannot believe that Mr. McAdams proposes scaling back the workforce at Waste Disposal! "
The debate proceeds; McAdams tries to point out that he never suggested getting rid of staff.
People may deliberately set up a straw man just to knock it down, and doing so can make a big difference. In this example, the opponent set up the straw man to force McAdams onto a different topic. It certainly steered the debate and probably had a significant effect on the spectators.

5. Confirmation Bias
Confirmation bias is a bias towards information that confirms what we think we already know. Take this example:
Jayshree firmly believes that all Hollywood actors over 30 years old have had cosmetic surgery. Every time she
sees somebody whose face looks smoother than last year, she points it out to her friends.
What do you think Jayshree says when she watches a movie and the actors look no different? Nothing, of course.
It is unremarkable that the actors have aged normally. Jayshree notices evidence that supports her belief, but she is
oblivious to the evidence against it.
Confirmation bias is extremely common, affecting what information we notice and what information we seek out [28]. People have a strong tendency to seek out information that confirms their beliefs due to a strong desire to maintain those beliefs [12]. Returning to our example, Jayshree might search the internet for ‘celebrity plastic surgery’ information, but she would not be looking for information on who has not had plastic surgery.
When faced with a message, beware of confirmation bias. It is similar to wishful thinking: sometimes we believe
what we want to believe, and evidence supporting what we believe grabs our attention.

6. Anchoring
Anchoring occurs when we over-rely on the most prominent feature of a situation, person, or object. This may be the first piece of information we encountered or the information that we feel is most important. Anchoring strongly affects our judgment and estimation [11]. Anchors are mainly numerical. For example, someone taking out car finance might choose to focus on the interest rate, displayed in large figures on the website, rather than processing additional information.
Anchoring biases not only our judgments but also our estimates. If you go to a car showroom, you may have room to negotiate. Nonetheless, your mind anchors your initial offer around the price quoted on the window. This is known as anchoring and adjustment: the first number we see biases our subsequent thinking [1,11].
Psychology experiments show that different anchor points can lead to vastly different decisions. Furthermore, the
anchor does not even need to be related to the question to influence a person’s answer [17, 29] . This shows that
anchoring is pervasive and, to some extent, automatic.
Anchoring is sometimes classed as a heuristic because it enables our minds to take a shortcut and stop processing more information. However, it is sometimes automatic and, at other times, more conscious [11]. Automatic anchoring is more like a suggestion: the anchor primes somebody’s estimate or choice by activating similar numbers or ideas in the mind, and the person experiencing this may not be aware of it.
On the other hand, deliberate anchoring is when you consciously adjust your initial estimate to get closer to the
real answer. This process is more controlled, but people typically stop adjusting too early, meaning the anchor still
biases their final response. We are more likely to stop adjusting too early if we are under time pressure or are
multi-tasking [11,17] .

7. False Consensus
This bias comes from social psychology, the study of personality and social interaction. False consensus focuses
on how we see ourselves relative to other people. Like the arsonist who might have once said, 'Well, everyone
loves to set fires, don't they?', we overestimate how common our actions or traits are in the general population.
This bias emerges when people hear about other people's responses [30,31] . Whether we read others’ answers to a
set of questions or hear about decisions made in a scenario, we see other people's responses as more common and
typical when they match our own. Conversely, we see others' responses as strange and uncommon when they
diverge from our own.
False consensus effects are larger when the question is more ambiguous. One study asked people specific
questions like ‘are you the eldest child?’ and more general questions like ‘are you competitive?’ The study
reported a much more pronounced false consensus effect with more generic questions [32]. This provides more evidence for the effect and suggests that when people have more room to interpret the question in their own way, they perceive others as more similar.

8. Halo Effect
The halo effect is not about angels; think about the type of halo you see around a streetlamp in the mist. This bias
occurs when something is seen positively because of an association with something positive, like the light from
the streetlamp spreading out as it refracts through the mist particles. You could call this the ‘glory by association’
bias.
We all know that first impressions matter in our relationships. This bias is part of that. Our initial impressions of people and things can create a halo, overly influencing what we think of them.
When people have to rate others on positive criteria like competence or intelligence, their ratings are influenced by how warm and friendly the person being rated seems [33]. The halo effect even occurs for traits we know are unrelated, such as height and intelligence.
As you can imagine, the same applies to objects and ideas. Companies like to use beautiful people and scenery in
their adverts and promotions because this gives potential customers a positive impression of the company and the
product.

9. Availability Heuristic
The availability heuristic affects us when we have to judge probability or frequency [12] . We assume things we
can imagine or recall easily are more common or more likely. Another way to conceptualize this is to assume that
the first things we think of are the most important [1,11] .
You can see how the availability heuristic can be useful. When deciding where to take a vacation, your first
thought is more likely to be somewhere you want to visit rather than an obscure destination you have barely heard
of. The desired destination is more available in your memory, as well as more vivid.
This heuristic draws on several characteristics of human memory [17,34,35] . Firstly, the recency effect: we have
better memories for recent events or things we have seen or heard recently. Secondly, we remember things that
make us feel emotional. Finally, we recollect personally relevant and vivid information far better than dry, boring
stuff. Any of these or all of them together can create high availability.
The opposite is also true. If you cannot think of many instances of something, you will think it is less common or
less probable. When researchers asked participants for a very large number of advantages of something, such as
their college course, they found it hard to think of enough. These students rated their course as worse than others
who had to think of fewer advantages [11] .
This example seems paradoxical at first, but not when you think of it in terms of availability. The course’s positive
aspects felt less common to those who were asked for more because they could not think of the full set of
advantages requested. This illustrates how the availability heuristic could be a problem, depending on questioning
techniques.
If we can call examples to mind easily, we think events are more likely to have happened before or to happen
again in the future. For instance, people worry that terrorist attacks are possible or even probable. A young
graduate’s family warns them against moving to New York, Paris, or London because of 'all the terrorists.' These
attacks are readily available to people's minds, so they feel that attacks are more likely than they are.
Availability is a useful heuristic because it allows us to make rapid judgments and decisions. People are more
influenced by availability when they process information quickly and automatically, for example, when feeling
happy or distracted [11] .

10. Representativeness Heuristic


The representativeness heuristic happens when we judge things according to how similar they are to things we
already know about [2,17] .
Representativeness operates when we have to judge probabilities. Specifically:
Categorizing items : the probability that this object, person, or situation belongs to a given category.
Origins : the probability that this event comes from a given process.
Projections : the probability that a given process will generate an event in the future.
Here is an example of the representativeness heuristic in action. Someone says:
"Ryan wears glasses, so I think he is a computer scientist rather than a farmer."
The speaker has a stereotype of a typical computer scientist in their mind, and Ryan fits that stereotype on one
criterion. Hence, they categorize him as a computer scientist (see [17] for the original example) .
When they use the representativeness heuristic, people are often extremely confident, although a vague
impression, rather than a range of evidence, determined their choice. They have not considered the percentage of
farmers and computer scientists in the general population, for example. Therefore, this heuristic is likely to lead to
incorrect conclusions at times, and it probably fuels some of our human failings, such as prejudice and
discrimination.

How To Un-Bias Your Brain


Beliefs and Emotions
The first step in dealing with erroneous beliefs is to use logic: look at what else must be true for the stated belief
to be true [1] .
Next, realize that some beliefs are simply false, and you can prove this by finding evidence against them. You can
easily disprove a friend who firmly believes that Martina Navratilova has won the most Grand Slams in women’s
tennis history. This type of belief is a mistaken fact, so you can assess it the same way you would assess a
purported fact or an unsubstantiated claim. You can perform a quick web search and find the correct answer from a reliable source, such as official tennis records.
In contrast, other beliefs are not falsifiable. These may be acceptable on their own but at the same time
incompatible with other beliefs. In this situation, you need to assess each belief’s veracity and arrive at a new
understanding.
For example, people who believe the Earth is flat must also hold other, consistent beliefs. They must believe that
modern-day transport companies lie about the distances involved in traveling close to the North and South Poles.
If the world were flat, one of these poles would be at the center, and the other would be a loop around the edge of
the world. Logically, you would only need to disprove one of the beliefs to falsify the whole set.
Emotions are potentially more difficult to deal with. Instead of suppressing or ignoring your own emotions,
acknowledge that they exist and can affect your judgment. This simple change can remove some of their power
and help you avoid falling into the trap of rationalizing [1] .
Discerning emotions from facts is the major way you can avoid getting side-tracked by your core affect and by emotional content in messages and communications. Refer back to Chapter 1 for more details.

Fallacies, Biases, And Heuristics


To combat these three, we first need to acknowledge that we are human and our minds work in this way. There is
nothing intrinsically wrong with it. We can follow this with some principles of critical thinking :
Examine the facts of the matter : Make sure you consider everything that could be relevant and ensure that the
information is factual rather than beliefs or opinions.
Take a mindful approach : Realize that situations are constantly in flux. Get the best information you can at the
time.
Question everything : Only make assumptions when the facts are not yet available.
Compensate for biases : Sometimes this is straightforward, other times less so.
Draw your own conclusions : Ideally, these should be based on a full understanding of the available facts and in
the light of your own extensive experience.
To deal with the ad hominem fallacy, understand that circumstantial ad hominem is sometimes a valid way to critique a person if their circumstances or traits are relevant to what they are arguing. Do not be tempted to commit it yourself unless you are deploying it deliberately as a tactic to throw off your opponent. Abusive ad hominem, on the other hand,
converts a discussion into an exchange of personal attacks. When you encounter ad hominem arguments, point out
that attacking the person does not harm the idea and steer the discussion back to the facts.
If you want to make better decisions, reject hasty generalizations. Watch out for others jumping to conclusions
that are not justified. Make sure they specify any qualifications. For example, a source may tell you that mobile
advertising always generates revenue. Can this be true? There are probably some hidden qualifications here;
perhaps they mean that mobile advertising usually generates revenue when targeted at the right customers. Note
the change from ‘always’ to ‘usually’: absolute words like ‘always’ can cue you to other people’s over-
generalizations.
Avoiding the bandwagon fallacy appears easy, but it does involve more work than simply accepting what you see
or hear, like all of our critical thinking principles. Be alert and process the information actively, not passively.
Remain open to alternative information and solutions, embrace alternative perspectives, and realize that the majority can easily be wrong. Keep scanning for new information, see information and evidence in context, and do not be tempted to over-simplify.
Remember that the bandwagon fallacy, and many of those discussed here, result from mental shortcuts. A mindful
approach helps to compensate for this and can be extremely helpful [1,26] .
The straw man fallacy is fairly easy to recognize. Be alert to the speaker’s arguments or claims. Watch out for
somebody restating an argument in their own words. Have they added or omitted anything? Have they changed
the argument?
Combat the influence of the availability heuristic and confirmation bias by going beyond your first thoughts and
encouraging others to do so as well. Acknowledge that your initial idea could be correct, but search for evidence
that disproves it. Keep multiple possibilities in mind. This gives you a firmer foundation going forward.
When dealing with numerical data, watch out for anchoring effects. Ensure you have enough time to perform
calculations in full to avoid mistakenly anchoring your final answer to an interim solution.
Be aware that completely unrelated values can easily anchor numerical estimates. Moreover, people and other
sources may intentionally attempt to anchor your responses, numerical or otherwise. A mindful approach helps in
these situations, and you could even try to re-anchor your estimates. For example, in a financial negotiation where
the other party has suggested an initial figure, you could contemplate a very different figure (even if it is only in
your mind) to compensate for the possible anchoring effect.
Anchoring may explain over-optimism in complex projects as we may overestimate our chances of success based
on success in the early stages. Accept that your best estimate may still be wrong, and follow a process of critical
thinking by assessing new information and evidence in full and integrating it into your ongoing projects.
False consensus affects how we perceive other people and their opinions and how they perceive us. Try to avoid
making assumptions about other people, and if others make incorrect assumptions about you, point it out politely.
In the professional world, false consensus at an organizational level could lead to an array of problems.
Suppose managers at a company assume that their competitor companies all work in the same way as their own.
In that case, they could miss innovative opportunities by not absorbing different ways of working. Conversely,
they may not realize when they have a competitive advantage they could exploit. It is worth taking the time to
investigate the facts of the matter at hand.
Overcoming the halo effect means being objective in our judgments. Treat separate elements as separate elements:
for example, a beautifully painted scene on the side of an old, rusty car should not (of course) make you think it is
a good car. Be aware that a good experience of something has a lot of influence, sometimes more than it should.
Remember not to let the halo effect blind you to things (or people) becoming worse over time. Assess things
individually, on their own merits.
The availability heuristic could be particularly problematic for critical thinking. When we try to think rationally,
we must go beyond the obvious and examine situations in detail. The availability heuristic pushes us in the
opposite direction, but we can push back. For example, if you have to make a difficult decision or judgment call,
write a long list of pros and cons. This will help you focus on the whole situation rather than allowing the most
available information to hijack your decision-making.
The representativeness heuristic can be tricky to address because people are so confident in their judgments. To
prevent yourself from falling into it, find out the base rate of whatever you are judging. In the example given
earlier, the speaker judged someone to be a computer scientist rather than a farmer because he wore glasses. A
more reliable way to decide would be to look at what percentage of people work as farmers and computer
scientists and then choose the most common.
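To make this concrete, here is a minimal sketch in Python. All of the counts and percentages are invented for illustration; the point is simply that base rates can outweigh a single stereotypical trait.

farmers = 3_000_000            # assumed number of farmers (illustrative)
computer_scientists = 500_000  # assumed number of computer scientists (illustrative)

p_glasses_farmer = 0.3  # assumed share of farmers who wear glasses
p_glasses_cs = 0.6      # assumed share of computer scientists who wear glasses

# Expected number of glasses-wearers in each group
glasses_farmers = farmers * p_glasses_farmer     # 900,000
glasses_cs = computer_scientists * p_glasses_cs  # 300,000

# Even though glasses are more typical of computer scientists,
# a randomly chosen glasses-wearer is still more likely to be a farmer.
p_farmer = glasses_farmers / (glasses_farmers + glasses_cs)
print(f"P(farmer | wears glasses) = {p_farmer:.2f}")  # 0.75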
In general, finding out the base rate or baseline figures is an extremely useful way of dodging the representativeness heuristic when judging which category something belongs to. When working with processes, you can gather
more data or repeat a process several times and observe the outcomes: a larger sample size is more likely to give
you a truly representative result.

Why You Should Never Be Afraid To Change Your Mind


The biases and heuristics discussed here make up a tiny fraction of those that philosophers, psychologists, and
economists reckon our brains work with. Remembering that these are a natural feature of how our minds work
and can be somewhat automatic, we need to challenge beliefs and ways of thinking without feeling threatened or
making others feel threatened.
In logic, true premises and valid logic lead to a true conclusion. If the ‘facts’ are wrong, the argument is unsound. The conclusion may still be true by coincidence, but the argument given does not justify that conclusion. All the
factors listed in this chapter, and many more, can lead us to misapprehend the facts.
An argument’s premises may even be assumptions rather than facts, but this results in a weaker argument. An
assumption is something presented as factual but without evidence. Sometimes assumptions are the best we can
get in a given situation. For example, there are times when we simply cannot find any evidence one way or the
other. Scientists and other innovators run into this issue fairly often. Even so, you should have less confidence in a
conclusion derived from assumptions and gather evidence to support or reject the assumptions where possible.
Psychologists first defined the term ‘cognitive dissonance’ in the 1950s [36]. It has been a popular idea ever since
and has gained general acceptance [37] . When we come across new information that does not fit our current idea
or conclusion, our brains work to make it fit or reject the new information.
When contradicting beliefs lead to cognitive dissonance, our minds may try to hold on to both beliefs by compartmentalizing them; because our minds keep the beliefs apart, we do not feel the conflict and can ignore it [1]. Instead, try inspecting the beliefs to see whether one or both need to be updated based on additional evidence.
Note that cognitive dissonance is a rationalizing process. Our brains find it hard to hold contradictory information,
so facts may get distorted or ignored in the effort to return to a state of equilibrium.
When working things out for yourself, reduce dissonance by focusing on the process of logic rather than the
conclusion [12] . Make sure the logic is sound. As long as all the premises are true and the conclusion follows from
the premises, you can accept the conclusion that emerges from your reasoning process.
A further psychological trait that makes people susceptible to biases is that we have fairly poor intuitive math
skills. Even those with good academic and professional math skills do not always apply them in everyday
reasoning [12, 17] .
For instance, imagine a lottery that draws five numbers from a pool of fifty. One week, the winning numbers
come out as 1, 2, 3, 4, and 5. What is your first reaction to that result? Highly improbable, some people would say.
Perhaps so improbable that it suggests cheating.
These lottery numbers are just as likely as any other combination of five numbers drawn randomly from the set. They simply do not look like typical random numbers; people expect to see values scattered across the range. It would be far more surprising if the same set of numbers - any numbers - came up three weeks in a row.
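A quick calculation shows why. Here is a minimal Python sketch using the pool and draw sizes from the example above:

from math import comb

total_draws = comb(50, 5)  # ways to choose 5 numbers from 50, order ignored
print(f"{total_draws:,} possible draws")          # 2,118,760
print(f"P(1, 2, 3, 4, 5) = 1 / {total_draws:,}")  # identical for any fixed set

Every specific combination, however ‘orderly’ it looks, has the same one-in-2,118,760 chance.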
In conclusion, you should never be afraid of new information and of changing your mind. Be open-minded, allow
growth in yourself, account for all the factors discussed here, and share your critical thinking skills with friends
and colleagues.

Other Cognitive Factors That Affect Our Thinking


In addition to the factors already discussed, other features of our cognitive system and the information itself can
affect our thinking.

Pattern Recognition
Our brains are incredibly good at recognizing patterns. People often perceive faces in objects and scenes that merely resemble faces, like the Man in the Moon; this is known as visual pareidolia. A large area of our visual brain is dedicated to face processing,
so it is not surprising that we perceive them even when they are not there [12] .
Pareidolia is automatic: people do not try to see these patterns; they just do [2] . You have almost certainly had this
experience. Countless internet memes show objects like houses and cars that look like faces. Sometimes it can
take a few moments for the pattern to resolve itself into an image; other times, it strikes you straight away, and it is difficult or impossible to go back to seeing the image as a less meaningful pattern.
Pareidolia can occur in other senses: hearing Satanic messages in music played backward or ghostly voices in radio static.
Automatic pattern perception has much in common with optical illusions, like flat images that appear three-dimensional. These are not just fun and games. Both pattern recognition and false perceptions can lead to false beliefs, and people can and do seek information to support them.
In summary, our brains are incredibly good at recognizing patterns yet poor at statistics [12] . We regularly
perceive meaning in random stimuli.

Missing And Hidden Data


Information you are not aware of sometimes affects whatever you are trying to reason about. The danger is that
missing data could be crucial; if you had it, your conclusion or decision might be completely different.
In medical trials, missing data is common. For example, in a study of patients who have had a stroke, clinicians
might not be able to get data for all their research questions from all the patients. Some would be unable to
complete certain tasks, whereas others would. One way to account for this is to use statistics to fill in the gaps, such as replacing missing data points with the average of all the other patients’ values [38].
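As a minimal sketch of this kind of gap-filling (one simple method, mean imputation, with invented patient scores):

import statistics

# Hypothetical task scores for seven patients; None marks missing data
scores = [42, 38, None, 51, None, 47, 40]

observed = [s for s in scores if s is not None]
mean_score = statistics.mean(observed)  # average of the observed values: 43.6

# Replace each gap with the mean of everyone else's scores
imputed = [s if s is not None else mean_score for s in scores]
print(imputed)  # [42, 38, 43.6, 51, 43.6, 47, 40]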
Researchers plan their clinical trials in great detail, usually building in methods to compensate for missing data.
You could consider doing this for your projects where applicable: plan how to compensate for unobtainable data.
Additionally, hidden information might result from confirmation bias when people ignore or fail to report
occurrences that disprove an idea [2] . Discovering these occurrences is vital if we want to undo the confirmation
bias.

Regression To The Mean And The Hot Hand


Although this is not a math test, you should be aware of regression to the mean. This is neither a fallacy nor a bias
but a characteristic of data. Regression to the mean occurs when somebody repeatedly takes a test or performs a
task. A very high or low score, an outlier, may occur, but then the data goes back towards the previous average
[1,12,17] .

This phenomenon explains why a great year for a sports team is more likely to be followed by a worse year than
by another great year. Performance improvements can and do occur, but we cannot judge a single great year as
though it reflected an average improvement. Excellent performance is a combination of baseline ability and
random good luck [12] .
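You can watch this happen in a toy simulation. The skill level and luck spread below are arbitrary assumptions; the point is that an outstanding season mixes baseline skill with good luck, and the luck does not carry over:

import random

random.seed(42)

SKILL = 70        # assumed fixed baseline ability
LUCK_SPREAD = 10  # assumed size of the random, zero-mean luck component

def season_score():
    return SKILL + random.gauss(0, LUCK_SPREAD)

seasons = [season_score() for _ in range(50)]

# The best of 50 seasons is almost certainly luck-inflated; the following
# season is a fresh draw, so we expect it to fall back towards 70.
print(f"Best season: {max(seasons):.1f}")
print(f"Following season: {season_score():.1f}")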
Regression to the mean can have interesting effects in the real world. One scientist worked with military flight instructors, one of whom reported that when he praised a cadet’s performance, they usually did worse the next time [16]. The instructor thought that praise made people worse at flying airplanes. However, the particularly good flight was an outlier, and the cadet simply regressed to the mean on their next performance.
Similarly, extremely poor performance is more likely to be followed by an average performance than by another
dismal one.
Generally, we have low awareness of the regression to the mean effect because we fail to account for chance as
much as perhaps we should. Regression to the mean also feeds into some of the biases and heuristics already
discussed [17] . The next fallacy illustrates a similar point.
The hot hand fallacy is the belief that, following a good performance, subsequent attempts will also be successful. Commentary on team sports like basketball sometimes cites this fallacy [2].
It is related to regression to the mean: regression to the mean is the real situation, whereas the hot hand fallacy is
what people think will happen. It is also related to confirmation bias: people notice when the hot hand effect
happens but do not notice when it does not [2] .
The hot hand fallacy can also apply to casino games, which players sometimes perceive as non-random. Casino players also exhibit the opposite fallacy: the gambler’s fallacy, the belief that because they have lost many times, a win is due [39].
These gamblers’ fallacies suggest that even when random chance is the main factor affecting the outcome, people persist in perceiving patterns. Imagine what our brains might be doing when it is less obvious that chance determines the outcome!
Action Steps
Our brains do a great deal of information processing that we are not always aware of. We are quite fortunate to have all these shortcuts making processing more efficient. Try these suggested exercises to explore these ideas further before we move on.
1. Fantastic Fallacies, And Where To Find Them
Find a list of fallacies, biases, and heuristics in an online encyclopedia or psychology website. Notice how many there are! Read some of them and make a note of your thoughts. You could look at things like:

Which ones might people be unaware of when they encounter them?


Are they rhetorical devices used deliberately to persuade? Or are they quite automatic?
Are they related to those we have talked about?
Can you think of examples from the media or from your own life that fit the definitions?
How might you combat these in your own or others’ thinking?

2. Un-attacking The Person


Find an argument, such as a transcript of a debate, in which someone uses the ad hominem or straw man fallacy
deliberately to divert the debate. Rewrite it (or part of it), staying focused on the actual topic. Compare your
version to the original, perhaps show it to a colleague or friend, and think about which version reaches a more
logical outcome.

Summary
The story at the beginning of this chapter illustrates that sometimes we get it wrong, even when we exercise good critical thinking skills. Our cognitive processes may be sophisticated, but they are also economical. In the story, the parents believed they were benefiting their son by playing Mozart because they believed the high-profile research paper suggesting that Mozart made people more intelligent.
The parents only read the initial research study on the Mozart Effect. They did not follow it up: hasty
generalization. They did not realize that other scientists had found it so hard to replicate the Mozart Effect. They
fell into confirmation bias by only noticing media reports praising (and confirming) the Mozart Effect.
The halo effect may have operated too because Mozart is generally accepted as one of the best classical
composers. If it had been an obscure composer, would the paper have gained such a high profile? The population
found it easy to fall in love with the idea that Mozart's music was special in yet another way.
Nor were the parents skeptical; if they had been, they would have researched the effect for themselves rather than
taking it at face value. Scientists aim to be skeptical at all stages of their workflow, from ideas to analyzing the
data from completed research. The next chapter elucidates scientific skepticism in greater detail.

Takeaways
1. Our minds abound with fallacies, beliefs, emotions, biases, and heuristics, all of which impact our perceptions
and how we process information.
2. These can have massive effects, so we need to counteract them if we want to reach solid conclusions and make good decisions.
3. It may not be possible to overcome these built-in biases completely, but critical thinking can help.
3

WHY HAVING A SCIENTIFICALLY SKEPTICAL MIND HELPS YOU DISCOVER THE TRUTH

Fifteen-year-old Alanna Thomas burst into tears and buried her face in her hands.
“I’m so sorry,” she gasped. She looked up at the police officer standing over her. “I did it, I did… I pushed him
off. I’m sorry...”
On the other side of town, local journalist Lin Rodriguez also buried her head in her hands. She needed to get this
article finished, but the story was so complex. It was hard to know what was real.
Two weeks prior, Mr. Gomez, a science teacher at Mildenhall High School, was found floating face-up in a flooded disused quarry. Lin remembered his classes. He was strict but somehow still inspiring. She would never have
studied forensic sciences at College if it were not for Mr. Gomez.
Not everyone had liked him at high school, but Lin could not imagine why this local academic had fallen to such a
violent death. Events like this did not happen in their small town; the community was in shock. Naturally, the
rumors began as soon as the news broke. Murder? Suicide? Misadventure? Nobody knew, but everybody was
talking about it.
Lin’s boss sent her to the scene as soon as he heard, and she interviewed the forensics team as they painstakingly
collected evidence. They had covered the body, but the lead investigator told Lin that Mr. Gomez had some
suspicious bruises. They found two different sets of footprints around the top of the cliff too.
The next day, further evidence came to light. A local man told police he was walking his dog in the area the
previous night and had heard somebody making their way through the undergrowth not far from the cliff. The area
was overgrown with brambles, and he could hear they were having some difficulty. He reckoned this was not long
after Mr. Gomez had his lethal fall.
Lin asked around to find out who might know more. If it was suicide, perhaps Mr. Gomez had expressed sadness
or pain in the days and weeks before his death. She questioned colleagues at school and heard a few interesting
morsels of information.
Four separate people highlighted the same concern: a small group of students appeared to have a rather nasty
grudge against this particular teacher. They even reported social media threads detailing certain students’ fantasies
about playing nasty tricks on him, like keying his car or even harming him personally. The group consisted of students the other teachers agreed were outcasts. One of the students was Alanna Thomas, a shy girl who was a local attorney’s daughter.
Lin investigated the social media posts and found several distressing threads. Sure enough, ‘let’s kill Mr. Gomez’
came up more than once.
The problem was, Lin just could not believe that any of these disaffected children would murder their teacher.
Priding herself on her skepticism, she looked for and found an alternative explanation.
Alanna Thomas’ father was aiming for a promotion: he wanted to become a district attorney. Furthermore, a
group of powerful local business owners was firmly against this idea. Mr. Thomas was a keen environmentalist,
and everybody expected his appointment to scupper their plans to build a large power plant on the edge of
Mildenhall. Instead of an angry schoolgirl, it was surely more likely that somebody had hired a professional killer
to neutralize Mr. Thomas by implicating his daughter in a murder case.
Besides, the child was the perfect stooge. She was known to hold a grudge against her teacher and be a social
misfit who would crumble under police questioning.
As we have seen, that is exactly what happened. Alanna’s tearful confession formed the backbone of the case
against her. She was easily tall and strong enough to have pushed Mr. Gomez off the cliff while he was out
walking his dog at night, a habit which the whole town knew about.
Lin published her investigation. Following Lin’s article, the police dropped their case due to a lack of evidence.
Officially, they concluded that Alanna’s confession was unreliable and that there was not enough evidence. Mr.
Gomez had fallen into the quarry; it was a terrible accident.
On the same day, Lin received an anonymous email. It said:
“You should have listened. Gomez was murdered.”
The sender attached a high-resolution photograph taken from the top of the cliff. The time and date were exactly
right, and so was the location data. The image showed Alanna Thomas standing at the edge and down below the
body of a man face down in the water.
If Lin had been properly skeptical throughout her investigation, things would not have ended so badly. She doubted the first explanation–that Alanna had killed her teacher–so much that she came up
with an even less plausible alternative. She convinced herself and others that it was true, even though her
conspiracy theory had less evidence to support it than the police’s theory. Ultimately, the truth eluded everybody.
Skepticism is not simple cynicism. Skeptics keep an open mind, doubting every explanation rather than believing
they have arrived at a final answer. Taking a skeptical approach based on scientific principles helps us get closer
to true conclusions rather than settling for what we want to be true [1] . Critical thinkers can guard themselves
against being misled into believing lies or mistaken information.

What Is Scientific Skepticism?


In general usage, skepticism refers to an attitude of doubt. Skeptics in the media often criticize ideas they see as
unlikely, such as alien abductions or conspiracy theories. Scientific skeptics are prepared either to believe or
disbelieve claims, depending on a fair analysis of the evidence.
Scientists, as natural skeptics, spend many years gathering evidence before publishing their findings. What is
more, scientists never claim to discover the truth; they just update the current understanding. There is no end to
scientific inquiry.
You can only apply a scientifically skeptical approach to claims that are verifiable and falsifiable. ‘Verifiable’
means that you can test the concept or claim [40] . In the early 20th century, European philosophers spent a lot of
time thrashing out what verifiability means. Something is verifiable if you can find out that it is true by observing
or measuring it. Some philosophers include logical verification within this definition, but scientists prefer to focus
on claims that they can test in the real world. These include whether tooth decay predicts tooth loss or whether
investing in education correlates with an improved local economy. Science focuses on things we can measure,
usually quantitatively.
To be clear, you do not have to prove the idea for it to be verifiable. For example, we can say that life may be
discovered in other solar systems in the future, although we do not possess many methods for doing so at the
moment. This claim is verifiable because we would be able to travel there in the future and observe whether life
exists or not. Current science already probes planetary environments by using telescopes to detect atmospheric chemistry, which can reveal conditions suitable for life and may reveal chemical signatures of life [41].
An example of an unverifiable claim might be: ‘a child once swallowed a whole bicycle wheel and survived.’ We
cannot verify this because we do not have the information to identify the child. Anecdotal reports like this may
simply be mistakes or deceptions. Even if the source is reliable, they may pass on unreliable information, so you
should be particularly skeptical about second-hand evidence [2] .
Claims that we cannot verify are, therefore, not scientific. Instead, we can call them ‘metaphysical’ in the case of
faith-based claims [42,43] or refer to them as beliefs or unverified claims. The problem with verification is that we can rarely verify anything 100% [43]. That is why falsifiability is so important and has somewhat eclipsed the concept of verifiability in recent decades.
Falsifiability means that it is possible to disprove a claim or proposal [44] . For instance, if somebody stated that
Robert de Niro was born in New Zealand, you could falsify the statement by obtaining evidence from birth
records.
This criterion is more powerful than the verifiability criterion because it is easier to find ways to disprove claims
than prove them. Falsification is the usual principle for modern scientific research: scientists, rather than trying to
prove something, try to reject its opposite. Falsifiability also helps scientists to maintain objectivity while
conducting and analyzing research.
Scientists engage in two broad types of research: observation and experiment. Both of these have their merits and
can be used to investigate claims. Observational studies are excellent for gathering initial evidence on a topic. In
an observational study, the scientists do not alter anything about the situation they are studying. They simply
record data and categorize it to see whether two or more situations differ.
For example, a psychologist might think that women spend more money on sun protection cream than men in hot
weather. To conduct a study on this topic, they would compare sun cream sales between men and women across
various temperatures.
However, they are not trying to verify their idea that women spend more on sun cream. They would be testing the idea that sales were no different to see whether they could falsify it. If the money spent was different, they could conclude that perhaps they were right and do more studies. If the money spent was no different, they would have to accept that result. Perhaps women and men do not differ in reality, or perhaps some aspect of the study affected the results.
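As a minimal sketch of how such a test might look in Python (the spending figures are invented, and SciPy’s independent-samples t-test stands in for whatever analysis the researchers would actually run):

from scipy import stats

# Hypothetical sun cream spending (in dollars) on hot days
women = [12.5, 14.0, 11.8, 15.2, 13.1, 14.7]
men = [11.9, 12.2, 13.0, 10.8, 12.5, 11.4]

# Null hypothesis: men and women spend the same amount
t_stat, p_value = stats.ttest_ind(women, men)

# A small p-value counts as evidence against 'no difference'
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")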
Observational studies can be very informative, but experimental studies are more powerful. In an experiment, the
scientist controls as many variables as possible, aiming to keep everything equal except the variable of interest.
As an example, you might want to design an experiment to see whether giving salespeople bonuses led to more
sales in the following year. To do this, you would have to give some people a bonus and others no bonus and
measure their performance. (Perhaps you could give the non-bonus group a bonus later on).
Another key scientific principle is replicability. Do we find the same result when we repeat the same research? If a
research result is a one-off, it is not reliable. The idea that listening to Mozart made people more intelligent was
difficult to replicate [8] , even though one study had found this result [9] .
One difficulty with adopting a scientific approach is that people tend to prefer positive results. We prefer to draw conclusions based on events instead of non-events, reflecting our preference for meaningfulness over randomness [2]. Sometimes, people like a scientific result so much that they ignore other results that falsify it, much like confirmation bias [1,10].


Science uses observation and measurement; therefore it mostly uses inductive rather than deductive reasoning.
Deductive reasoning is when you argue from the general case to the specific case. If all penguins can swim, and
Benjamin is a penguin, it follows that Benjamin can swim.
Inductive reasoning flows in the opposite direction. You argue from the specific to the general case. If every
penguin you see is black and white, you conclude that all penguins are black and white. Seeing a single blue
penguin falsifies the statement.
In the penguin example given above, seeing a blue penguin would force us to change our conclusion to ‘most
penguins are black and white, and some are blue,’ which would be acceptable until we get further information.
Perhaps somebody discovers a new location filled with blue penguins, tipping the balance so that we change our
conclusion again, now stating that ‘most penguins are blue.’
Claims based on faith fall outside the domain of science because they are neither verifiable nor falsifiable. This does not mean they are false, simply that we cannot investigate them using the scientific method. We cannot label religious beliefs as true or false.
Scientific anomalies like dark matter and dark energy are somewhat similar in that people debate whether they exist and how to explain them if they do exist. However, in these cases, we can say that we are awaiting an explanation, and scientific methods can potentially provide one (since both matter and energy are core topics for physics).

How Critical Thinking And Scientific Skepticism Work Together


Scientific skepticism works together with critical thinking to help us discern truth from non-truth. The techniques of scientific inquiry involve examining all the evidence before concluding. This process makes us less error-prone. Neither scientists nor skeptics are completely free of error and bias, but both aim to be as objective as possible.
As you might imagine, it is often impossible to examine all the evidence. For example, a team of biologists cannot
dissect every single member of a certain species to examine their inner workings, and social scientists cannot
expect a 100% return rate for their questionnaires. In these cases, scientists calculate how many individuals’ data
points are likely to give a fair representation of the entire population, perhaps perform a smaller pilot study to
check, and use a sample of that size for their study.
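As a minimal sketch, here is the textbook sample-size formula for estimating a proportion, n = z^2 * p * (1 - p) / e^2, with illustrative values:

from math import ceil

z = 1.96  # z-score for 95% confidence
p = 0.5   # assumed population proportion (0.5 is the most demanding case)
e = 0.05  # acceptable margin of error (5%)

n = ceil(z**2 * p * (1 - p) / e**2)
print(f"Minimum sample size: {n}")  # 385 respondents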
When analyzing data, we have to beware of false positives and false negatives. Like a diagnostic test for a disease, a scientific experiment cannot be 100% accurate; chance factors can intervene and create a rogue result that does not reflect the reality of the situation.
A false positive result happens when an experiment shows that something does happen or does affect something else, but the result was actually due to chance. In contrast, a false negative result is when the effect is real but, again by chance, did not show up in the experiment. The danger in both cases is that investigators might accept the false result and either miss something important or proceed to investigate something unimportant [45].
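A simulation makes the false positive risk concrete. In this minimal sketch, both groups are drawn from the same distribution, so there is no real effect and every ‘significant’ result is a false positive; with a 5% threshold, roughly one run in twenty triggers anyway:

import random
from scipy import stats

random.seed(0)
ALPHA = 0.05
TRIALS = 2_000
false_positives = 0

for _ in range(TRIALS):
    # No real effect: both samples come from the same distribution
    a = [random.gauss(0, 1) for _ in range(30)]
    b = [random.gauss(0, 1) for _ in range(30)]
    _, p = stats.ttest_ind(a, b)
    if p < ALPHA:
        false_positives += 1

print(f"False positive rate: {false_positives / TRIALS:.1%}")  # about 5%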
We do have ways to combat false results. We can repeat the same experiment several times in different
circumstances to examine replicability. We can also use control conditions in experiments to reveal what happens
under slightly different circumstances [2] .
Applied sciences like medical research may use a placebo as a control condition. A placebo is an inactive substance (or other type of intervention) that resembles the active treatment enough that people cannot tell them apart.
researchers tell participants they may receive the placebo or the active treatment. Due to ethical concerns around
withholding something that might benefit sick people, new treatments are often tested against the current
treatment. Similar to the example given earlier, deciding not to provide some salespeople a bonus as part of a
research study would be unfair.
Critical thinking creates a framework of doubt: like scientists, we question things constantly and gather evidence
to draw more reliable conclusions. Like a scientist, try to avoid falling into the trap of thinking you have
discovered the absolute truth. It is better to remain open-minded and flexible and update your current
understanding by adding new evidence as you discover it. At the same time, remember that some theories have
better logic and evidence to support them than others [1] .
If you want to judge whether a theory is a good scientific theory, apply the following principles:
Falsifiability : recall that claims that we cannot disprove are not scientific, and we should label them as beliefs or
assumptions rather than theories.
Occam’s razor : if we have two or more competing theories, we should use the simplest one (until we have more
evidence). Some thinkers state this as accepting the theory that forces us to make the fewest assumptions. Also,
theories should not contain extra elements that make no difference to the whole [2] .
Explanatory power : a good theory should adequately explain its subject better than competing theories. New
evidence can render it a better or worse fit for the data. A theory can also be thrown out in the light of new
evidence, in which case its explanatory power drops to zero [46] .
Predictive power : this refers to the theory’s ability to generate testable predictions. For instance, in chemistry, the
periodic table of the elements predicted elements that did not exist yet, which scientists then discovered or made
in the lab. Similarly, Darwin predicted a pollinating moth for a specifically shaped flower, and botanists later
discovered a moth that fit his prediction [47] .
To illustrate these points, imagine a friend telling you this tale:
“My Grandad's liver held out until he was ninety-six years old, despite drinking a bottle of whisky every day of
his life!”
Unfortunately, Grandad does not prove that alcohol is fine for your liver. This idea is falsifiable if you look at the
whole population of drinkers; Grandad was atypical. This story also illustrates confirmation bias, as though a
single example outweighed decades of medical fact. Occam’s razor tells us that Grandad is the exception. His
good luck does not bode well for other heavy drinkers: your friend’s theory has low explanatory and predictive
power.
With so many considerations at play, we need to continue to be skeptical of conclusions even when we feel
confident. This does not mean we should doubt others or ourselves to excess, only that we should remain open to
changing our minds.

Why We Need To Be Skeptical


One theory states that our minds are hardwired to make decisions, and once we decide, we are reluctant to change
our minds. This is because we have two different ‘actors’ or systems within our minds that process information.
We can call these the fast system and the slow system [11] .
Have you ever wondered why some experts seem to be able to make decisions and solve problems instantly? One
explanation for expertise is that well-studied skills become virtually automatic with practice. This includes
thinking as well as physical skills. These rapid mental actions use the fast system, which also deals with
recognizing emotions from people’s voices and automatic reading of whole words in fluent readers.
The slow system does deliberate thinking, such as when we have to do a difficult calculation. You may notice your mind concentrating, effortfully retrieving information from memory, and working through the problem in sequential stages. This system is highly affected by distractions, which is why sometimes you might find yourself concentrating so hard that somebody has to say your name several times to get your attention.
The fast and slow systems work together, too. The fast system recruits the slow system to help with difficult tasks,
and sometimes we all experience the conflict between automatic and effortful processing. One example could be
resisting the urge to criticize somebody if you get angry: the fast system drives the hot-headed emotional
behavior, but the slow system keeps it in check.
Like skills, prior beliefs and things we think we know can become automatic, almost like mental reflexes. If we want to overcome them, we need to make a significant effort, and even then, the automatic ‘gut reaction’ can remain.
That is why we need to continue to be skeptical of our conclusions even when we are confident. We must be wary of our mind’s flaws: tendencies such as being overly influenced by other people, our own emotions, and our prior beliefs, as well as our inherent biases and the brain’s preference for taking shortcuts.
However, the good news is that we can change our cognitive habits through practice. We can even see changes in
the brain with practice. Neuroplasticity means that our brains can reorganize themselves quite extensively, and
this is not only the case for younger people whose brains are still maturing. ‘Map reorganization’ in the brain is
particularly interesting. Research shows that learning and practicing skills lead to growth in related brain areas,
whereas dropping the practice leads these areas to shrink back towards their baseline size [48] .
It can be difficult to challenge other people’s reluctance to change their minds. Throwing a lot of facts and
evidence at them may only make things worse. Instead, while maintaining awareness of any cultural and social
factors that might be feeding into their opinions, try coaxing them to consciously think about their attitudes (use
the slow system of thought) and remind them why evidence is important [49] .
It may sound contradictory, but we need both skepticism and open-mindedness [50,51]. We can define open-
mindedness as a set of mental habits:

Thinking flexibly and avoiding rigidity.


Accepting views that may contradict each other, at least until you have evaluated them.
Avoiding getting blinkered by your own beliefs.
Striving to avoid bias even when you disagree with a claim.
Being willing to explore new ideas.

To some people, these ways of thinking may seem incompatible with skepticism. Certainly, the everyday
definition of skepticism focuses more on being critical and challenging ideas than being open to them. A skeptical
person may appear to be closed to new ideas until they obtain further evidence, in contrast to the open-minded
stance portrayed above. But recall that a key part of skepticism is being open to changing your position and seeing
other perspectives.
Opponents of a skeptical approach may argue that skepticism is paradoxical because skepticism itself is a belief system. These opponents argue that skeptics use ad hominem, straw men, and similar techniques to discredit
potential miracle discoveries [50,52] . However, true skeptics take a balanced view and are always open to the idea
they might be wrong. ‘Pseudo-skeptics’ are those people who almost exclusively disbelieve and deny claims; they
are similar to ‘debunkers’ whose mission is to try to disprove claims [52] .
To summarize, think of skepticism and open-mindedness as two complementary aspects of the same process:
critical thinking. They are not incompatible. Note that an open-minded attitude allows you to salvage the good
parts of bad ideas, whereas a strict skeptic would throw everything out. Open-mindedness is a key part of
creativity and innovation.
Lucidity And Metacognition
Lucidity is an open-minded state where we can see past our prior beliefs and perceive reality as it is. This is a
great state to aim for if you want to appreciate new information fully.
People have an inbuilt immunity to new ideas and prefer to stick with what they know. People may even perceive new ideas as threatening, making a kind of automatic assumption that what they already believe must be better than the novel claim. That is why discovering the true cause or process of something is only the first step. Scientists must
then continue to investigate and try to convince others that their theory is correct [53] . At the same time, they must
maintain a skeptical viewpoint, acknowledging that they might be wrong.
People have trouble with new ideas, particularly new scientific ideas, for a few reasons. Firstly, the true causes of
phenomena are not usually simple or obvious. Secondly, people often get the causes wrong when considering
things they feel strongly about. Thirdly, it is difficult to discover how to get to the correct explanations (that is
why science is always seeking to improve scientific methods and knowledge). Fourthly, people need to be
continuously motivated to discover the real causes and, subsequently, to promote novel explanations [53] . You can
see why scientists are such busy people.
Evidence shows that high critical thinking skills are associated with high metacognitive awareness [54] .
Metacognitive awareness means having awareness and control over how you process information; it is a self-
reflective process, known as ‘thinking about thinking’ [55] .
We can use metacognition to guide our own learning, development, and decision-making. People with high
metacognitive awareness have excellent knowledge about their own cognitive processes and their outcomes. Like
critical thinking, metacognition is teachable and can improve with practice [56] . For example:
John knows he has a poor prospective memory - he always forgets to do things he has said he will do. His family
and colleagues often get irate about this. However, John has excellent metacognitive skills: he knows his memory
is poor, enabling him to do something about it. He trains his memory by setting reminders and writing task lists.
After a while, he does not need to set reminders anymore; he goes straight to the lists.
You can imagine that somebody with poor metacognitive skills might not have been as successful. John was not afraid to admit he had a minor memory problem and was able to solve it.
Interestingly, student teachers with more experience showed higher metacognitive awareness and critical thinking
skills (assessed by questionnaire) [54] . This was a correlation, so we do not know whether metacognition causes
critical thinking or the other way round; alternatively, they may draw on the same underlying skills and habits.
Since critical thinking means deliberately using sophisticated thinking skills to solve problems, going beyond
intuition, and using high-level analytical skills, it seems reasonable to suppose that it relates to metacognition.
Paul and Elder [57] describe nine intellectual standards that should help us think both lucidly and metacognitively
about ideas. These are standards that scientists strive to meet in their communications, and they give you a helpful
framework whether you are composing an argument or receiving one from another source:
Clarity : to reason about a claim, we must be clear about what it means. Therefore, when you are communicating, you need to aim for maximum clarity as well. This standard is a prerequisite for all the other standards.
Accuracy : you may not have access to resources to check the accuracy of all points made, but you can assess it by
thinking about whether the claim is verifiable and whether the source is trustworthy.
Precision : information should be appropriately precise for the point under discussion. A claim could be accurate
but imprecise; for example, ‘the company’s profits fell last year’ is less precise than saying they fell by 18% last
financial year.
Relevance : we might reason clearly, accurately, and precisely, but this is pointless if we deviate from the core
topic.
Depth : this means dealing with the complexities and relationships of the concept under discussion rather than
over-simplifying it.
Breadth : this means taking in multiple (relevant) points of view and recognizing alternative perspectives on the issue. For example, business strategies often look at environmental, ethical, and social concerns, as well as economic factors.
Logic : this means ensuring that the arguments work logically: does the evidence lead to the conclusion, and does
the argument have internal consistency?
Significance : this is related to relevance, but sometimes relevant points are trivial. We need to ensure that our
reasoning focuses on the important aspects of the problem.
Fairness : our reasoning should be bias-free and honest. We should aim not to argue only for our own interests.
Others may interpret unfair arguments as attempts to manipulate and deceive them.
Hopefully, you can see how these standards relate to scientific skepticism and communication. All of these standards apply to science but also to our everyday lives, both work-related and personal. Therefore, they are useful to remember when composing or reading claims and other communications.

Looking Beyond Our Prior Learning


Scientific skepticism is not always easy. We can only reach the truth if we work hard to see past the received
wisdom and assumptions that society taught us in our youth.
The postmodern view says that truth is not absolute but subjective. Declarations are therefore always up for
debate. As any scientist or critical thinker will tell you, all we have is our best current understanding. Truth is
constantly evolving in the light of new evidence. Postmodernism goes much further than this.
Postmodernism is not a single theory but a way of looking at things. Scholars have applied it to many different
domains, mainly literature, the arts, theology, and philosophy [58] . However, here we are concerned with scientific
skepticism and how to get to the truth, focusing on postmodern views of science and philosophy.
Postmodernism consists of the following key ideas [59] :
1. There is no objective reality outside of human experience.
2. Scientific and historical 'facts', therefore, cannot be true or false because humans concocted the very idea of reality.
3. Science and technology cannot change human existence for the better. (Some postmodernists believe science is
a dark force rather than a way of humanity progressing).
4. Reason and logic are not universal; they are only valid in their own domains.
5. All (or nearly all) human nature is socially acquired rather than hard-wired.
6. Human language does not reflect reality directly; instead, it is completely fluid and only reflects how people
refer to things within their own cultural and historical context.
7. We cannot gain knowledge about reality, and nor can we back up our knowledge using evidence or logic.
8. We cannot formulate grand theories that explain wide-ranging phenomena; postmodernists believe these are a
kind of totalitarianism that disallows other views.
As you can see, postmodernism contains some useful ideas. However, it is difficult to take a completely postmodernist view and still expect to explain anything or figure anything out. You could say that postmodernism fails to explain anything but, at the same time, claims to offer an alternative to traditional scientific methods. Paradoxically, postmodernism decries grand theories but is itself a grand theory [58] .
Postmodernism was most popular in the mid to late 20th century, particularly the 1990s, when academics
collectively published over 100 articles per year [58] . The postmodern movement provoked a great deal of popular
debate. Critics of the approach say that it encourages people to think of science as no more useful than
pseudosciences like astrology [59] . (Although we know that good scientific theories predict future events, astrology does not have this power.)
Some postmodern ideas support skepticism and open-mindedness, but its core suggests that we can never discover
anything because reality does not exist. Mainstream scientists and philosophers alike seem to have more faith that
we can discover the truth, but postmodern attitudes persist [60] . On the other hand, postmodernism does
encourage us to take a broader view of ideas and look beyond traditional categories, so it is similar to the idea of
scientific skepticism (even though postmodernism is skeptical of science!).
Scientific Revolutions
Thomas Kuhn was a philosopher and scientist who wrote about how science moves forward. His work heavily
influenced the postmodern view, but he did not argue that science is anti-progress. Instead, he said that we do
‘normal science,' and knowledge moves forward in jumps, which he called paradigm shifts. ‘Paradigm’ refers to
the prevailing world view or scientific approach of its time. For example, history saw a great paradigm shift away
from classical Newtonian physics when Einstein advanced his theory of relativity [1,61] .
Normal science is an incremental process. Small advances taken together, debated by scientists in journals and
conferences, gradually increase knowledge. Scientists predict many discoveries in advance during normal science,
based on theories that they believe have solid foundations. Education imparts received wisdom to budding
scientists, and they become fluent in its specific methods and language and continue research along the
established lines.
A paradigm shift results from a crisis in science. The existing theory can no longer explain observations, or a
radical new theory gets proposed that explains things better than the old one.
Examples of paradigm shifts in science:
1. Copernicus’ proposal that the Sun, rather than the Earth, lay at the center of the Solar System.
2. Lavoisier’s discovery that chemical elements combined to make molecules with various properties, superseding
alchemical views of chemistry.
3. In the 1880s, the ‘germ theory’ that tiny organisms (rather than bad air) caused diseases.
A paradigm shift means a change in what scientists study, how they study it, how society views that topic, and what conclusions are acceptable. These are huge shifts, hence the alternative term: scientific revolution.
So what fuels paradigm shifts? There are three major influences.
Firstly, anomalies. Scientific anomalies happen when scientists find things they cannot explain. If enough of these happen, a new idea could gain momentum and lead to fundamental changes (a paradigm shift). Small anomalies may occur in science all the time, but because nobody is looking for them, they may go unperceived and unrecorded.
Secondly, new technology (ways of measuring things) can fuel paradigm shifts. For example, applying medical imaging techniques to the psychological sciences led to the new field of functional brain imaging around the turn of the 21st century.
Finally, when a new paradigm appears, scientists need to compare the new and old paradigms with each other and
with observations. Some may be looking to verify the new paradigm and falsify the old one; others will do the
opposite. Everybody works to find out which theory fits the facts better. They do 'extraordinary science' to see
what is going on and rewrite the textbooks. Extraordinary science helps to complete the paradigm shift from old
to new.
So what happens afterward? We might casually call outdated science incorrect, but it was fine in its own time.
Outdated science took steps toward the truth, and the new science grew out of the old. The discoveries made
might still stand but get interpreted differently under the new paradigm.
Science keeps going between paradigm shifts because people like to solve problems. Even if the progress is slow and piecemeal, new research is important. At any given time, scientists may work within the established boundaries or try to push things forward slightly. As long as their approach is similar enough to that of their contemporaries, their results comprise mainstream science.
Kuhn’s critics proposed that science does progress between the paradigm shifts. For example, Einstein’s theory of general relativity began as a theoretical description. Later, other scientists found empirical evidence, and general relativity led to a wealth of knowledge and technology that we would not have had otherwise.
Where postmodernism gets interesting is in its applications to real-world settings like management and education.
A postmodern approach in these areas fosters an open-minded attitude: if the establishment is no more correct
than anybody else, everybody's ideas are potentially valuable. If there is no objective truth, a new business process
or teaching technique is never guaranteed to succeed, nor is it guaranteed to fail [58] . That is quite liberating.

Action Steps
We have examined scientific skepticism in detail, with the aim of helping us get to the truth. Why not have a go at
these optional exercises and apply some of the ideas we have discussed?
1. Opening The Mind
Write a skeptical and open-minded proposition or theory of your own. It may be helpful to use something trivial
for this practice exercise. It can be as simple as ‘Why I should get my driveway resurfaced this summer,’ or ‘An
explanation of why I choose not to dye my hair.’ Use the following helpful habits of mind [2] :
a. Gather as much evidence as possible. For instance, what is the current state of your driveway, and what are the
risks of not getting it resurfaced?
b. Beware of false positives and false negatives in the evidence. For example, you might read that driveway
surfaces typically fail after five years, but check who wrote this and what they base it on, and see what other
sources say.
c. Think broadly: consider everything that might possibly impact the proposal or theory. This might include
personal finances, the broader economy, environmental concerns - whatever factors are most relevant to your
proposal.
d. Consider what somebody with the opposite opinion to yours would write: how they would explain it and/or
what they might decide. This will help you maintain an objective perspective.

2. Metacognition Exercise
It is normal and natural to resist changing our minds, but we learned here that reflecting on our own cognitive habits can help enhance them. Use this quick questionnaire as a self-reflection exercise, or rate somebody you know well. Adapted from Snelson [53] .
a. How would you rate your ability to accept any new minor idea with a lot of evidence to support it?

I accept it immediately
I accept it after doing my own research
I need a few weeks to absorb the new idea
I do not change my mind over minor ideas

b. How would you rate your ability to accept any new major idea with a lot of evidence to support it?

I accept it immediately
I accept it after doing my own research
I need a few weeks to absorb the new idea
I do not change my mind over major ideas

c. How would you rate your ability to accept any new revolutionary idea with a lot of evidence to support it?

I accept it immediately
I accept it after doing my own research
I need a few weeks to absorb the new idea
I do not change my mind over revolutionary ideas

3. Standard Process
Analyze an article to check whether it meets the intellectual standards suggested by Paul & Elder [57] . Choose
something like an editorial discussing a controversial topic. Is it:

Clear?
Accurate?
Precise?
Relevant?
Deep?
Broad?
Logical?
Significant?
Fair?

Summary
The story that began this chapter showed us that people reach faulty conclusions even when they try to keep an
open mind and discover the truth: the police thought they had solved the crime, and Lin thought she had found a
better explanation. They were both wrong.
With a truly skeptical attitude, somebody would have doubted both explanations, put them to one side, and
investigated further. They would have been open to alternative explanations and would not have been averse to
changing their mind even once they thought they had the correct answer.
Scientific skepticism is not easy. It takes vigilance and discipline to learn, but like critical thinking and the other skills we discuss here, it can be honed. The processes can become more automatic and less effortful as you develop your expertise.
Next, we will look at how to deal with claims you see in the media. That includes social media, so it should be a
great way to practice your skeptical attitude!

Takeaways
1. When assessing claims, act like a scientist: see whether the claim is verifiable and falsifiable. If not, perhaps
somebody is asking you to believe something without sufficient reason.
2. When making decisions and forming conclusions, keep a balance between skepticism and open-mindedness.
3. To reach the truth, aim for lucidity. Sweep your preconceptions out of the way and experience the world as it
really is, without your previous experience blinkering you to new facts and evidence.
4. Keep the postmodernist view in mind: perhaps we can never know the truth, and perhaps meaning is
completely relative. If that is the case, many things are possible.
4

WHY THE MEDIA CAN MAKE OR BREAK OUR THINKING

Zion Davis was sitting in his corner office early on a Friday afternoon, signing off business expenses. So far, so normal. He had had a long, busy week, so the routine task appealed to him. As a manager, he was generally well-liked, not least because his civil engineering background gave him credibility with the office staff.
As he was nearing the end of the task, an email notification pinged onto his screen: “Interesting read.” The message was from Alastair, a junior team member he worked closely with, so he decided to take a look. Almost immediately, he wished he had left it until Monday.
The email contained an attachment: the environmental report Zion had been waiting for. He had proposed ‘rewilding’ a section of the development, alongside the approved construction of a visitor center and venue for various outdoor sports activities. Alastair included a web link which he said Zion should take a look at.
Zion clicked through. He scrolled through lengthy, emotive paragraphs about the ‘failure’ of corporate
environmental endeavors like theirs. He sighed. This was going to take the weekend to sort out.
Later, Zion strolled into the open-plan office Alastair and the other juniors shared. Immediately, his team
bombarded him with questions:
“How can the company justify this?”
“This is outrageous! I can’t believe our company would do this!”
A few people argued back in favor of the company.
The conflict was not restricted to his immediate team, though: the MD, Milton Skelpie, phoned Zion, ordering him to call him back in private immediately.
“I thought we were clear. This project is over 80% ecological work specifically to support wildlife, so why are the
environmentalists up in arms about it? And why are half of your team on their side?”
“It’s this article, Sir. They’re saying that we should leave nature to take over by itself and that anything we do will
make it worse. The article’s bogus, Sir-”
The MD cut him off.
“Sort this out, Zion. I’m relying on you.”
Milton hung up. Zion took a deep breath, stood tall, and re-entered the shared office. He looked around at his
team.
“Everybody relax. We can tackle this together. So, we have some problems. I’ve read this article suggesting that
our wildlife park project will harm the local environment up in Washbrook, and my boss told me that several of
you have been posting about it on social media today. I’m going to ignore that because we have bigger problems.
As I said, we can all work together to solve this.”
The next few days were some of the most hectic and challenging Zion had ever experienced. He delegated
research tasks to several different staff members. He asked them to research the article, examining the platform
that had published it, who had written it, where they got their information from, articles that cited this one, the
whole gamut. He received a diverse set of reports.
Two of the staff, Meredith and Marco, had picked up one interesting fact: the article used some of the same
phrases (and misspellings) as a blog post published towards the start of the project. A fringe group wrote the blog,
and their main purpose seemed to be to block any kind of development. Members encouraged each other to lie
and exaggerate to get their point across; they planted misinformation to stoke readers’ emotions and make them
angry.
Zion was ready to present his findings to his superiors when the MD visited in person, completely out of the blue.
“Zion, this is serious. The local county council is now concerned that we misled them in our planning application.
Local residents are protesting at the site, stopping construction vehicles from entering and chaining themselves to
trees. We’ve already lost thousands because of the delays to this project.”
Zion called Meredith and Marco into his meeting with the MD.
“Sir, I would like to introduce you to the only two people in this office who picked up that this article is
hogwash.”
They spent the next hour presenting their comprehensive, detailed findings of the article that had caused so much
trouble: the source was not credible; the story was a distorted mishmash of second-hand information and opinions;
it played on people’s emotions; it misrepresented the science. As Zion said, it was hogwash.
Happily, this convinced the MD. He was so impressed that he assigned Zion and his team to write a well-
researched article on the topic for the company’s website.
After a difficult week, the project was back on track, and Zion had gained even more respect from his team and
his superiors than he could ever have expected. Still, the fake news article had almost caused a catastrophe.
Have you ever had an experience like this? Perhaps, or perhaps not. The point of the story is that a lack of media
literacy can have huge potential consequences.
Several of Zion’s staff believed what they read without investigating where the story came from; they failed to
seek further information, which led to conflict. It could even have harmed the business. Zion’s quick, decisive
action averted a potential crisis. Even better, he used his critical thinking skills to produce a report and web article
discrediting the disparaging claims.
Also, let us not forget the councilors and local residents who also fell for the disinformation, and the protestors
who would have been ashamed when they realized they had disrupted something that fit with their values, rather
than opposed them. Surely they would rather have spent their time and energy protesting against something
worthwhile?

Critical Thinking And The Mass And Social Media


This chapter’s story illustrates that people vary in how much they believe what they read in the media. Some of
the characters discovered something they found upsetting and took to social media to spread the word. This had
multiple impacts on the characters, the corporation they worked at, and its stakeholders (local authorities and
residents), not to mention the threat to the wildlife park itself.
Today’s society relies on mass media and social media as its main sources of information. However, the
information these sources publish is not always what it seems. Consequently, both types of media can be harmful
to people's wellbeing. Media consumers who do not discern between truth and falsehood, either because they
decide not to or do not know how to, can ultimately suffer.
When we use the media, it is important to apply the rules of logic and our own experience, just as in other
situations. We need to read beyond the articles themselves to understand their purposes and the effects the authors
intended to have on their readers.
The news is a great example of emotional manipulation. News outlets use forceful language in headlines to stoke
readers’ emotions, but the story is sometimes less exciting than it sounds. Fake news overtly appeals to people’s
emotions, and this is one reason why it spreads so effectively. People tend to read fake news in an unskeptical
way, sharing it with others and thereby spreading it further.

Big News Or Fake News?


Fake news is big news these days, but what exactly is it? Journalism experts define fake news as news that somebody has deliberately made up to deceive the public, as distinct from satirical comedy based on news or innocent mistakes that the news media might make from time to time. It is also not published by traditional media that adhere to established standards of journalistic accuracy [62] .
Therefore, instead of simply reading a message and believing it, we should consider the source and look for
alternative perspectives elsewhere. This shifts the information we receive towards an overall balance, and any
biases will become more obvious by contrast.
Many media outlets publish in good faith; they may be politically biased, and they may want to entertain and
inform, but their news pieces are at least based on real events. Some types of news, however, are deliberately
written to mislead and even deceive readers.
Fakers design fake news to appeal to readers’ emotions and inflame prejudices and divisions. The purpose of fake news is usually to make money or influence political opinions. None of this is new, but fake news really took off in the era of instant access to news stories and social media.
So how can we spot fake news? Like online scams, fake news websites often imitate credible sources, and
creators of fake news are getting better at making it look official [63] . One famous fake news publication used the
domain name abc.com.co, mimicking the genuine abc.com. Fake news may also use falsified images, which
people find increasingly difficult to spot [62] . It may also feel too good or too shocking to be true since fake news
is composed to make people feel highly emotional.
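Since lookalike web addresses are one of the more mechanical clues, they are also one of the few you can check automatically. Below is a minimal sketch in Python of that single heuristic: flagging a domain that merely resembles a trusted outlet, as abc.com.co resembled the genuine abc.com. The trusted list and the similarity threshold are illustrative assumptions, not a vetted whitelist.

```python
# A minimal sketch of one fake-news clue from this chapter: a domain
# that imitates a trusted outlet (e.g. abc.com.co vs. the real abc.com).
# The TRUSTED set and similarity threshold are illustrative assumptions.
from urllib.parse import urlparse
import difflib

TRUSTED = {"abc.com", "bbc.co.uk", "reuters.com"}  # hypothetical examples

def flag_lookalike(url: str) -> str:
    host = urlparse(url).netloc.lower().removeprefix("www.")
    if host in TRUSTED:
        return "recognized source"
    for good in TRUSTED:
        # abc.com.co starts with the real domain plus an extra suffix,
        # or is simply very similar character-by-character.
        similar = difflib.SequenceMatcher(None, host, good).ratio() > 0.8
        if host.startswith(good + ".") or similar:
            return f"suspicious: resembles {good}"
    return "unknown source: read laterally before trusting it"

print(flag_lookalike("https://abc.com.co/pledge-banned"))  # suspicious
print(flag_lookalike("https://abc.com/news"))              # recognized
```

A check like this only catches crude imitations; it is a prompt for further investigation, not a verdict.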

Why And How Do People Get Lured In By Fake News?


Part of the answer is other people. A large study analyzed ten years’ worth of verified true and false news stories
on one famous social media platform and found that fake news spread significantly ‘faster, further, deeper and
more broadly’ [64]. Users were more likely to spread fake news stories than true stories, with fake stories
reaching up to 100 times as many readers. Several factors probably fed into this.
Fake news stories were more novel and, therefore, more attention-grabbing. People prefer to absorb novel
information instead of run-of-the-mill information, and social media may be popular because it supplies more
novel data. On social media, people interact with unique networks of other people who supply them with both
information and entertainment. Novel information flows particularly well on social media [65] .
It might seem self-explanatory that fake news is highly novel because it reports things that have not happened in
real life. However, the investigators controlled for novelty statistically and concluded that this alone could not
explain the viral nature of fake news [64] . Therefore, there were other factors at play.
The same study showed that fake news elicited stronger negative emotions - fear and disgust - and more surprise in users, judging by their reactions. True news stories made users express more sadness, but also joy
and trust. Surprising visual scenes attract people’s gaze [66] , as do scenes that elicit negative emotions [67] , so
these factors probably enhanced attention to the fake news stories.
Additionally, people are more likely to spread misinformation if they think it will create emotional effects. This is
a powerful driver of both the spread and persistence of fake news [68] . The social aspect of social media is
extremely powerful. For many users, the desire to make an impression on other people is probably stronger than
the desire to communicate the truth.
Perhaps surprisingly, bots did not spread fake news any faster or wider than they spread real news [64] . Therefore,
the persistence of the fake news stories was due to human users. Rather than sharing true news stories, they
preferred to spread things they found novel and shocking.
A further characteristic of fake news is emotional manipulation: its writing style and structure elicit strong
emotions in response to a particular idea and then shift readers’ responses onto another idea. For instance, a fake
news post might accuse a political party of racism and then aim to transfer the resulting anger onto other policies
declared by that group, in a kind of negative halo effect. Fake news also sometimes uses readers’ negative feelings
about one topic to elicit negative feelings about a different topic [63] .
'Clickbait' is web content that draws people in by appealing to common feelings and goals (such as making
money, improving social relationships, or finding out the truth about something). It is often news-like. The
designers dress up false or misleading information to look plausible in order to entice people to click through.
Clickbait makes the user’s goal seem easy and within reach, with a knock-on effect of dialing down critical
thinking and making the information feel believable. The information does not matter as the sponsor company has
already made their money when the user clicks through to the site [69] .
Examples Of Media Deceptions
Mass media and social media convince people to believe certain things, regardless of whether they are fact,
fiction, or something in between. There are innumerable examples from historical and contemporary media.
In 1782, during the American Revolutionary War, Benjamin Franklin concocted a fake newspaper supplement about native Americans and the British uniting to scalp 700 Americans. He then sent the false story to his friends, who sent it to their friends, and so on, and the story even made it into the real newspapers [62,70] .
An example of fake news closely mimicking real news came in the most shared US fake news story of 2016:
“Obama Signs Executive Order Banning the Pledge of Allegiance in Schools Nationwide” [62] . The graphics and
web address resembled a journalistic news source, resulting in over two million shares within two months.
A more troubling example relates to the backlash against vaccinations. The anti-vaccination movement arose from Andrew Wakefield’s notorious, discredited 1998 article linking the MMR vaccine with autism [71] . Most of Wakefield’s colleagues later retracted the article to clarify that they had found no causal link, and in recognition of the harmful effects that the misinterpretation of their results had led to [72] . However, retractions are rarely as popular as the original untrue story [68] .
Finally, in 2017, fake news reports circulated on social media stating that someone had murdered the president of
South Sudan and that his aide was plotting a military coup. The posts aimed to stir up further violence in the
country that was already suffering due to civil war, but it turned out that the reports were untrue and originated
outside of South Sudan [71] .
We must wonder how nefarious people achieve this deception. Is it something about how they present the information? Fake news often features appealing storytelling and sensationalism, and plays on compelling emotions like fear and desire, but what about how readers receive and process the information?

What Is Media Literacy And Why Is It Important?


Media literacy means applying information literacy principles when you read media reports. Information literacy means knowing when you need information and finding, evaluating, and using the appropriate information for your needs. Researchers believe that the public has highly variable information literacy, specifically in assessing the quality and truth of information. This is also true for social media information, which has no inbuilt quality control system; at least traditional media has editors and journalistic standards [68,69] .
Media literacy is more important now than ever before because of the sheer number of messages we encounter every day. It is easier to access media than in the past: most people are constantly connected and can catch up on the news simply by reaching into their pockets.
Several factors may underlie people’s reluctance to be skeptical about what they read in the media and social
media. These include a lack of awareness about information literacy and the need to read information critically.
Further obstacles include confirmation bias, increasingly convincing fake news items, and the rapid progress of
technology that we use to consume news [63,69] .
According to large-scale surveys, most American adults get at least some of their news from social media. In
recent years this has expanded to include adults over 50 years old for the first time [73] . This means it is vital for
us to apply media literacy principles when we read and share information.

Practical Ways To Apply Critical Thinking When Reading Articles


So how can we avoid being influenced by fake news? One idea is to treat it as fun fiction and delineate it clearly
in your mind from real news. Get your news from trusted sources. You could also find one of the numerous online
fact-checking websites and use those as part of your investigation.
We already possess intuitive strategies for assessing the truth of things we read and hear, but these enjoy varying
levels of success. We tend to rely on a few features of the information [68] , but each of these is prone to error:
Compatibility : does the new data fit with our current beliefs? If so, it feels right to us, and the evidence weighs
more strongly in favor of the new data also being true. People prefer information that fits with their world view.
Internal consistency : is the information plausible and consistent with itself? People prefer good stories where the
plot points follow from one another, and the characters behave in a realistic way.
Credibility : we make a quick assessment of the source, but surface characteristics, such as whether the story features a familiar person, often grab our attention rather than contextual information like the story’s purpose or where it appears.
Consensus : whether others also believe the story is a rough indicator of its accuracy, but we may over-rely on it. For instance, if people we perceive as similar to ourselves believe something, we are more likely to believe it even if there are other signals that it is untrue.
When reading articles, bear these four points in mind and use them to try to aid objectivity. However, these are
descriptive: they tell us how the mind works by default, rather than giving us the best method to appraise news
stories critically. Keep an open mind about the story, and assess it using a more critical mindset than the one your
intuitive decision-making processes might tempt you into.
Rather than relying on intuitions, teachers tell their students to assess online information using the CRAAP test [70,74] . Students learn to assess whether the information is current, relevant, authoritative, and accurate, and to look at its purpose. As you are already a critical thinker, you are most likely familiar with how to assess information against these criteria, but here is a brief reminder (followed by a short sketch of how you might tally your answers):
Current : check the dates of both composition and publication, look for any updates or retractions, and research
further sources that reference this one for the most up-to-date information.
Relevant : assess whether the material applies to what you were looking for, whether the language fits with the
topic, and whether they have covered it properly. In the case of articles you did not specifically search for, think
about whether the title and introductory section present the topic fairly; sometimes, fake news articles use
misleading headlines and images to draw readers in.
Authoritative : look for the author’s credentials and determine whether they are qualified and experienced to write
on the topic. You can also examine whether they cover the content in a logical and appropriate way.
Accurate : this can be more difficult to check for news stories and particularly social media stories. If the author
cites sources, see if they are academic or official sources. If they quote scientific results, you can see if they were
published in mainstream journals (indicating that the scientific community, in general, accepts these results as
valid investigations) and see whether other scientists have replicated the results.
Purpose : review the details to see whether the author uses the article to sell products or attract visitors to their
site. You may find evidence of vested interests and/or biased opinions that affect how they treat the topic.
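The CRAAP test is a human judgment call, but once you have answered each criterion, the tallying is mechanical. Here is a minimal sketch in Python, assuming only the five criteria above; the "pass four of five" rule is an illustrative assumption, not part of the published test.

```python
# A minimal sketch of a CRAAP worksheet: you supply the judgments,
# the script only tallies them. The "pass 4 of 5" threshold is an
# illustrative assumption, not part of the published test.
CRAAP = ["Current", "Relevant", "Authoritative", "Accurate", "Purpose"]

def craap_tally(answers: dict) -> str:
    passed = [c for c in CRAAP if answers.get(c)]
    failed = [c for c in CRAAP if c not in passed]
    verdict = "looks citable" if len(passed) >= 4 else "read laterally first"
    return f"passed {passed}, failed {failed} -> {verdict}"

# Example: an article that is current and relevant, but whose author
# and sources you could not verify.
print(craap_tally({"Current": True, "Relevant": True,
                   "Authoritative": False, "Accurate": False,
                   "Purpose": True}))
```

The value of writing the criteria down like this is simply that it forces an explicit yes or no for each one, rather than an overall gut feeling.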
Although CRAAP is a useful checklist, some authors have criticized it, as you need to spend a serious amount of
time investigating the source website itself. Lateral reading is an alternative approach that we can use to assess
online sources more quickly and fully. This entails fact-checking beyond the site or story itself, including
performing a web search for the source or author [75,76] . It is also important to know that search engine results and
social media feeds are heavily personalized. Still, not all students using online information fully understand these
facts [75] .
As well as lateral reading, expert fact-checkers can accurately categorize web sources by taking bearings. They
evaluate the source website using the ‘About Us’ section and visit other online sources almost straight away to
look for wider information about the source’s authority and trustworthiness [76] .
Establishing whether the source is credible is key to using a critical thinking approach to gaining information
from the media. However, the rise of fake news may correlate with the general public’s diminishing trust in
experts and government sources [77] . The internet and social media lie at the root of this trend. People
increasingly believe they can find out anything by going online and that all opinions are equally valid, whether
expert or otherwise [71] . This means it is becoming increasingly challenging to distinguish between credible and
non-credible sources, but remember that a critical thinking approach can help you get closer to the truth.
It is enormously important to assess the credibility of the source. Using lateral reading, you can gauge whether
they consistently report facts and whether they admit and publicize mistakes in their reporting. You could also
check mediabiasfactcheck.com, which summarizes global news outlets in terms of their overall accuracy
percentage, and highlights political bias. This is useful for evaluating sources you may not have come across
before.
In addition, you can also consider whether the article meets the standards described in the Society of Professional
Journalists’ Code of Ethics. Their ‘Seek Truth And Report It’ criterion covers many of the standards discussed in
this chapter [78] . Remember that most social media sources are not answerable to this code.
You can also use online tools to verify the content. Some helpful sites include:
Factcheck.org : focuses on US political claims across various media.
Snopes : investigates various categories of potential misinformation, including hoaxes and urban legends as well
as news items and political claims.
Factscan.ca : fact-checker for Canadian politics.
BBC Reality Check : includes fact-checking and explainer articles for the UK and international current affairs
topics.

Action Steps
1. Media Literacy Practice
Perform a general web search for a topic of interest and assess two of the resulting articles or webpages using
CRAAP and lateral reading. Notice whether the two approaches give you different impressions of the sites,
perhaps even leading to different conclusions.

2. Deep Dive
Choose a news story that interests you; perhaps it relates to your business or personal concerns. It could be one
that you found during Action Step 1. It should be sufficiently complex and mainstream for you to find at least four
different sources. Research the information these sources report, aiming to be as diverse as you can. For example,
look for left and right-wing sources from both the mass and social media. Chart on a piece of paper what they
agree on and what they disagree on. Can you see different 'facts' reported? What about word choices indicating
bias? You can repeat this exercise in the future if you want to assess another news story in depth.

Summary
This chapter’s story showed how fake news concocted by extremists snowballed and nearly spelled disaster for a
company and a community. This fictional story’s message was serious: many real-world fake news stories have
had terrible consequences. The few examples given here should give you an idea.
The mass media is not immune to getting things wrong. Still, even journalistic outlets vary in the quality standards they set for themselves, so it is important to apply your critical thinking skills here too. Three of the characters in the story displayed great analytical skills in picking apart the mess of blogs and social media posts that led to the misinformation problem. They presented their findings rationally and calmly, which defused the situation. In the end, this positive outcome could even have enhanced the company’s reputation.
Next, we move on to look at how others try to deceive us to our faces and how we can sort the truth from the lies
in these everyday situations.

Takeaways
1. To separate sense from nonsense in mass media and social media, we need to apply the rules of logic and use
our own expertise.
2. We need to be alert to fake news, which is deliberately concocted to fool people, and not confuse it with real
news, satire, editorial opinion, propaganda, or advertisement [69] .
3. Take a skeptical approach even if the story feels true, and beware of ‘news’ that seems too extreme to be true.
5

EVERYDAY LIES AND DECEPTION

After reading the fine print, Alicia decided she was happy with the terms of the business loan.
She had recently met Aaron Lowen, a business development consultant from Howsville, the next town along from hers. He strolled into her ice-cream parlor and quickly persuaded her to open another cafe in Howsville. She refused at first: she liked running a single site business, and her customers found it charming to buy ice cream from a family-run concern. The expansion was too risky.
However, Aaron insisted.
“They don’t even have real gelato in Howsville! With this artisan Italian ice-cream, you’ll make a fortune! I
promise you, there are no decent ice-cream cafes at all.”
A smile flitted across Aaron’s face. Quickly, he looked serious again.
“Is this really a good opportunity?” Alicia asked.
“Yes, definitely,” Aaron grinned.
Alicia noticed a strange wobble of the head, but thought no more of it.
So here she was: opening a new café. Once the loan was in place, it was all hands on deck.
However, Aaron had not been a hundred percent truthful. A local gourmet ice-cream company was running trucks
and pop-up cafes across town, and they had no qualms about targeting her new store. Sometimes the local kids
would even come in to criticize her product:
“Not as good as Toni’s.”
The trouble at the new branch rapidly damaged the entire business. It seemed time to cut her losses. Then,
vandals broke into the new café. They wrecked the displays and littered ice-cream and toppings everywhere.
Alicia closed for the day, and her employees cleaned up while she called the police and the insurance company.
This was almost the end of the whole company, but Alicia smiled and kept going. Her sister-in-law gifted her
some of the profit from her own business, which kept Alicia afloat for a while. Sadly, the new café was still not
viable, so Alicia decided to close down.
On the last day, they organized an ‘everything must go’ event, with half-price ice-creams for all the local high
school and college kids. Late in the afternoon, this turned into free ice-creams for all.
Alicia confided in a middle-aged lady who was enjoying a cookie and cream cone. The lady was sympathetic:
“It’s very sad, but Aaron from Toni’s has such a good grasp of the local business environment and so many
friends and contacts in the town. You were brave to compete with him.”
“Aaron who?”
“Aaron Lowen, our local entrepreneur. He’s involved in most of the businesses in town, and even wants to open
up in your town as well. Can I get some white chocolate sprinkles with this?”
In a flat tone, Alicia directed her to ask at the counter. So Aaron had lied.
Finally, she had found the missing piece of the puzzle: Aaron was deliberately trying to put her out of business,
and it had almost worked. He had almost cost her everything. If Aaron had succeeded, he would have been the
number one ice cream seller in both towns!
She had to applaud his audacity: pop-up ice cream cafes and trucks, rather than fixed premises, meant she had not
discovered that there was already a popular artisan ice cream maker in Howsville. So she was back to her initial
position, but it could have been much worse.
A few months later, things had improved. Sympathetic locals who heard about the diabolical deception flocked to
Alicia’s home town cafe. It was a warm spring, so she added two bicycle-based ice cream sellers. All this led to
record sales, as well as bad publicity for her rival Aaron.
Alicia was intelligent and successful, but she missed the signs of deception. Aaron gave away some clues: the
quick smile that flitted across his face when he claimed there were no ice-cream cafes in Howsville, and the head
wobble when he confirmed that it was a good opportunity, betrayed his real opinions. He promised something that
sounded too good to be true, and he appeared trustworthy, using his expertise as a business consultant to add
credence to his claims.
Alicia noticed these clues but did not know how to interpret them. She did not know that even accomplished liars
reveal themselves occasionally, as the human body and face express our emotions even when we work hard to
suppress them.
Most people are basically honest, but one deliberate deception could potentially cost us a lot. Therefore, as well as examining claims and evidence in detail and being skeptical about ideas, we need to look at other clues that can tell us if somebody is lying, whether the falsehood is unintentional or deliberate on their part.

How To Spot A Liar


Outside of the media, the people we interact with every day expose us to a great deal of information, much of which is true. Lies are deliberate, conscious deceptions, in which people either conceal something or falsify information [79] , so people have a huge interest in methods for detecting when somebody is lying.

There is no single clue that tells us somebody is lying. Instead, we must draw tentative conclusions based on as
much evidence as we can find [80] .
We can apply critical thinking to the content of what people say, and when we interact face to face, there are
several additional sources that can give us clues as to whether somebody may be lying. Liars can accidentally
reveal the truth by leaking information or emotions they are trying to hide. A few behaviors might clue us in (but
note that these behaviors rarely reveal the content of the lie).
Liars sometimes work hard to conceal a lot of emotion, and we can detect this cover-up by gathering evidence from their faces, bodies, and voices - for instance, if somebody seems panicky. People who tell the truth expect others to believe them, so they appear more relaxed [81] .
The words used provide the first set of clues. Three ways that somebody’s words can suggest deception are:
making errors in repeated facts; slips of the tongue (stating the real fact or situation by mistake); and saying far
too much. In the latter case, you might identify a liar because their explanation is overly elaborate and detailed,
suggesting they are trying desperately to convince you.
However, liars focus on faking their words and facial movements, whereas their voice and bodily movements are less easy to falsify [79] . Scientists have studied interpersonal signals from faces, voices, and body language extensively in lie detection.
Many people believe that the eyes give away true feelings in terms of interpersonal signs, and liars often
deliberately try to appear truthful using the eyes, such as making plenty of eye contact. However, the impression
conveyed by eye gaze differs across cultures. Some regard direct gaze as disrespectful, which may affect
suspicion of guilt when police officers arrest or question people of different ethnicities [80] . Because eye gaze is
so deliberate, it is perhaps a poor indicator of somebody's inner feelings.
Blinking is more spontaneous than eye gaze, and pupil dilation is not under conscious control. Therefore these
provide more reliable signals of genuine feelings. Changes in frequency of blinking and wide pupils could signal
emotional arousal associated with deception, but this is inconclusive since they are signs of general emotional
arousal. Pupil size was the best indicator of lying in a meta-analysis comparing various signs of tension in liars
[81] . Additional bodily signs of tension include sweating, a pale face, and flushing, but these are general to emotional arousal and not specific to lying.


Facial signals are extremely complex. Liars’ faces communicate two interesting strands of information: what they
are trying to communicate and what they are trying to hide. Liars often try to conceal their true feelings, with
varying success. Good ways of reading faces to detect deception include [79] :
Passing expressions : people often express their feelings spontaneously but then quite quickly suppress them. This
rapid masking of expressions can be a clue to dishonesty.
Micro-expressions : these are much faster than the passing expressions described above. We do not typically
perceive them, but we can see them on paused or slow-motion videos. Trained psychiatrists can observe them in
normal conversation, having learned to do so through their professional practice.
Specific parts of the face : some areas of the face are more informative than others. For example, people fake
smiles using only the mouth and lower eyelids, whereas a genuine smile of happiness features raised cheeks and
wrinkles at the corners of the eyes. People also find it difficult to fake the pursed lips of anger.
Smiles : if somebody smiles too much or at odd times during the conversation, they may be using a social smile to
conceal nerves. If their smile disappears rapidly or slips off the face in unnatural steps, they may be using a fake
smile to conceal other emotions.
In terms of voice clues, listen for pauses and a lack of fluency. If the speaker often pauses or for a long time,
hesitates more than usual, or uses filler sounds like ‘ah’ and ‘um,’ they may be improvising. The high cognitive
load of composing a lie while speaking to you might be taking a toll on their verbal coherence [79] . However, this
may be unreliable. One linguistic study found fewer uses of ‘um’ and similar words in lies, suggesting that these
fluency errors are perhaps part of normal speech rather than a sign of high cognitive load [82] .
Raised pitch is a further vocal signal of lying [79,83] . People find fear and anger difficult to hide vocally,
especially if the lie they are telling makes them feel that way, or they are worried you have figured out their
deceptiveness [79] .
According to some acoustic research, people produce shorter utterances with fewer syllables, speak more slowly
and take longer to respond, and vary more in pitch and intensity when they lie [83] . A meta-analysis also suggests
that liars say less and provide fewer details than truth-tellers [81] .
All of these modes of non-verbal communication provide hints that somebody might be deceiving you. However,
bodily clues might be the best ones to look for. Experiments show that observers are worse at picking up
deception from facial and vocal cues than bodily cues. Participants who only saw bodily postures and gestures
were much better at spotting the liar [79] .
Remember that everybody has a different baseline. You are likely to be better at detecting lies from somebody you
know than a stranger. It helps to distinguish between these three categories of bodily movement:
Emblems : movements like nodding the head, shaking the head, and shrugging. They are often intentional communicative gestures but can occur without conscious awareness as well. Unintentional emblems are often smaller versions and may reveal how somebody really feels.
Illustrators : often spontaneous, you may refer to these gestures as gesticulation. Many of us use our hands a lot
when speaking: we draw pictures in the air and mime actions. People use fewer illustrators when they lie, perhaps
because it is more effort or they are uncertain about what they are saying [80] .
Manipulators : pinching, stroking, scratching, hair twisting, and similar movements, including fiddling with small
objects. People engage in more of these gestures if they are nervous, but also when they are feeling relaxed, so it
is not a great clue to deception.
Be cautious when interpreting faces, voices, and body language, and remember that each of the methods described above provides only a single clue. Listen closely to what the person says as well as observing how they behave. So can we find any more certainty?

What They Say Or What They Do?


There is no magic formula to discern whether somebody is trying to deceive you or is giving you misinformation.
Further, there are numerous myths and misconceptions about how to detect lies. Some experts claim that a
significant portion of the police and customs training materials on lie detection from interpersonal cues are
incorrect [79,84] . This suggests the scale of the problem, but remember that police and customs are working with
individuals that they have never seen before, who will be feeling victimized and perhaps even panicky due to
being apprehended.
The clues described here are simply clues. They can give you an idea of whether somebody is deceiving you: with
an abundance of clues, the likelihood of deception rises. However, the fact that you are dealing with likelihood
rather than certainty raises the problem of false positives and negatives. The risks and consequences obviously
differ depending on the situation. Is it worse to doubt the truth or to believe a lie?
For instance, anybody working in law enforcement or other high-stakes occupations should be extremely cautious
about how they interpret the interpersonal signs of deception. They should keep an open and skeptical mind and
continue to gather evidence.
A further problem is that lying is a social interaction, and if somebody feels uncomfortable, they may give off
signals similar to those of deception. If anyone has ever falsely accused you of lying, you will remember how it
feels. Similarly, shy individuals or people who find themselves in an aversive situation may be mistaken for
deceivers simply because they feel nervous and untrusting. In both scenarios, the accuser could misinterpret the
accused’s discomfort as guilt signals, further fueling the false accusations.
Ideally, get a solid baseline so that you know the person and their general demeanor. People in close relationships
(both romantic entanglements and friendships) may be better at detecting each others’ lies, but they are also better
at deceiving each other. One reason for this is that they are familiar with each others’ typical behavior and modes
of speech and so can fake it more easily. Another reason is that their desire to maintain a positive relationship
leads them to ignore the signs of deceit [85] .
Romantic relationships are paradoxical in terms of honesty. Most people agree that they want a potential romantic
partner to be honest, whether the imagined relationship is long-term or dating, but they do not want extreme and
absolute honesty. Instead, many people realize that sometimes deception helps build self-esteem and can be an act
of kindness.
In very emotionally close relationships, the evidence on lie detection is mixed: sometimes it seems that romantic
partners are better at detecting lies in each other; other evidence suggests the opposite [85] . Perhaps it depends on
the specific, highly personal details of the relationship.
To detect lies of all types more successfully, we need to look at the communication in context, including its content. We are more likely to succeed by using a rational approach, comparing what people say to other evidence and our prior knowledge, than by over-relying on signals from the potential liar’s face or voice [81,86] .
In terms of what they say, liars give less consistent accounts, told in a less personal way, that feel less plausible to listeners. Their stories seem to be told from a distance and are less clear; these effects are more reliable than non-verbal cues such as eye gaze and expressions [81] .
Another clue is the structure of the story. Liars are more liable to tell you what has happened in the order that it
happened, whereas somebody speaking the truth moves around the story’s timeline. This structured approach
suggests that liars have carefully composed their story but, ironically, end up relating something that sounds less
natural [80] .

Nobody Is Immune
Everybody is susceptible to believing lies and half-truths [79,81] . Even the experts get it wrong sometimes.
People are overconfident in detecting lies from the face and voice, which acts as a barrier to finding the truth.
Faces, in particular, are used for communication and expression, so any expressions read from the face need not
reflect how somebody really feels. Further, communication via the face varies in different cultures and also among
individuals. We cannot extract any solid rules for detecting a liar from their face or voice with so much variability,
although they do provide some clues. We would be better off using evidence and trying to establish the facts [86] .
Interestingly, one study suggests that trained police officers may be no better or worse at detecting lies regardless
of whether they focus on the content of the lies, the person’s face and voice, or their body language. Their
accuracy was 50/50. They may as well have flipped a coin. This illustrates that even trained professionals might
be fairly poor at working out when somebody is lying, despite high confidence. A similar study showed that
practice in itself improved people’s performance, but instructions to attend to certain cues (face, voice, and body
language) had no effect [84] .
One reason people get it wrong is that they assume that others are honest [68,79,86] : we believe others by default.
Most of us tell the truth most of the time, so the bias towards belief normally leads to correct conclusions and
better cooperation. We rarely question others’ honesty unless something makes us suspicious; however, there is
evidence that many people are poor liars even when they attempt to deceive [86] .
Evidence suggests that lying is not particularly common. A study that asked people how many lies they told over a day showed that most people reported no lies at all, most of those who did lie told only one or two, and a small number of ‘prolific liars’ told half of all lies reported. You might suspect that people lied to the researchers, but that would
not explain the distribution of the responses. Further, hundreds of people participated in three separate
experiments, adding credence to the conclusions [87] .
When people reflect on how they have detected lies successfully in the past, their answers point towards two
strategies: comparing the lie to the available evidence and persuading the deceiver to confess. Obtaining evidence
relies on getting contextual information around the lie, so you are likely to be better at detecting lies within your
own domain of expertise. Surmising that somebody has a motive to lie raises detection accuracy to almost 100%,
and it is useful to use probing questions. Again, experts are better at this [88] .
If you suspect somebody is lying to you, encourage them to talk. Get the person to repeat their story and listen out
for factual errors and inconsistency [80] .
We all need to be aware of our own motivations, emotions, and preconceptions and do our best to avoid letting
these color our perceptions of others. Overall, it is difficult to decode when somebody is lying to us. Luckily, in
the case of our social relationships, minor lies are often inconsequential or even positive. However, modern life is
full of scams and other deceptions, which could be potentially very damaging.

Examples: Is Truth Rarer Than Fiction?


Here, we examine common scams and deceptions that unscrupulous people and companies might use to sell
things or convince people to believe false ideas. We can look for the signs of deception already described, but
there are many problems with this. Firstly, those who deliberately set out to deceive people are more likely to be
practiced liars, so they may display few signs of stress when they are deceptive. Secondly, in the case of online
deception, we may not be faced with a person at all, so we have to use other clues in the message itself.
Three of the most common online scams [89] are:
Advance fee scam : if you use email, you have probably received one of these. A message purporting to originate
from a relative of a wealthy person asks for your assistance with accessing their funds. They need a small amount
of cash in advance, and will pay you a huge fee once they have accessed the treasure trove. Other advance fee
scams include pyramid schemes and ‘work from home’ type business opportunities. Fraudsters circulate these
scams to thousands of people, but only a few need to succumb for the scammer to make a profit.
Online selling scams : auction fraud, not delivering goods, and not paying for goods purchased are the most
common. Sometimes identity theft and misuse of credit cards are also involved. ‘Scareware’ is another online
selling scam, in which fake popups tell the user their computer has a virus and they need to pay for a special tool
to remove it.
Investment scams : a swindler convinces people to invest in a fake business. The business sounds genuine
and has a professional-looking website, which anybody with a small outlay and some web design skills could
easily create, leading enough people to believe it is real.
Like many other forms of deception, these three all rely on people falling for a falsified desirable idea: the large
payout following the small outlay, the honest seller or buyer, or the profitable investment. People’s emotions may
override their more critical faculties and lead them to fall for something that many others would find unlikely [90] .
The fraud may be so convincing that it even fools people who are confident in their scam-spotting skills.
The assumption that people are mostly honest probably also plays a role [86,88] .
The more a fraudulent claim plays on people’s hopes and wishes, the more people will believe it [90]. So, it is
important to report scammers to the authorities, but what if you are unsure whether something is a scam? The US
federal government [91] and the UK Citizens Advice Bureau [92] give useful advice. The following are all signs that you
should be suspicious and not hand over any personal information (see the sketch after this list):

Does it seem too good to be true?
Is the contact unexpected?
Do they pretend to be from a trusted organization such as your bank or social security?
Do they ask for your personal data such as a PIN or password?
Is there a problem you need to solve (like a huge tax bill) or an unexpected prize?
Do they rush you to act immediately?
Do they ask you to pay in an unusual way, like a money transfer or gift card, or send you a check (that later turns out to be fake)?
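
To make the checklist concrete, here is a minimal sketch in Python of how you might turn these questions into a rough ‘suspicion score’ for a message. Everything in it (the flag names, the example message, the threshold of two signs) is hypothetical and for illustration only; it is a thinking aid, not a real fraud-detection tool.

```python
# A toy "suspicion score" built from the warning signs listed above.
# The flag names and the threshold are illustrative assumptions, not
# part of any official guidance.

WARNING_SIGNS = [
    "too_good_to_be_true",
    "unexpected_contact",
    "claims_trusted_organization",
    "asks_for_personal_data",
    "urgent_problem_or_prize",
    "pressure_to_act_now",
    "unusual_payment_method",
]

def suspicion_score(message_flags: set) -> int:
    """Count how many of the warning signs apply to a message."""
    return sum(1 for sign in WARNING_SIGNS if sign in message_flags)

# Example: an email claiming to be from your bank, asking for your PIN
# and pressuring you to respond within the hour.
flags = {"claims_trusted_organization", "asks_for_personal_data",
         "pressure_to_act_now"}

score = suspicion_score(flags)
print(f"{score} of {len(WARNING_SIGNS)} warning signs present")
if score >= 2:
    print("Treat as a likely scam: verify through official channels first.")
```

The point of the exercise is not the code but the habit: counting independent warning signs is more reliable than reacting to any single one.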

Outside of straightforward scams, real businesses and organizations sometimes engage in trading practices that
may be illegal or at least dodgy [93] , for example:
Fake reviews and testimonials : these are common on online marketplaces. Sometimes celebrity testimonials are
used, and it is difficult to tell whether the celebrity gave their permission.
Unfounded predictions and promises : this may be illegal if the company knew a specific claim was untrue, but
fanciful advertising claims are usually allowed.
Bait advertising : this is when a company advertises a product for sale, but does not have a 'reasonable supply.'
The bait product lures people in; then the seller persuades them to buy something else.
Misleading guarantees, conditions, or warranties : for example, a seller cannot force you to take an extended
warranty, but a salesperson might imply otherwise; this con relies on customers not knowing the details of the
business’s legal obligations.
With so many companies and individuals trying to make money from us, it is sensible to keep in mind that if
something seems too good to be true, it probably is. However, it would be cynical and destructive to apply this
attitude to our everyday interactions and relationships. Remember, there are only a few prolific liars around, and
they are probably busy running online scams.

Action Steps
Now that we have looked at how to use critical thinking and evidence to spot lies and deception in everyday life,
it is time to apply some of this knowledge. Try the following action steps.
1. The Lying Game
Play a game of lie detection with somebody close to you. Each of you can prepare a handful of lies and truths that
you will try to convince the other person are true. Remember this is a fun learning exercise, so use humorous or
innocuous facts about yourselves that the other person does not necessarily know. Use some of the techniques
covered in the chapter to convince them and try to detect the lies correctly, and have a conversation afterward
about how it went.

2. Proof Of Lies
Try some of the techniques for spotting a liar. Find an online video from a few years ago of somebody you know
is lying because someone else exposed them or they confessed. This could be from politics, an interview with a
public figure, or a televised court case. Watch the video in slow motion and look out for some of the signals we
have examined in this chapter:

Physical signs of tension.
Fleeting and micro facial expressions.
Shifty body language, such as small emblem gestures.

You could then do the same but listen for any acoustic signals, such as raised pitch and frequent hesitation,
perhaps comparing their verbal behavior to an example when you know they are not lying.

Summary
In the story at the start of this chapter, it turned out that the business consultant had seriously misled the business
owner: the rival company was a serious threat to her business’s expansion after all. How could she have picked up
on this?
Unfortunately, there is no surefire way to tell if somebody is deceiving you, especially if it is somebody you do
not know well. However, Alicia could have checked the facts: did the other neighborhood have real gelato? Was
the promise that there were no decent ice cream cafes in that town too good to be true? The deceiver also showed
a possible micro-expression (a fleeting smile at an odd time) and an emblem gesture when he slightly shook his
head, possibly revealing that he was saying the opposite of the truth. She might have been able to figure it out, but
perhaps assumed that this man was telling the truth because most people are honest.
In the next chapter, we will explore what some people might call a special category of scam. We look at
pseudoscience and how to distinguish it from real science and technology.

Takeaways
1. Tune into the visible and audible signs of potential deception: you can learn them through careful observation
and practice. However, you also need to apply critical thinking to what the person says, pairing it with keen
observation, to get closer to the truth.
2. There is no sure-fire way to detect lies, but knowing the person or establishing a baseline will help. Even a host
of behavioral clues cannot prove that somebody is lying.
3. People believe others by default, and research suggests this is warranted as most people are honest.
4. Selling products and ideas is perhaps the exception; scams and frauds are sadly very common, but you can
detect them and overcome them using a skeptical, analytical approach.
6

PSEUDOSCIENCE VERSUS SCIENCE

When Marlon’s Mom moved to her retirement apartment, he noticed something he found strange. The
apartment was spotlessly clean, but they found the same small object in every corner of every room and
window recess.
Marlon assumed that the previous resident must have gone crackers. He or she had stashed a horse chestnut in
every corner they could find. The removal men carried on moving his mom’s possessions in, whistling happily as
they wedged the large couch into the small living room. Marlon heard a tiny wood-like object roll along the floor
underneath the couch.
“Excuse me, guys,” he said. “I don’t think Mom wants those chestnuts everywhere. Can you put them in the trash,
please?”
The two assistants put down an oak dresser and looked to their foreman for guidance, but Marlon’s Mom
interjected before he could say a word.
“They’re fine, gentlemen. Please carry on,” she said to the removal men, giving Marlon a pointed look.
As the removal men carried on, Marlon looked to his Mom in confusion.
“Isn’t it odd that they just left these chestnuts everywhere? Why don’t you want them thrown out?” he asked.
His Mom gave him a superior look.
“They keep the bugs away, Marlon. It’s a tried and tested natural remedy. I would have thought you would
approve.”
Marlon could not help but burst out laughing, but his mother was clearly serious.
“Proven? Who proved it?” he asked once he had his breath back.
“Not your new-fashioned scientists. Housewives have known about it forever. Spiders are scared of the fumes
they give off or something like that. My Grandmother taught my mother, and my mother taught me. Did you ever
see a spider in my house? I thought not.”
Marlon was sensible enough to mumble in agreement and then drop the conversation. He had to admit he had
never seen a spider in his Mom’s house, but she spent an awful lot of time dusting.
In fact, science has found no evidence for Marlon’s Mom’s belief that horse chestnuts deter spiders [94] . This
particular erroneous belief is benign, but it illustrates the point that sometimes people simply believe in received
wisdom. Marlon’s mother believed her home was spider-free because of the chestnuts, exhibiting confirmation
bias. Still, as Marlon’s inner voice hinted to him, the lack of spiders was more likely due to her constant cleaning.
You might conclude that the mother’s belief in the chestnut deterrent was a harmless superstition, but are all
superstitions harmless? Where it gets more debatable is the case of pseudosciences. These are more complex and
far-reaching than superstitions; they involve entire belief systems.
A pseudoscience is a collection of beliefs or practices that people mistakenly regard as scientific. Sciences
challenge their own claims and look for evidence that might prove these claims false through systematic
observation and experiment. In contrast, pseudosciences aim to look for evidence that supports their claims,
seeking confirmation rather than falsification.

How (And Why) Science Works


Scientists analytically explore the world. The scientific process is the most reliable way of understanding the
world because it involves hypothesis-testing and various mechanisms to scrutinize the conclusions and evidence.
Essentially, science centers on things we can observe and measure. We can define science as:
“An interconnected series of concepts and conceptual schemes that have developed as a result of experimentation
and observation and are fruitful of further experimentation and observation” [95] .
Conceptual schemes mean theories and models, and hypothesis testing means generating a testable idea and then
using observation or experiment to test it. If we cannot observe something or test it by experimenting, it is
probably not a scientific idea. Scientists have an array of methods and techniques to scrutinize evidence and draw
conclusions; many of these are specialized to individual fields of study.
To understand whether a conclusion is valid, we should examine the techniques used to collect the evidence as
well as the evidence itself. We can then make our own judgment about the reliability of any associated claims.
Science should employ a rational approach, actively looking to make sense of the world logically. Scientists
describe their current understanding of a given situation with reference to the evidence, as well as an assessment
of their confidence in that understanding [96] .
Science makes certain very basic assumptions, including that objective reality exists and that we can observe and
measure it, and find out the truth about things [96] . Some adherents of pseudosciences might question these
fundamentals, making it difficult for science to compete fairly with pseudoscience.
Validity is the term research scientists use for whether their methodology actually measures what they want to
study. By assessing validity, they cue themselves to think clearly and stay on track. For example, a biologist
would consider blood tests a valid way of measuring liver function, but this technique would not work for all
biological variables.
Scientific research studies involve several formal stages. At each stage, the investigators try to remain as objective
and bias-free as possible. You can think of the stages described below as a repeating process because scientists use
their observations and experiments to develop their theories, leading to refinement of the theory, which leads to
further research questions [95,96] .
Observation : many scientific studies begin with observations. This is when somebody observes something
interesting by chance, without changing anything to see what would happen. For example, a nurse might notice
that patients with a certain condition seem to recover faster when doctors prescribe them aspirin rather than
paracetamol.
Formulate a research question : this is quite specific, but not usually something that investigators can cover in a
single study or experiment. It takes the form of “does variable A affect variable(s) B”. A research question for our
example might be ‘does aspirin improve the symptoms of the condition?’ The investigators might do several
studies to examine this.
Narrow the question : many research questions are too broad to form testable hypotheses, so scientists must
reduce them to questions they can examine in a single study or set of studies. Narrower questions for our example
could be: do patients given aspirin spend fewer nights in the hospital, do they live longer, or do they experience a
lower rate of relapse compared to those given paracetamol?
Conduct research : gather data that forms evidence from experiments or observations. Observational studies
gather data without changing the situation, whereas experiments change one variable while keeping everything
else constant. Research may be naturalistic, like our example with hospital patients, or scientists might contrive a
more tightly controlled artificial situation.
Analyze data : scientists use descriptive and inferential statistical methods to compare data against baselines, over
time, or between two or more experimental conditions. There are a huge number of methods available to analyze
qualitative and quantitative data, and this is such a specialized area that statisticians work as collaborators and
consultants in all scientific fields.
Draw a conclusion : delineate the most likely explanation of the data while also discussing alternative
explanations. This does not imply criticizing or outright rejecting alternative explanations since the chosen
interpretation could still be wrong. Further evidence might result in the scientific community re-interpreting the
result.
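As a toy illustration of the ‘conduct research’ and ‘analyze data’ stages, the sketch below simulates the hypothetical aspirin-versus-paracetamol example and compares nights spent in hospital with a two-sample t-test. All the numbers are invented for illustration; a real clinical study would involve far more careful design, ethics approval, and statistical planning.

```python
# A minimal sketch of the "analyze data" stage, using the hypothetical
# aspirin vs. paracetamol example. All numbers are simulated, not real data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated nights in hospital for two groups of 30 patients each;
# the group means (5 and 6 nights) are assumptions for the demo.
aspirin = rng.normal(loc=5.0, scale=1.5, size=30)
paracetamol = rng.normal(loc=6.0, scale=1.5, size=30)

# Descriptive statistics: summarize each group first.
print(f"aspirin mean: {aspirin.mean():.2f} nights")
print(f"paracetamol mean: {paracetamol.mean():.2f} nights")

# Inferential statistics: could a difference this large arise by chance?
t_stat, p_value = stats.ttest_ind(aspirin, paracetamol)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value says the difference is unlikely under the null hypothesis
# of no effect; it does not by itself prove that aspirin caused it.
```

This mirrors the logic of the narrowed research question: a specific, measurable outcome (nights in hospital) compared between two conditions.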
The scientific community is a key player in the continued effort of scientists to reach a better estimate of the truth.
Scientists write specialist articles and present their data at conferences to spark debate among their peers. Most
scientific papers are published in peer-reviewed journals, where experts scrutinize the article before publication [12].
This gives readers reassurance that the methods are appropriate and the data justify the conclusions.

What Is Pseudoscience?
Now that we have a clear definition of what science is and its methods, we need to define pseudoscience. As the
prefix ‘pseudo’ implies, pseudoscience refers to beliefs and activities that might resemble science but are not
science [12]. We call them ‘not science’ because they diverge from mainstream scientific theory, and in some
cases, scientific methods cannot test them either [97].
The line between science and pseudoscience is not always clear, though. Investigators working in pseudoscience
are free to employ hypothesis-testing and scientific techniques to examine evidence and conclusions. But they
sometimes make mistakes and produce misinformation in the process, and end up presenting incorrect
conclusions.
Examples of pseudoscience :
Alternative medicine : alternative therapists sometimes fail to specify how the therapy works or make general
references to things like the energy that the practitioner harnesses or directs into the client's body. It would be
difficult to devise an adequate control situation to compare to these therapies. Pseudoscientific therapies often rely
on hearsay rather than clinical trials, and this can be subject to confirmation bias and the hasty generalization
fallacy [1] .
Psychic powers : many people across the world believe in supernatural powers like extra-sensory perception and
clairvoyance. Believers and scientists alike find it difficult to test these ideas, and although many have tried, the
evidence is inconclusive [1] .
Astrology : predicting people’s personality traits and future events from the position of the stars, the Moon, and
planets is another ancient practice that appeals to people across the world. The predictions are vague and often not
falsifiable, and therefore have not been tested in a rigorous way like scientific theories [98] . Investigators have
found no correlation between star signs and personality traits [99] .
We should not confuse folk remedies and young sciences with pseudosciences. Be skeptical of ancient traditions:
they might work and might not, but age alone does not imply efficacy [95] . We should also be open-minded about
young sciences while they are still establishing their methods and world views, although further scientific investigation may
falsify them. One example is germ theory, which the scientific establishment thought was implausible at first, but
further investigations confirmed that microbes, not foul air, caused diseases [61] .
People and communities hold biases, but so do scientists. The history of science shows that socio-cultural contexts
affect how scientists work, despite their drive to be bias-free. For example, the mistaken idea that the surface of
the human brain looked like the gut influenced early scientific drawings of the brain, even though the artist could
see a real brain in front of them [95] .

Why Do People Believe In Pseudosciences?


Confirmation bias and emotional appeal are two reasons why pseudoscience draws people in, but people do not
adopt unscientific beliefs for no reason. Some pseudosciences are ancient, even predating modern science and
medicine, and in these cases, people are sticking with what they already believe. We know that people are
reluctant to change their minds once they have decided [11] .
According to surveys conducted in 1993 and 2000, most American college students did not know the difference
between astronomy (the scientific study of the cosmos) and astrology. Scientific education can help people
distinguish pseudoscience from real science, but they may hold onto the pseudoscience as well, particularly if it is
a widespread tradition like astrology. Newspapers have been publishing horoscopes for generations, and their
appearance alongside news may enhance their believability [99] .
However, demonstrations of scientific reasons behind phenomena can cause people to revise their beliefs [97,98] .
For example, many people worldwide see solar and lunar eclipses as supernatural events, but they revise their
belief when scientists demonstrate that they can predict eclipses in advance [98] .
Some evidence points to shared traits among believers of pseudoscience. People with a ‘conspiracy mentality’ and
lower knowledge of science were more likely to believe in pseudoscience. Conspiracy mentality consists of
distrust and paranoia aimed at authorities like governments, and it is linked to certain personality traits, to
conspiracy theorizing, and in turn to the rejection of scientific evidence [100].
Another characteristic of believers in pseudoscience is a more intuitive thinking style: they engage in the faster,
more automatic reasoning style that Kahneman described [11] . However, instructing people to use the slow system
and think more critically can reduce automatic belief in unfounded claims [97] .

Distinctions Between Science and Pseudoscience


One major difference between science and pseudoscience is that pseudoscience seeks confirmation, whereas
science seeks falsification. When people claim that pseudoscience effects a cure for a disease, they work back
from the result and conclude that the pseudoscientific intervention caused it [1] . Real scientists, in contrast, are
skeptical of their own findings and theories, alongside being highly motivated to discover the truth [101] .
Some pseudosciences are not testable and rely on people having faith and belief that they work or explain things.
In contrast, scientific ideas are measurable and testable by definition. However, scientific ideas sometimes
contradict common sense, and pseudoscientific ideas sometimes seem to make more sense to laypeople [101] .
A further difference is that pseudosciences do not change and develop in the same way as sciences. Sciences
usually develop gradually, with research refining theories and generating new hypotheses to investigate,
punctuated by occasional drastic paradigm shifts [1,61]. Pseudosciences, in contrast, either remain static or
change haphazardly, with new ideas having no necessary relationship to the previous ones [12,95].
Additionally, pseudoscientists do not usually publish their work in peer-reviewed journals. We can also check
their sources, as pseudoscientific practitioners do not always reference scholarly sources [95] .
Pseudosciences are popular because people would like to believe them; they are exciting and capture people’s
imagination, perhaps more so than mainstream science. Pseudosciences may be harmless, although arguably
some are not, for example if they divert people from seeking effective treatment. More generally, believing in untrue things is
perhaps disempowering because it prevents people from accessing the truth [2].

How To Approach Ideas With An Open Mind


To sum up, critical thinkers must be able to approach a potentially pseudoscientific issue with an open mind,
ready to follow the evidence and the arguments wherever they lead. Apply your skepticism and reasoning skills.
If you think an idea may be pseudoscientific, look for its sources, examine the logic of the argument, and consider
whether the author has a motive for promoting the pseudoscience, such as selling products [95]. This approach works:
both critical thinking and a skeptical approach reduce people’s belief in unfounded pseudosciences [97] . There are
a few useful ideas you can apply when reading or hearing about pseudosciences.
Firstly, remember Occam’s razor: the best theory is the one that forces you to make the fewest assumptions. The
simplest explanation is often the best. Some pseudosciences ask us to believe in unfalsifiable and unmeasurable
ideas like a universal energy that can flow through crystals, and we can apply Occam’s razor to such ideas [102] .
The simpler explanation for any reported cures is the placebo effect, a well-documented phenomenon whereby
people’s expectations can make them feel better.
Secondly, apply the burden of proof principle: when somebody asks society to believe something that
diverges from our accepted knowledge about the world, they should supply evidence to support their claims,
rather than making it society’s job to disprove them.
Thirdly, Sagan’s balance principle says that extraordinary claims require extraordinary evidence. Therefore, minor
evidence like a handful of cures or correct predictions cannot ‘prove’ a pseudoscience [102,103] .
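Sagan’s balance principle has a tidy statistical interpretation via Bayes’ theorem: if a claim starts out very improbable, even reasonably strong evidence leaves it improbable. The short Python sketch below uses invented probabilities purely to illustrate why a handful of apparent cures or correct predictions cannot rescue an extraordinary claim.

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E), where
# P(E) = P(E|H) * P(H) + P(E|not H) * (1 - P(H)).
# All probabilities below are invented purely for illustration.

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Probability of hypothesis H after observing evidence E."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# An ordinary claim (prior 0.5) with modestly supportive evidence:
print(f"ordinary claim: {posterior(0.5, 0.8, 0.2):.3f}")        # about 0.800

# An extraordinary claim (prior one in a million), same evidence:
print(f"extraordinary claim: {posterior(1e-6, 0.8, 0.2):.6f}")  # about 0.000004
```

The same evidence barely moves the extraordinary claim; to believe it, we would need evidence that would be very surprising unless the claim were true.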
Although many scientists deride pseudosciences, remember that all sciences have to start somewhere, and it takes
time to gather the evidence. By keeping an open mind and analyzing things rationally, you can keep abreast of
new developments while avoiding getting sucked in by nonsense.

Action Steps
1. Detective Work
Make a brief list of possible pseudosciences and use your skills to gather evidence and decide whether you think
they are real science or pseudoscience. If you need ideas, choose a couple of these:

Iridology
Mycology
Homeopathy
Neurochemistry
Geomorphology
Macrobiotics

2. Study Skills
Devise a scientific theory within your field of expertise, and plan an investigation. This could be something work-
related, within a leisure pastime (such as sports or creative work), or something silly and fun. Whatever you
choose, aim to be thorough. It is fine if you cannot conduct the study for real, for example if it would demand too
much time or resources, or raise ethical issues.
Work through the general scientific method to hone your idea and generate something you can test. Make casual
observations, formulate a research question, narrow this to a testable hypothesis, and consider how you would
analyze the data. If you are not a statistical expert, never fear - you can always draw a graph and compare the data
visually.
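
If you do take the visual route, a minimal sketch like the one below is enough; the data and labels are invented placeholders for whatever you measure in your own study.

```python
# A minimal visual comparison of two conditions (invented placeholder data).
import matplotlib.pyplot as plt

condition_a = [12, 15, 11, 14, 13]  # e.g. scores using your new technique
condition_b = [9, 10, 8, 11, 10]    # e.g. scores using your usual approach

plt.boxplot([condition_a, condition_b],
            labels=["New technique", "Usual approach"])
plt.ylabel("Score")
plt.title("Visual comparison of two conditions")
plt.show()
```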
Finally, consider what valid conclusions you could draw from different results. Congratulations, you have just
proved you are not a pseudoscientist!

Summary
In the anecdote at the start of the chapter, we met Marlon, who was confused by his Mom’s insistence on keeping
horse chestnuts in the corners of her apartment. She said this was a well-known way of keeping spiders out of the
home, but she could not explain why horse chestnuts put them off. This vague explanation is similar to
pseudoscience: people might believe something works, but they do not know why.
Marlon’s Mom believed the practice worked because it was traditional, and she also exhibited confirmation bias.
Even Marlon succumbed to it slightly when he reflected that he had never seen a spider in his Mom’s house.
However, such casual impressions are less reliable than objective evidence.
A scientific approach to any idea requires observation, followed by defining a solid research question that you can
test in the real world. This kind of study does not always get done for pseudoscience. In many instances, it cannot
be done because there is no adequate control condition to compare to pseudoscientific practice. Overall, science
and pseudoscience alike provide us with ample opportunities to exercise our critical faculties.

Takeaways
1. Scientific methods and processes are the most reliable ways to explore and find out about the world.
2. However, not everything that resembles science is actual science. Mistakes and misrepresentations in the form
of pseudoscience can tempt people towards incorrect conclusions.
3. Pseudosciences persist for many reasons, including inherent biases, wishful thinking, tradition, and certain
personality traits.
4. Keep an open mind about novel ideas, but remember that some ideas are more useful than others because they
help us understand and predict the world.
AFTERWORD

Marvin lazed on the decking at his lake house, watching the fish whirling around in the clear water. His work
phone vibrated on the kitchen counter, but he let it ring. He knew the call spelled no good for his summer retreat.
Hours later, the evening drew in, and Marvin finally got around to checking his missed calls. He was surprised to
see that his bank had called him and left a message. That was unexpected. He managed to call them on the
number they left and got through to an operator straight away. The line was terrible, but the voice on the other end
sounded urgent.
“I can’t hear you,” said Marvin.
“Give me your password, Mr. Keller,” the crackly voice said.
“Of course…” Marvin duly gave the operator his password and further security details.
Apparently, there was a problem with his account, which meant he had to wire his money to a different account
urgently.
Did you guess? Scammers targeted Marvin and managed to get him to transfer all his funds to them. He did not
even notice until he returned from his lake house to double trouble: his work colleagues had realized he had
ripped them off, and he realized he had gained nothing because he had fallen for a telephone banking scam.
In this example, the protagonist was lax about checking the credentials of the person calling him. The signs were
there, particularly the unexpected contact and the pressure to transfer the funds urgently. His lack of skepticism about the
call ended up costing him a lot.
Separating sense from nonsense is a massively difficult task, not least because potential deceptions bombard us all
the time, almost as if they were waiting in line for us to drop our guards. However, we can get closer to the truth
by applying critical thinking techniques to information we encounter each day. In summary:
Critical thinking approach : this means reasoning logically, using evidence rather than working to justify
conclusions we desire. Gathering information to argue for a predetermined conclusion is easy but wrong. With
critical thinking, we can be sure that our decisions are conscious, deliberate, and based on facts. We must be
clear about the difference between facts, opinions, and claims. We must know about the role emotions play in
human cognition. Lastly, we must seek evidence relating to purported facts, including researching the source of
and reason for any message.
Our complex minds : how our brains work can lead to blurred boundaries between truth and non-truth, or even
getting things completely wrong without even being aware of it. Humans are emotional creatures with a drive to
learn from and believe others, so, unsurprisingly, misinformation spreads. Furthermore, biases, fallacies, and
heuristics all have a significant influence on our thinking, sometimes without us ever becoming aware of it.
Scientific skepticism : this is an attitude that can help gauge the truth of claims. Be like a scientist and question
whether a claim you hear can be verified or falsified. Scientific skepticism means overcoming our natural
inclination to process information quickly and automatically, and instead stepping back, slowing down, and really
analyzing what we encounter. Skepticism means doubt, not necessarily disbelief, and it works best with an open-
minded outlook.
The media : social media and the mass media are the major sources of information for the vast majority these
days, but they vary in reporting accuracy. Some information can even be completely false, designed to lure people
in to spend money and/or time on websites run by shysters. Use media literacy techniques like lateral reading to
get a deeper understanding of the information you see in the media, rather than taking it at face value.
Deception : dishonesty is fairly widespread outside of the media, too. Most people are honest enough about the
things that matter, but we would all be wise to stay alert for the signs that people are lying to us. Faces, voices,
and body language all provide clues, but we should pay attention to what they say as well. Similarly, be alert to
the signs of fraudsters using scams like advance payment schemes.
Pseudosciences : these are explanations or techniques that claim a scientific basis or approach, but they are distinct
from sciences in several ways. Science uses a cycle of observation, testing, and refinement of theories and
methods, aiming to advance knowledge in a specific area. In contrast, pseudosciences are sometimes difficult to
test in a truly scientific manner. However, cynics sometimes mislabel progressive science as pseudoscience, so we
should do our best to assess new ideas in an open-minded and skeptical manner.
In conclusion, now that you have the tools required to separate fact from fiction, make sure to do your critical
thinking as well as you can and work to develop it. Critical thinking helps you recognize and avoid harmful and
useless thought patterns. It helps you to reach better conclusions. It improves the quality of your thinking, raising
your chances of achieving your goals. Good luck!
ONE FINAL WORD FROM US

If this book has helped you in any way, we’d appreciate it if you left a review on Amazon. Reviews are the lifeblood of our business. We read every
single one and incorporate your feedback in developing future book projects.

To leave an Amazon review simply click below:

(Or go to: smarturl.it/tcter or scan the code with your camera)


CONTINUING YOUR JOURNEY

Those Who Keep Learning Will Keep Rising In Life.


​— ​Charlie Munger (Billionaire, Investor, and Warren Buffett’s Business Partner)

The most successful people in life are those who enjoy learning and asking questions, understanding themselves and the world around them.
We created the Thinknetic Community so that like-minded individuals could share ideas and learn from each other.
It’s 100% free.
Besides, you’ll hear first about new releases and get the chance to receive discounts and freebies.
You can join us on Facebook by clicking the link below:

(Or go to www.facebook.com/groups/thinknetic or simply scan the code with your camera)


REFERENCES

1. Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills . Chantilly, Va.:
The Teaching Company.
2. Gilovitch, T. (1993). How We Know What Isn’t So: The Fallibility Of Human Reasoning In Everyday
Life . New York: The Free Press.
3. The Foundation For Critical Thinking (2019). Critical Thinking: Where To Begin . Available at:
https://www.criticalthinking.org/pages/critical-thinking-where-to-begin/796 (Accessed: 14th December
2020)
4. Taber, C.S. & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American
Journal Of Political Science , 50(3), pp. 755-769. doi: 10.1111/j.1540-5907.2006.00214.x
5. Stanovich, K.E., West, F.R., Toplak, M.E. (2013). Myside Bias, Rational Thinking, and Intelligence.
Current Directions in Psychological Science , 22(4) pp. 259–264. doi: 10.1177/0963721413480174
6. Russell, J. A. (2003). Core affect and the psychological construction of emotion. Psychological Review ,
110 (1), 145–172. doi: 10.1037/0033-295X.110.1.145
7. Kozlowski, D., Hutchinson, M., Hurley, J., Rowley, J., Sutherland, J. (2017). The role of emotion in
clinical decision making: an integrative literature review. BMC Medical Education , 17(1), p255. doi:
10.1186/s12909-017-1089-7
8. Rauscher, F.H., Shaw, G.L. & Ky, K.N. (1993). Music and spatial task performance. Nature , 365, p611.
doi: 10.1038/365611a0
9. Nantais, K. & Schellenberg, G.E. (1999). The Mozart effect: an artifact of preference. Psychological
Science 10(4), pp370-373. doi: 10.1111/1467-9280.00170
10. Pietschnig, J., Voracek, M., & Formann, A. K. (2010). Mozart effect–Shmozart effect: A meta-analysis.
Intelligence , 38(3), pp314–323. doi: 10.1016/j.intell.2010.03.001
11. Kahneman, D. (2011). Thinking, Fast And Slow . New York: Farrar, Straus and Giroux.
12. Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense (2nd ed.).
Belmont, CA.: Thomson/Wadsworth.
13. Frank M.C., Vul E., Johnson S.P. (2009). Development of infants' attention to faces during the first year.
Cognition , 110 (2), pp160-170. doi: 10.1016/j.cognition.2008.11.010.
14. Bower, G. H., Monteiro, K. P., & Gilligan, S. G. (1978). Emotional mood as a context for learning and
recall. Journal of Verbal Learning & Verbal Behavior , 17(5), pp573–585. doi: 10.1016/S0022-
5371(78)90348-1.
15. Bower, G. H. (1981). Mood and memory. American Psychologist, 36 (2), pp129–148. doi:
10.1037/0003-066X.36.2.129
16. Heck, P.R., Simons, D.J., Chabris, C.F. (2018) 65% of Americans believe they are above average in
intelligence: Results of two nationally representative surveys. PLoS ONE , 13 (7), e0200103. doi:
10.1371/journal.pone.0200103
17. Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science , 185 ,
pp1124-1130. doi: 10.1126/science.185.4157.1124
18. Russell, J.A. (2003) Core affect and the psychological construction of emotion. Psychological Review ,
110 (1), pp145-172 doi: 10.1037/0033-295x.110.1.145
19. Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in
probability judgment. Psychological Review , 90 , 293-315. doi:10.1037/0033-295X.90.4.293
20. Yap, A. (2013) Ad Hominem Fallacies, Bias, and Testimony. Argumentation, 27 (2), pp97-109. doi:
10.1007/s10503-011-9260-5
21. Walton, D.N. (1987) The ad Hominem argument as an informal fallacy. Argumentation, 1 , pp317–331.
doi: 10.1007/BF00136781
22. Walton, D. (1999) Rethinking the Fallacy of Hasty Generalization. Argumentation, 13 , pp161–182. doi:
10.1023/A:1026497207240
23. Law, S (2006) Thinking tools: The bandwagon fallacy. Think , 4 (12), pp. 111. doi:
10.1017/S1477175600001792
24. Asch, S. E. (1956). Studies of independence and conformity: I. A minority of one against a unanimous
majority. Psychological Monographs: General and Applied , 70 (9), 1–70. doi: 10.1037/h0093718
25. Sternberg, R.J. & Halpern, D.F. (Eds.) (2020) Critical Thinking In Psychology (2nd Ed.). Cambridge,
UK: Cambridge University Press.
26. Fiol, C.M. & O'Connor, E.J. (2003). Waking up! Mindfulness in the face of bandwagons. Academy of
Management Review , 28 , pp 54-70. doi: 10.5465/AMR.2003.8925227.
27. Rosenkopf, L., Abrahamson, E. (1999). Modeling Reputational and Informational Influences in
Threshold Models of Bandwagon Innovation Diffusion. Computational & Mathematical Organization
Theory , 5 , pp361–384 doi: 10.1023/A:1009620618662.
28. Nickerson, R. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General
Psychology , 2 , pp175-220. doi: 10.1037/1089-2680.2.2.175
29. Wilson, T. D., Houston, C. E., Etling, K. M., & Brekke, N. (1996). A new look at anchoring effects:
Basic anchoring and its antecedents. Journal of Experimental Psychology: General , 125 , pp387-402.
doi: 10.1037/0096-3445.125.4.387
30. Ross, L., Greene, D., House, P. (1977) The “false consensus effect”: An egocentric bias in social
perception and attribution processes. Journal of Experimental Social Psychology , 13 (3), pp279-301.
doi: 10.1016/0022-1031(77)90049-X.
31. Marks, G., & Miller, N. (1987). Ten years of research on the false-consensus effect: An empirical and
theoretical review. Psychological Bulletin , 102 (1), 72–90. doi: 10.1037/0033-2909.102.1.72
32. Gilovich, T. (1990). Differential construal and the false consensus effect. Journal of Personality and
Social Psychology , 59 (4), pp623–634. doi: 10.1037/0022-3514.59.4.623
33. Nisbett, R. E., & Wilson, T. D. (1977). The halo effect: Evidence for unconscious alteration of
judgments. Journal of Personality and Social Psychology , 35 (4), pp250–256. doi: 10.1037/0022-
3514.35.4.250
34. Baddeley, A (1997). Human Memory: Theory And Practice . (Revised Ed.). Hove, UK: Psychology
Press.
35. Tulving, E. (1983). Elements Of Episodic Memory . New York: Oxford University Press.
36. Festinger, L. (1957). A Theory Of Cognitive Dissonance . Stanford, CA: Stanford University Press.
37. Miller, M.K., Clark, J.D., Jehle, A. (2015) Cognitive Dissonance Theory (Festinger). In: The Blackwell
Encyclopaedia Of Sociology. doi.org/10.1002/9781405165518.wbeosc058.pub2
38. Little, R.J., D'Agostino, R., Cohen, M.L., Dickersin, K., Emerson, S.S., Farrar, J.T., Frangakis, C.,
Hogan, J.W., Molenberghs, G., Murphy, S.A., Neaton, J.D., Rotnitzky, A., Scharfstein, D., Shih, W.J.,
Siegel, J.P., Stern, H. (2012) The prevention and treatment of missing data in clinical trials. New
England Journal Of Medicine , 367 (14), pp1355-60. doi: 10.1056/NEJMsr1203730
39. Ayton, P., & Fischer, I. (2004) The hot hand fallacy and the gambler’s fallacy: Two faces of subjective
randomness? Memory & Cognition , 32 , pp1369–1378. doi: 10.3758/BF03206327
40. The Editors of Encyclopaedia Britannica (2016). Verifiability Principle . Encyclopædia Britannica.
Available at https://www.britannica.com/topic/verifiability-principle (Accessed January 15, 2021)
41. American Institute Of Physics (2018). Science Strategies Chart Course for Detecting Life on Other
Worlds . https://www.aip.org/fyi/2018/science-strategies-chart-course-detecting-life-other-worlds
(Accessed 1 February 2021)
42. Ayer, A. J. (1936). Language, Truth, And Logic . London, UK: V. Gollancz.
43. Shankar, S. (2017) Verifiability And Falsifiability As Parameters For Scientific Methodology.
International Journal of Education & Multidisciplinary Studies , 7 (2), pp130-137. doi:
10.21013/jems.v7.n2.p10
44. Popper, K. (1963) Conjectures And Refutations: The Growth Of Scientific Knowledge . London, UK:
Routledge & Kegan Paul.
45. Neyman, J.; Pearson, E. S. (1933). The testing of statistical hypotheses in relation to probabilities a
priori. Mathematical Proceedings of the Cambridge Philosophical Society , 29 (4), pp492–510. doi:
10.1017/s030500410001152x.
46. Schupbach, J., & Sprenger, J. (2011). The Logic of Explanatory Power. Philosophy of Science , 78 (1),
pp105-127. doi:10.1086/658111
47. Arditti, J., Elliott, J., Kitching, I. & Wasserthal, L. (2012). ‘Good Heavens what insect can suck it’–
Charles Darwin, Angraecum sesquipedale and Xanthopan morganii praedicta. Botanical Journal of the
Linnean Society , 169 , pp403–432. doi: 10.1111/j.1095-8339.2012.01250.x.
48. Grafman, J. (2000) Conceptualizing functional neuroplasticity. Journal of Communication Disorders , 33
(4), 345-356, doi: 10.1016/S0021-9924(00)00030-7.
49. Liu, D.W.C. (2012) Science Denial and the Science Classroom. CBE - Life Sciences Education , 11 (2)
pp129-134.
50. Sagan C. (1987) The Burden Of Skepticism. Skeptical Inquirer , 12 (1)
https://skepticalinquirer.org/1987/10/the-burden-of-skepticism/
51. Dwyer, C. (2017). Critical Thinking: Conceptual Perspectives and Practical Guidelines . Cambridge:
Cambridge University Press. doi:10.1017/9781316537411
52. Truzzi, M. (1987) On Pseudo-Skepticism. Zetetic Scholar , 12/13 , pp3-4.
53. Snelson, J.S. (1992) The Ideological Immune System: Resistance to New Ideas in Science. Skeptic , 1
(4).
54. Çakici, D., Metacognitive Awareness and Critical Thinking Abilities of Pre-Service EFL Teachers,
Journal of Education and Learning , 7 (5) pp116-129. doi: 10.5539/jel.v7n5p116
55. Flavell, J. (1979). Metacognition and Cognitive Monitoring: A New Area of Cognitive-Developmental
Inquiry. American Psychologist , 34 , 906-911.
56. Schraw, G. (1998) Promoting general metacognitive awareness. Instructional Science, 26 , pp113–125.
doi: 10.1023/A:1003044231033
57. Paul, R. & Elder, L. (2013) Critical Thinking: Intellectual Standards Essential to Reasoning Well
Within Every Domain of Human Thought, Part Two. Journal Of Developmental Education , 37 (1).
58. Wilterdink, N. (2002). The sociogenesis of postmodernism. European Journal of Sociology , 43 (2),
pp190-216.
59. Duignan, B. (2020) Postmodernism. Encyclopedia Britannica ,
https://www.britannica.com/topic/postmodernism-philosophy. (Accessed 22 January 2021) .
60. Dennett, D.C. (2013). On Wieseltier V. Pinker in The New Republic: Let's Start With A Respect For
Truth. Edge , https://www.edge.org/conversation/daniel_c_dennett-dennett-on-wieseltier-v-pinker-in-the-
new-republic. (Accessed 22 January 2021) .
61. Kuhn, T. S. (1970). The structure of scientific revolutions (2nd Ed.). University of Chicago Press:
Chicago.
62. Watson, C.A. (2018) Information Literacy in a Fake/False News World: An Overview of the
Characteristics of Fake News and its Historical Development. International Journal of Legal
Information , 46 (2), pp. 93-96.
63. McDermott, R (2019) Psychological Underpinnings of Post-Truth in Political Beliefs. Political Science
& Politics , 52 (2), pp218-222.
64. Vosoughi, S., Roy, D., Aral, S. (2018) The spread of true and false news online. Science , 359 , pp1146-
1151. doi: 10.1126/science.aap9559
65. Aral, S. & Van Alstyne, M.W. (2011). The Diversity-Bandwidth Tradeoff. American Journal of
Sociology , 117 (1), doi: 10.2139/ssrn.958158
66. Itti, L. & Baldi, P. (2009). Bayesian surprise attracts human attention, Vision Research , 49 (10), pp1295-
1306. doi: 10.1016/j.visres.2008.09.007.
67. Vuilleumier P. (2005). How brains beware: neural mechanisms of emotional attention. Trends In
Cognitive Science , 9 (12), pp585-94. doi: 10.1016/j.tics.2005.10.011
68. Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and
Its Correction: Continued Influence and Successful Debiasing. Psychological Science in the Public
Interest, 13 (3), 106–131. doi: 10.1177/1529100612451018
69. Osborne, C.L. (2018). Programming to Promote Information Literacy in the Era of Fake News.
International Journal of Legal Information , 46 (2), pp101-109. doi: 10.1017/jli.2018.21
70. LaGarde, J. & Hudgins, D. (2018) Fact Vs. Fiction: Teaching Critical Thinking Skills in the Age of Fake
News . International Society for Technology in Education.
71. Niedringhaus, K.L (2018). Information Literacy in a Fake/False News World: Why Does it Matter and
How Does it Spread? International Journal of Legal Information , 46 (2), pp97-100.
doi:10.1017/jli.2018.26
72. Murch, S.H., Anthony, A., Casson, D.H., Malik, M., Berelowitz, M., Dhillon, A.P., Thomson, M.A.,
Valentine, A., Davies, S.E., Walker-Smith, J.A. (2004) Retraction of an interpretation. Lancet . 363
(9411):750. doi: 10.1016/S0140-6736(04)15715-2. Erratum for: Lancet . 1998 Feb 28;351(9103):637-
41.
73. Shearer, E. & Gottfried, J. (2017). News Use Across Social Media Platforms 2017, Pew Research Center
. https://www.journalism.org/2017/09/07/news-use-across-social-media-platforms-2017/
74. Blakeslee, Sarah (2004) "The CRAAP Test," LOEX Quarterly , 31 (3). Available at:
commons.emich.edu/loexquarterly/vol31/iss3/4
75. Fielding, J.A. (2019) Rethinking CRAAP: Getting students thinking like fact-checkers in evaluating web
sources. College & Research Libraries News, 80 (11), pp.620-622. doi: 10.5860/crln.80.11.620
76. Wineburg, S. & Mcgrew, S. (2017) Lateral Reading: Reading Less and Learning More When Evaluating
Digital Information. Stanford History Education Group Working Paper No. 2017-A1 , Available at
http://dx.doi.org/10.2139/ssrn.3048994
77. Edelman trust barometer 2021. Available at https://www.edelman.com/sites/g/files/aatuss191/files/2021-
01/2021-edelman-trust-barometer.pdf
78. Society of Professional Journalists (2014). SPJ Code Of Ethics . https://www.spj.org/ethicscode.asp
[accessed 12 Feb 2021]
79. Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And Marriage . New York:
W.W. Norton.
80. Vrij, A. (2004). Guidelines to catch a liar. In P. Granhag & L. Strömwall (Eds.), The Detection of
Deception in Forensic Contexts (pp. 287-314). Cambridge: Cambridge University Press.
doi:10.1017/CBO9780511490071.013
81. DePaulo, B., & Morris, W. (2004). Discerning lies from truths: Behavioural cues to deception and the
indirect pathway of intuition. In P. Granhag & L. Strömwall (Eds.), The Detection of Deception in
Forensic Contexts (pp. 15-40). Cambridge: Cambridge University Press.
doi:10.1017/CBO9780511490071.002
82. Arciuli, J., Mallard, D., & Villar, G. (2010). “Um, I can tell you're lying”: Linguistic markers of
deception versus truth-telling in speech. Applied Psycholinguistics , 31 (3), pp397-411.
doi:10.1017/S0142716410000044
83. Rockwell, P., Buller, D., & Burgoon, J. (1997). Measurement of deceptive voices: Comparing acoustic
and perceptual data. Applied Psycholinguistics, 18 (4), 471-484. doi:10.1017/S0142716400010948
84. Bull, R. (2004). Training to detect deception from behavioural cues: Attempts and problems. In P.
Granhag & L. Strömwall (Eds.), The Detection of Deception in Forensic Contexts (pp. 251-268).
Cambridge: Cambridge University Press. doi:10.1017/CBO9780511490071.011
85. Knapp, M. (2006). Lying and Deception in Close Relationships. In A. Vangelisti & D. Perlman (Eds.),
The Cambridge Handbook of Personal Relationships , pp. 517-532). Cambridge: Cambridge University
Press. doi:10.1017/CBO9780511606632.029
86. Levine, T.R. (2014). Truth-default Theory (TDT): A Theory of Human Deception and Deception
Detection. Journal of Language and Social Psychology, 33 , pp378-92. doi:
10.1177/0261927X14535916
87. Serota, K.B., Levine, T. & Boster, F.J. (2010). The Prevalence of Lying in America: Three Studies of
Self-Reported Lies. Human Communication Research, 36 , pp2-25
88. Levine, T.R. (2015). New and Improved Accuracy Findings in Deception Detection Research. Current
Opinion in Psychology , 6 , pp1-5 doi: 10.1016/j.copsyc.2015.03.003.
89. Clough, J. (2010). Fraud. In Principles of Cybercrime (pp. 183-220). Cambridge: Cambridge University
Press. doi:10.1017/CBO9780511845123.008
90. Hancock, P. (2015). The Psychology of Deception. In Hoax Springs Eternal: The Psychology of
Cognitive Deception (pp. 61-71). Cambridge: Cambridge University Press.
91. Federal Trade Commission (2020). How To Avoid A Scam . https://www.consumer.ftc.gov/articles/how-
avoid-scam [Accessed 7 February 2021]
92. Citizens Advice (2019) Check If Something Might Be A Scam.
https://www.citizensadvice.org.uk/consumer/scams/check-if-something-might-be-a-scam/ [Accessed 7
February 2021]
93. NSW Government. Misleading Representations And Deceptive Conduct .
https://www.fairtrading.nsw.gov.au/buying-products-and-services/advertising-and-pricing/misleading-or-
deceptive-conduct [Accessed 13 February 2021]
94. Evon, D. (2015) Natural repellent for spiders? Snopes.com. Available at https://www.snopes.com/fact-
check/walnut-and-spiders/ [Accessed 6 February 2021]
95. Harrington, M. (2020). The Varieties of Scientific Experience . In The Design of Experiments in
Neuroscience (pp. 1-12). Cambridge: Cambridge University Press. doi:10.1017/9781108592468.002
96. Gauch, H. (2012). Scientific Method in Brief . Cambridge: Cambridge University Press.
doi:10.1017/CBO9781139095082
97. Bensley, D. (2020). Critical Thinking and the Rejection of Unsubstantiated Claims . In R. Sternberg &
D. Halpern (Eds.), Critical Thinking in Psychology (pp. 68-102). Cambridge: Cambridge University
Press. doi:10.1017/9781108684354.005
98. Narlikar, J. (2005). Astronomy, Pseudoscience, and Rational Thinking. Highlights of Astronomy , 13 ,
1052-1054. doi:10.1017/S1539299600018116
99. Percy, J., & Pasachoff, J. (2005). Astronomical pseudosciences in North America . In J. Pasachoff & J.
Percy (Eds.), Teaching and Learning Astronomy: Effective Strategies for Educators Worldwide (pp. 172-
176). Cambridge: Cambridge University Press. doi:10.1017/CBO9780511614880.026
100. Landrum, A.R. & Olshansky, A. (2019) The role of conspiracy mentality in denial of science and
susceptibility to viral deception about science. Politics and the Life Sciences , 38 (2), pp193-209
101. Lakatos, I. (1978). Introduction: Science and pseudoscience . In J. Worrall & G. Currie (Eds.), The
Methodology of Scientific Research Programmes: Philosophical Papers (pp. 1-7). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511621123.002
102. Bridgstock, M. (2009). Modern skepticism . In Beyond Belief: Skepticism, Science and the Paranormal
(pp. 86-110). Cambridge: Cambridge University Press. doi:10.1017/CBO9780511691676.006
103. Sagan, C. (1997). The Demon-Haunted World . London: Headline.
DISCLAIMER

The information contained in this book and its components is meant to serve as a comprehensive collection of
strategies that the author of this book has done research about. Summaries, strategies, tips and tricks are only
recommendations by the author, and reading this book will not guarantee that one’s results will exactly mirror the
author’s results.
The author of this book has made all reasonable efforts to provide current and accurate information for the readers
of this book. The author and its associates will not be held liable for any unintentional errors or omissions that
may be found.
The material in the book may include information by third parties. Third party materials comprise opinions
expressed by their owners. As such, the author of this book does not assume responsibility or liability for any
third party material or opinions.
The publication of third party material does not constitute the author’s guarantee of any information, products,
services, or opinions contained within third party material. Use of third party material does not guarantee that
your results will mirror our results. Publication of such third party material is simply a recommendation and
expression of the author’s own opinion of that material.
Whether because of the progression of the Internet, or the unforeseen changes in company policy and editorial
submission guidelines, what is stated as fact at the time of this writing may become outdated or inapplicable later.
This book is copyright ©2021 by Thinknetic with all rights reserved. It is illegal to redistribute, copy, or create
derivative works from this book whole or in parts. No parts of this report may be reproduced or retransmitted in
any forms whatsoever without the written, expressed, and signed permission from the author.
