The Critical Thinking Effect_ Uncover the Secrets of Thinking Critically and Telling Fact From Fiction (Critical Thinking Logic Mastery)
THINKNETIC
Did You Know That 93% Of CEOs Agree That This Skill Is More Important Than Your College Degree?
How to shortcut the famous Malcolm Gladwell "10,000 Hours Rule" to become an expert critical thinker, fast
What a WW2 pilot and the people of Romania can teach you about critical thinking - this is the KEY to not making huge mistakes
Actionable, easy exercises to drill home every point covered in the book. You won't "read and forget" this book
Our educational system simply doesn't teach us how to think...
...and it's unlikely this is information you've ever learned anywhere else - until now.
A glimpse into what you'll discover inside:
If your thinking is flawed and what it takes to fix it (the solutions are included)
Tried and true hacks to elevate your rationality and change your life for the better
Enlightening principles to guide your thoughts and actions (gathered from the wisest men of all time)
Introduction
Afterword
One Final Word From Us
Continuing Your Journey
References
Disclaimer
INTRODUCTION
Have you ever wondered why there seems to be so much misinformation out there? With fake news stories
reaching a hundred times more users on Twitter than real news stories, perhaps it is no wonder we have a problem
separating the facts from nonsense. That leads us to another question: who comes up with the facts, and how do
they know what is real? Why do some people believe official or scientific explanations, whereas others prefer to
believe alternatives?
Is deception an unavoidable part of human existence? Can we develop techniques to cope with this influx of
potential deception and get to the truth?
We all learned critical thinking at school and college, but how often do we apply it in everyday life? Even more
importantly, how often do we interact with others who fail to apply a critical thinking approach to their work and
personal lives? Many people happily believe in nonsensical ideas, whether these be innocent misconceptions or
(in some cases) dangerous misinformation. By developing your critical thinking skills even further, you can learn
to deal with these difficult people and present a coherent case for why they should perhaps research their views
further.
A critical thinking mindset also helps us be more efficient and effective at work, enabling us to focus on relevant
information and discard unconnected or inaccurate information. This leaves more time for the things and people
we enjoy. If you strive for balance, it is worth taking the time to hone these skills and learn to apply them in a
broader range of situations.
Many of us are perfectionists, and there is nothing wrong with that. We know that perfectionists deliver better
results due to their high standards, but why not streamline your perfectionism and do the same or better with less
effort? That is where the techniques in this book come in.
This book walks you through the details of using critical thinking to separate truth from falsehood. It contains
specific examples, illustrative stories, thorough explanations, and practice exercises; all backed up by a range of
scientific studies conducted by experts in their fields.
Each chapter is self-contained and logically organized, meaning you do not have to read it all in one sitting to
make sense of the contents. You will understand how to discern the truth in a variety of contexts, from online
news to scientific claims, to personal interactions. You will learn more about the scientific method, including
scientific skepticism and how to tell science from non-science.
You will also find out how your mind might block you from figuring out the truth with a discussion of common
biases, heuristics, and fallacies, including ways to overcome them. Knowing more about these will also help you
deal more effectively with difficult people, including those trying to sell you phony ideas or dodgy products.
By learning more about how to sort the truth from the lies, you can reach better conclusions and make better
decisions. You can also help others who do not yet have this knowledge to feel more confident in themselves.
This book’s author is Camilla J. Croucher. She graduated with a Ph.D. in cognitive science from the University of
Cambridge in 2007 and then completed a postdoctoral fellowship at City, University of London. Her academic
specialisms include emotional memory, visual processing, and judgment and decision-making. She is outstanding
at statistics but does not want to bore you with that. She has also worked in retail, clinical research management,
e-commerce, learning support, and (of course) freelance writing. Her interest in the illusions created by the eye
and mind remained constant across each of these roles, and she very much enjoyed composing this text for
you. The book is not a definitive guide on thinking critically, and it is not a philosophy text. Instead, it summarizes
reliable research findings and experts’ opinions and suggests techniques and practice exercises that you can use in
your daily life. So, are you ready to explore what it means to be a critical thinker?
1
Chancellor Sham Anderson took a deep breath as she hung up the phone. Marvin Keller, the Chair of the
University Finance Committee, was not answering her calls. That morning's big news was the resignation of
the Director of Loofberger Inc., which sent the company's stock into a sudden plummet.
Naturally, Anderson now had serious concerns about the University's multi-million dollar investment in that very
company. The decision to invest in a new corporation touting a completely novel product was risky, of course.
However, the University had the funds, and there was huge pressure from the science and technology faculty to invest.
Building a new space research center would allow the University to build probes to look for life on various moons
within our solar system. This would not be possible without a vast amount of money.
The company performed excellently from the outset, suggesting a low risk. The Finance Committee quickly
decided the kudos was worth it. They would invest in Loofberger.
Soon, the first hints emerged that Loofberger was in trouble. The product was not selling as expected. The share
price slowly declined. Nevertheless, Keller assured the Committee that this was a blip and that things would pick
up soon.
Sham startled as her cell phone rang: an unknown number. A somber voice confirmed that Loofberger had gone
bust: the University lost its entire investment.
Hindsight revealed several clues.
Marvin Keller had failed to disclose that his brother-in-law was also Loofberger's Director. As a lifelong friend as
well as a colleague, Anderson had not thought to question his recommendation. A few months after finalizing the
deal, Keller took an extended sabbatical to work on special projects at his lake house.
What about Loofberger's initial promising market data? It turned out to be too good to be true—a fairly clever
fake.
The University never got its space center, and many of those staff relocated to other, less embarrassing
organizations. Anderson’s peers advised her to take early retirement, which she did. The student body voted to
paint the canteen ceiling black in memory of the loss, and little paint chips flaked off, making it look like the night
sky.
The finance expert in this scenario had questionable motives, but nobody had thought to question them. Expert
advice is often trustworthy, but we should not take everything at face value. Question how you know that source
is an expert and how they know what they know.
Making complex decisions is difficult when we have limited information, but it is still important to investigate its
source and content by gathering evidence to support our decisions and conclusions.
Critical thinking is a powerful approach that can help you to make better decisions. Critical thinkers do not
passively receive information. Instead, they apply the rules of logic and their own experience to properly interpret
the messages they hear and see [1].
Facts
What exactly is a fact? More importantly, how do you know something is a fact? What if it is merely an opinion
or a claim?
A fact is a piece of information that we can verify. We can observe it, or we can find out it is true from a reliable
source. For example, the atomic number of carbon is six; the US Civil War took place between 1861 and 1865;
Armstrong was the first to walk on the Moon. These are all facts.
Without diving deep into philosophy, we can note here that ‘truth’ is an abstract idea. Nobody can say for certain
what is real (or fake, come to that). Later, when we look at scientific skepticism, we will examine this in more
detail. For now, let’s assume that truth can exist and that critical thinking at least brings us closer to it.
As a critical thinker, you should inspect any so-called fact you encounter. How do you know it is a fact and not an
assumption? Try to verify the ‘fact’ yourself; it may be an assumption if it cannot be verified. On investigating,
you may find the assumption is incorrect.
In some cases, assumptions are the best we can do. For example, if you were designing a novel product, at first,
you might assume that it would appeal to customers who buy related products. Later, you could gain evidence
using market research.
When you investigate a given fact, you might find that it is outdated or even a total misconception. Scrutinize
facts, and learn to recognize the good facts and reject the bad ones.
Opinions
Opinions may resemble facts, but they are subjective judgments. People often misrepresent opinions as facts,
perhaps because strongly held opinions may even feel factual. Opinions are always evaluative or comparative,
even if they use the same form as a fact by stating that something ‘is’ something. Saying that something is the best
must, therefore, be an opinion.
Take this statement:
“Joseph Bloggs is the best downhill skier because they have won the most gold medals.”
This sentence is an opinion based on a fact. You can verify or falsify the fact that they won most medals by
reading medals tables. The opinion that such a fact makes Joseph Bloggs the best downhill skier cannot be
verified: it is somebody’s perspective.
A new skier may be the best, even when they have not won anything yet. They might be able to beat Joseph
Bloggs in every race, but if medal count is taken as the measure of skiing ability, the new skier cannot be said to
be the best.
Our motivations, attitudes, and emotional states have huge effects on our opinions [2,4] . This renders opinions
vulnerable to all sorts of biases; not surprisingly, two people with identical information can very easily hold
opposite opinions. Of course, opinions can change completely over time and need not be based on facts at all.
Claims
Like opinions, claims are often wrongly presented as facts. Claims may be factual, but a claim is by definition an
assertion presented without proof. Therefore, distinguishing claims from facts is easy; you just need to check
whether the source supplies any evidence for the claim.
Claims can be implied rather than stated. ‘Before and after’ photos in beauty adverts are a good example. The
adverts may or may not overtly claim that the treatment improves the skin, but the skin certainly looks healthier in
the ‘after’ photo.
Companies produce adverts to make viewers spend money rather than showing them the truth, resulting in
advertisers presenting claims as facts. But claims crop up in the wild, too.
Conspiracists claim that mankind did not land on the Moon in 1969 and that NASA faked the mission using camera
tricks in a television studio. We can say this is a claim because there is no evidence of the proposed fakery.
A fake Moon landing would entail faking a lot of evidence. Fake technical data and fake filmed footage are only
the beginning. NASA would have had to persuade their entire staff to give fake testimony, not to mention
produce fake paperwork.
Evidence
It is not just conspiracy nuts who persist even when faced with overwhelming evidence against their beliefs [2] .
We all do it. At times, we are all guilty of ignoring or misunderstanding evidence. This leads us to an important
question: what exactly is evidence, and how should we use it?
Evidence is an everyday term, but as critical thinkers, we need a more technical definition. Evidence refers to a
body of information that supports a given position.
We typically use evidence to prove or disprove a belief or decide whether a judgment or opinion is valid. Of
course, you need evidence from different sources.
A good body of evidence comes from multiple reliable sources. Imagine overhearing a conversation at a party.
Somebody claims that ‘investments are a great way to make money.’ A successful investor is listening; he nods
enthusiastically and starts bragging about the huge profits he has made. Wouldn’t you want to hear the other side?
The more evidence supports a conclusion, the more likely that conclusion is to be true. You might collect evidence
from pre-existing sources or decide to gather your own.
Picture a range of experts who are interested in why people fall into problem gambling. The medic does not agree
that sociology surveys are the best way to research this, but the sociology professor thinks they are the only way
that makes sense.
However, the two researchers would examine different aspects of addiction. The medic in this example decides to
look at physical differences in the bodies and brains of addicts and non-addicts; perhaps pre-existing variation
predicts who can gamble casually without becoming addicted. In contrast, the sociologist wants to look at
socioeconomic factors like gamblers’ family situations, housing issues, and poverty.
The gambling study could involve neuroscience, interviews with gamblers, big data science, and more, in addition
to surveys and clinical studies. All these approaches are helpful because they look at the problem at different
levels. The resulting body of evidence, taken together and processed according to good logic, could generate more
robust data than the medic or the professor alone. The group can investigate all potential causes of gambling and
compare how well all the different factors predict who becomes a problem gambler.
In conclusion, uncertainty is a good thing because it drives us to examine problems in more depth. You can never
gather all the facts or examine all the evidence. The best you can do is test your ideas and beliefs and improve
them as you go along, based on a wide range of evidence.
Source Of Message
Firstly, find out about the source. Sources are individuals or organizations, and the following advice applies to
both.
A source may be an expert on some topics and naive about others. Sources may be biased, have special interests
in certain topics, or hold pet theories. They may be more or less reliable, more or less trustworthy. Think about the
following aspects of the source:
Is it an academic or government publication? We can generally assume these are more trustworthy than
commentators, because their vested interest lies in providing accurate information for the population, whereas
commentators’ motivations are more variable.
Is the source paid (or rewarded in some other way) for conveying the message? Publishers can and do pay experts
to communicate specific information.
Where do they get their information? Is it a primary or secondary source? Secondary sources can misquote
primary sources. They might even treat other secondary sources as if they were primary sources. This magnifies
errors and misconceptions. Find the original information if you want to assess it fairly.
What does your expertise tell you? If the source is somebody you know, perhaps you know that they make
outlandish claims quite often. This could factor into your assessment.
When analyzing messages, especially from people you know, remember that people’s reasoning skills vary. The
source may not be aware of all the aspects just described, and they may feel that they have made a very good case.
Perhaps with a good debate, you can help them to improve.
At times, we all forget our deep-seated assumptions and motivations. Do not forget that critical thinking takes
practice.
Purpose Of Message
Next, examine why the source composed the message. Knowing a message’s purpose may alert you to possible
distortions and half-truths. What was their real motivation?
Here, you need to view the message’s fine details. If it is on a website, what kind of website? For example,
somebody’s private blog has a different purpose from a government website. See whether they have declared any
interest in the products or topics they discuss; like influencers, blog writers are often given ‘freebies’ in exchange
for promoting the product.
A message might not be an obvious advert, but still be a promotional text. For example, companies often feature
blogs about their products and services; you would not necessarily take these texts at face value. Instead, think
about what interest the company might have in the topic: web traffic, affiliate links, direct purchases, or simply to
get you reading more about pet insurance.
People make persuasive messages for many reasons, and they can be subtle. Analyze the language to detect
whether the message might be covert persuasion rather than unbiased information. Persuasive texts may feature
many adjectives and adverbs, chatty language, and high-school rhetorical devices like alliteration and the ‘rule of
three.’
Word choices also reveal the author or speaker’s biases and opinions. Say you are reading reviews of a scientific
book about climate change. One reviewer refers to the ‘climate scare,’ whereas the other calls it the ‘climate
emergency.’ They hold opposite opinions, but in context, both phrases refer to the same thing.
Another aspect of purpose is that the source may prefer one conclusion or decision from the outset. They might
then filter out and distort the evidence to support the position they have already chosen. You can tackle this issue
by using alternative sources to research the topic and filling in those gaps yourself.
Field Of Work
As well as the source’s motivation and the message’s purpose, you must understand at least something about their
field of work. This is even more important if it is not your specialty. You need to get to grips with the basics.
Firstly, what are the fundamental goals? Imagine a hospital where radiographers and nurses work together to
produce and analyze magnetic resonance images. Radiographers aim to produce the best images possible, whereas
nurses aim to keep patients comfortable and well-informed about the procedure. Sometimes these goals might
clash since the scanning procedure is uncomfortable and noisy. Specialist staff at all workplaces need to work
together in this way to be effective.
Similarly, to assess the truth or falsehood of a message, you must understand the sphere the source of the message
works in. This contextual information enables you to judge the message on its own merits. Further, there is no
point in judging the quality of a radiographer’s work in the same way you would judge nursing care.
Secondly, what basic concepts or assumptions does the source employ? Individuals may not even be aware of
their basic assumptions, but you, as a critical thinker, should be able to discern them.
In everyday life, a basic assumption might be that when you enter a table service restaurant, you wait in line, and
then somebody shows you to a table. You do not have to ask somebody what to do; you just know. Similarly,
physicists assume that light’s speed is a universal constant; they do not attempt to measure it in every experiment.
Finally, what kinds of data do they use to expand their knowledge and inform their decisions? Whether you agree
with the specific methods or not, try to assess them fairly rather than from a prejudiced position. Be flexible yet
rigorous, like a scientist. Research the message behind the message you receive, and put your critical thinking
skills to good use.
Use critical thinking to evaluate your chosen fact systematically. Use these questions as guidance:
Feel free to ask any other relevant questions you can think of, based on what we have looked at in this chapter.
Now that you have done this once, you have a framework for assessing messages you receive using critical
thinking.
2. Observational Study
Firstly, visualize a person you think has good critical thinking skills. Write a few notes about them using the
questions below, or make a mind map.
What kind of person are they? What have they said or done that makes you think they are great at critical
thinking? What outcomes do they produce?
Examine the evidence you have written down, and conclude whether this person is a good critical thinker. Perhaps
bring this exercise to mind next time you speak to them or witness their critical thinking, and make a few more
observations.
Now repeat the same process for somebody you think has poor critical thinking skills, including what makes you
think they are bad at critical thinking. Put your notes or mind maps side by side and compare them.
This exercise will help you focus on the good (or bad) critical thinkers’ traits and behaviors. It also starts you
thinking about the real-world applications of critical thinking.
Summary
In the story at the beginning, the University relied on its staff to disclose conflicts of interest, and they trusted the
market data that the company reported. However, multiple factors, including misplaced trust in Keller, led them to
invest in a failing company.
A poor decision cost the University more than just money. Why? Could this have been prevented if the Finance
Committee had applied what we have learned in this chapter? Perhaps.
Emotions played a role in the investment: the desire for success, trust in Keller. They appraised the company’s
success incorrectly due to inadequate evidence (they relied on the market data). Keller, the investment
recommendation source, turned out to be unreliable due to having a personal interest in the company.
In the story, the University did not have all the information needed to make the correct decision. No doubt, you
will have been in similar situations yourself. Hopefully, the techniques covered so far have equipped you with
more tools to deal with information you encounter in the future.
Apart from features of the information we receive, what else keeps us from getting to the truth? The answer is
complex, and we will delve into it very soon in the next chapter .
Takeaways
1. Critical thinkers must distinguish between facts, opinions, claims, and evidence.
2. You should be realistic and even humble about your knowledge. However, pairing logic with your own
experience is a key part of thinking critically.
3. Remember to assess the author and their motivation, as well as the message.
4. Use multiple reliable sources, including other people, to help you reason towards better conclusions and
decisions.
2
“Mom! Dad! I need to speak to you!” the kid yelled. He had just got back from his first day at grade school,
and he had serious beef with his parents.
“What is it?” asked the concerned parents.
“The other kids all laughed at me.”
A sad tale of juvenile bullying, you might think. Yes, but there was more to it. The kid had started school with
something fairly crucial missing from his social life.
His parents were overjoyed when he was born. As high achievers themselves, they wanted their children to do
well in life.
The kid’s father had heard about an interesting research study. He spoke with his spouse, and they both agreed it
could not harm their child.
The study was the famous Mozart Effect. First published in the early 1990s, this experiment indicated that
students who listened to Mozart did better on certain cognitive tests than those who did not listen to Mozart [8] .
The students performed as though their IQ was 8-9 points higher than those who listened to a relaxation tape or
silence. Furthermore, a prestigious scientific journal published the study.
This got parents, as well as scientists, very excited. Everybody wanted to grab those extra IQ points for their
child. There may even have been a boom in sales of baby headphones and Best Of Mozart CDs (this was the
1990s, remember).
This family took it to an extreme, however. The kid had passed unnoticed through kindergarten, but by grade
school, his deficit was apparent. Shockingly, he had never listened to anything other than Mozart.
What is more, his test scores were average at best, and he was the victim of several bullying incidents within the first few
weeks of school.
That was when his Mom decided to investigate further.
Scientists found the Mozart Effect very hard to replicate, but they kept trying. More often than not, Mozart
listeners performed about as well as those who listened to different music or silence [9,10] .
The kid's Mom also found out that the cognitive enhancement effect was small and probably only lasted a while
after the music finished — anything mildly stimulating made people do a bit better on the tests.
What she regretted, though, was naming her son Wolfgang.
With the Mozart effect, one experimental study became so well-known that people did not even notice the
subsequent studies. Other studies were less dramatic and therefore did not grab parents’ attention.
Is Mozart special? In a musical sense, of course. But there is probably not a special Mozart module in the brain
that switches on superior learning processes.
The failure to replicate the Mozart Effect suggests that the original effect was due to general characteristics of the
music, like complexity or interestingness. Aspects of the experimental situation might also have led to these
seemingly impressive results [9,10] .
Recent analysis suggests that scientists published more ‘Mozart-positive’ results due to publication bias. This is
similar to confirmation bias, which we will look at in detail in this chapter.
Our brains construct our perceptions [1] and memories, so we need to constantly evaluate and question our ideas.
Our brains construct an impression of a three-dimensional world based on a two-dimensional projection on the
retina. We perceive three dimensions even in two-dimensional drawings in a way that feels automatic [11] .
Similarly, the first idea that comes to mind from memory could easily result from our cognitive processes rather
than being a true reflection or record of reality. Therefore, we must strive to become more aware of how our
minds can distort reality.
Thinking critically (or using sound reasoning) is not as simple as it seems. All of us harbor deeply ingrained
habits that influence our judgment of people, events, situations, and issues.
Beliefs
Beliefs are an important part of human life. We all hold prior beliefs about things, people, and ideas, and one
generation passes them on to the next via social learning. Sometimes, we believe what we want to believe despite
evidence against it; we can refer to this as wishful thinking [12] .
So, where do erroneous beliefs come from? Our brains do not intend to deceive us, but knowing the truth is not
always their main concern. Erroneous beliefs are a byproduct of the psychologically adaptive process of social
learning [2] . Social learning supports many useful tasks, such as learning our native language. As social creatures,
we need social cohesion and shared experiences, and we start paying attention to other humans (and potentially
learning from them) as infants [13] . So, it is only natural that we are so open to acquiring ideas directly from
others, especially those we trust.
Second-hand information has great potential to lead to false or distorted beliefs. Humans love to tell good stories,
and the storyteller may highlight certain aspects and ignore others, either to make the story more entertaining or to
emphasize certain parts of it [2] .
In turn, prior beliefs can lead to biased perceptions of people, objects, and events, thereby affecting future
perceptions and experiences. People can then pass these biased beliefs onto others. This may remind you of the
children’s game Telephone or Chinese Whispers, in which one person whispers a verbal message to the next along
a long line. The original message disappears by the end of the game.
Another aspect of our beliefs is that we tend to believe what we want to believe [2] , and this includes our beliefs
about ourselves. We may adopt socially acceptable beliefs to avoid being rejected by others [1] . Like many of our
psychological tendencies, there is nothing wrong with this, but at times it could obstruct our critical thinking.
Emotions
Social emotions such as trust and the desire for acceptance can affect what we believe, but emotions have huge
effects on cognition. Psychologists have documented mood congruent effects in memory and attention [14,15] .
This means that people tend to notice and remember information that fits with their current mood; you may
observe this phenomenon casually in everyday life now that you are looking for it. For example, when somebody
feels joyful, they might notice beautiful scenery or enjoy a good meal more than when they are in a neutral mood.
Our emotions, therefore, influence not only what information goes in but also how our minds process it.
In controlled experiments, a scared or sad person is more likely to perceive others’ faces as threatening or
negative. Someone experiencing a happy, exuberant mood is more likely to label faces as friendly. The first
person might be more likely to recall unpleasant events from their own life, whereas the second would recall more
happy and joyful experiences [14,15] .
This example illustrates that memory retrieval is an active process; your memory is not like a library issuing you
the same memory every time. Instead, the cognitive system reconstructs the memory each time [1] .
Fallacies
The term fallacy often refers to commonly held false beliefs, including some examples of folk wisdom. For
example, many people believe that more babies are born during the full moon [2] . In fact (verifiable, reliable fact,
that is!), no more babies are born on the full moon than during any other phase of the moon.
False belief fallacies can affect our reasoning processes if we assume that pieces of received wisdom are true
without examining them in more detail.
The term also covers logical fallacies: errors of reasoning commonly known as non-sequiturs. To
reason properly, we must make sure that our conclusions follow logically from our arguments’ premises. The
study of logical fallacies has a lengthy history, and there are many of them [1] .
1. Ad Hominem Fallacy
Ad hominem means "against the person." It refers to attacking the person rather than attacking their point or
conclusion [1,20]. You might witness this fallacy in a political debate.
For example, one politician argues passionately against a new shopping mall in the town, but their opponent
points out that they live in that town and the new mall would bring a lot of extra noise and traffic to the area. The
opponent argues that the first politician is therefore concerned for themselves, not necessarily for the residents.
Here, the first politician described a concept, but the other proceeded to attack the first as a person, ignoring the
debate’s topic. Attacking the opponent is not an effective way to argue against their idea, so we describe ad
hominem as a fallacy. Like the other factors described here, this fallacy can lead to divergence from important
topics. People sometimes use it deliberately to divert attention and discussion away from certain topics.
There are two types of ad hominem [21] . The circumstantial variety is when a source is speaking hypocritically,
and somebody else points it out. This type of ad hominem may constitute a legitimate argument, but it is still a
logical fallacy. The second variety is abusive ad hominem, where somebody uses another’s personal traits to
attack their idea, where the traits are unrelated to the idea.
In practice, ad hominem rebuttals are not always irrelevant. Let us think about a political debate. One politician
attacks the other’s personality or life choices. But what if these are relevant to the argument?
This example illustrates circumstantial ad hominem: the opponent points out the first politician’s hypocrisy.
Suppose the first politician had no obvious self-interest in canceling the new mall. In that case, the opponent could
still attack them to convince the populace that they were not trustworthy and discredit their opinion. This is
abusive ad hominem, a fallacy we should certainly try to avoid.
2. Hasty Generalization
Hasty generalization is another important fallacy that we need to understand. It means jumping to a conclusion
based on too little evidence. A more technical definition is generalizing from a sample or single instance to a
whole population. However, the sample may be too small or not representative of the general case.
Imagine a friend saying:
“My Grandpa lived to be ninety-six years old, and he drank a bottle of whiskey every day of his life!”
Unfortunately, Grandpa does not prove that alcohol is a recipe for a long and healthy life. This anecdote, a single example, does not outweigh decades of medical evidence.
Generations of thinkers have described this fallacy. Aristotle discussed it first, followed by many more scientists
and philosophers. Alternative names for hasty generalization include faulty generalization, the fallacy of accident,
the fallacy of neglecting qualifications, and many others [22] .
Hasty generalization is easy to commit. People under pressure in busy jobs, who are seen as authorities on the topic at hand, might draw conclusions too early. Hasty generalization can also lead to wrongly assuming that every instance is the same, based on one or two examples. It can also lead people to ignore situations where their conclusion is false. In the example of Grandpa and his whiskey, the speaker focuses on the single example at the general case’s expense.
You can see how hasty generalization could become a serious problem and prevent us from getting to the truth.
3. Bandwagon Fallacy
The bandwagon fallacy means falling into the trap of thinking that the majority is always right. People commit
this fallacy when they agree with the majority without seeking further information [23] .
A classic psychological study by Solomon Asch revealed that many people will agree with the majority opinion even when they can see that the majority is wrong [24]. The experiment’s task was shockingly simple: participants had to choose the longest line from a few options of clearly different lengths. The experimenters put individual participants in groups of confederates (fake participants), all of whom chose a line other than the longest.
Asch’s study showed that many people agreed with the majority but then expressed concern and confusion because the majority gave the wrong answer. The experiment put people into an unnatural situation, but we can also see the bandwagon effect in real-life scenarios.
also see the bandwagon effect in real-life scenarios.
In real life, the majority opinion is often fine, and we can choose to follow it without dire consequences [25] . For
example, most people would agree that dogs make good pets and rhinoceroses do not. Choosing a pet is a relatively
benign decision, though.
In contrast, turbulent environments lead to more copying; the correct path is harder to discern in more ambiguous situations [27]. Think about how this relates to a high-pressure business environment, where the situation may be highly complex to begin with and change rapidly. In these situations, organizations follow each other’s business decisions more than they do in a calm and stable business environment [26].
People and organizations jump on bandwagons for many reasons. They may genuinely believe it is the best option, or they may see others they admire jumping on the same bandwagon, which gives that choice more credence [26]. However, the bandwagon effect is a failure to apply logic and our own experience. Information about the majority’s opinions and choices is easy to obtain and quick to process, but the majority is not always right. Even the majority opinion of a group of experts is not always correct.
5. Confirmation Bias
Confirmation bias is a bias towards information that confirms what we think we already know. Take this example:
Jayshree firmly believes that all Hollywood actors over 30 years old have had cosmetic surgery. Every time she
sees somebody whose face looks smoother than last year, she points it out to her friends.
What do you think Jayshree says when she watches a movie and the actors look no different? Nothing, of course.
It is unremarkable that the actors have aged normally. Jayshree notices evidence that supports her belief, but she is
oblivious to the evidence against it.
Confirmation bias is extremely common, affecting what information we notice and what information we seek out
[28] . People have a strong tendency to seek out information that confirms their beliefs due to a strong desire to
maintain those beliefs [12] . Returning to our example, Jayshree might search the internet for ‘celebrity plastic
surgery’ information, but she would not be looking for information on who has not had plastic surgery.
When faced with a message, beware of confirmation bias. It is similar to wishful thinking: sometimes we believe
what we want to believe, and evidence supporting what we believe grabs our attention.
6. Anchoring
Anchoring occurs when we over-rely on the most prominent feature of a situation, person, or object; this may be the first piece of information we encountered or the information we feel is most important. Anchoring strongly affects our judgment and estimation [11]. Anchors are mainly numerical. For example, someone taking out car finance might choose to focus on the interest rate, displayed in large figures on the website, rather than processing additional information.
Anchoring biases not only our judgments but also our estimates. If you go to a car showroom, you may have room to negotiate. Nonetheless, your mind anchors your initial offer around the price quoted on the window. This is known as anchoring and adjustment: the first number we see biases our subsequent thinking [1,11].
Psychology experiments show that different anchor points can lead to vastly different decisions. Furthermore, the
anchor does not even need to be related to the question to influence a person’s answer [17, 29] . This shows that
anchoring is pervasive and, to some extent, automatic.
Anchoring is sometimes also classed as a heuristic, since it does enable our minds to take a shortcut and stop processing more information. However, it is sometimes automatic and, at other times, more conscious [11]. Automatic anchoring is more like a suggestion: the anchor primes somebody’s estimate or choice by activating similar numbers or ideas in the mind, and the person experiencing this may not be aware of it.
On the other hand, deliberate anchoring is when you consciously adjust your initial estimate to get closer to the
real answer. This process is more controlled, but people typically stop adjusting too early, meaning the anchor still
biases their final response. We are more likely to stop adjusting too early if we are under time pressure or are
multi-tasking [11,17] .
7. False Consensus
This bias comes from social psychology, the study of personality and social interaction. False consensus focuses
on how we see ourselves relative to other people. Like the arsonist who might have once said, 'Well, everyone
loves to set fires, don't they?', we overestimate how common our actions or traits are in the general population.
This bias emerges when people hear about other people's responses [30,31] . Whether we read others’ answers to a
set of questions or hear about decisions made in a scenario, we see other people's responses as more common and
typical when they match our own. Conversely, we see others' responses as strange and uncommon when they
diverge from our own.
False consensus effects are larger when the question is more ambiguous. One study asked people specific
questions like ‘are you the eldest child?’ and more general questions like ‘are you competitive?’ The study
reported a much more pronounced false consensus effect for the more generic questions [32]. This provides more evidence for the effect and suggests that when people have more room to interpret a question in their own way, they perceive others as more similar to themselves.
8. Halo Effect
The halo effect is not about angels; think about the type of halo you see around a streetlamp in the mist. This bias
occurs when something is seen positively because of an association with something positive, like the light from
the streetlamp spreading out as it refracts through the mist particles. You could call this the ‘glory by association’
bias.
We all know that first impressions matter in our relationships. This bias is part of that. Our initial impressions of people and things can create a halo, overly influencing what we think of them.
When people have to rate others on positive criteria like competence or intelligence, their ratings are influenced
by how warm and friendly they seem to be [33] . The halo effect even occurs for traits we know are unrelated, such
as height and intelligence.
As you can imagine, the same applies to objects and ideas. Companies like to use beautiful people and scenery in
their adverts and promotions because this gives potential customers a positive impression of the company and the
product.
9. Availability Heuristic
The availability heuristic affects us when we have to judge probability or frequency [12] . We assume things we
can imagine or recall easily are more common or more likely. Another way to conceptualize this is to assume that
the first things we think of are the most important [1,11] .
You can see how the availability heuristic can be useful. When deciding where to take a vacation, your first
thought is more likely to be somewhere you want to visit rather than an obscure destination you have barely heard
of. The desired destination is more available in your memory, as well as more vivid.
This heuristic draws on several characteristics of human memory [17,34,35] . Firstly, the recency effect: we have
better memories for recent events or things we have seen or heard recently. Secondly, we remember things that
make us feel emotional. Finally, we recollect personally relevant and vivid information far better than dry, boring
stuff. Any of these or all of them together can create high availability.
The opposite is also true. If you cannot think of many instances of something, you will think it is less common or
less probable. When researchers asked participants to list a very large number of advantages of something, such as their college course, the participants found it hard to think of enough. These students then rated their course as worse than students who had been asked for fewer advantages [11].
This example seems paradoxical at first, but not when you think of it in terms of availability. The course’s positive
aspects felt less common to those who were asked for more because they could not think of the full set of
advantages requested. This illustrates how the availability heuristic could be a problem, depending on questioning
techniques.
If we can call examples to mind easily, we think events are more likely to have happened before or to happen
again in the future. For instance, people worry that terrorist attacks are possible or even probable. A young
graduate’s family warns them against moving to New York, Paris, or London because of 'all the terrorists.' These
attacks are readily available to people's minds, so they feel that attacks are more likely than they are.
Availability is a useful heuristic because it allows us to make rapid judgments and decisions. People are more
influenced by availability when they process information quickly and automatically, for example, when feeling
happy or distracted [11] .
Pattern Recognition
Our brains are incredibly good at recognizing patterns. People often perceive faces in face-like configurations, such as the Man in the Moon; this is known as visual pareidolia. A large area of our visual brain is dedicated to face processing, so it is not surprising that we perceive faces even when they are not there [12].
Pareidolia is automatic: people do not try to see these patterns; they just do [2]. You have almost certainly had this experience. Countless internet memes show objects like houses and cars that look like faces. Sometimes it can take a few moments for the pattern to resolve itself into the image; at other times, it strikes you straight away, and it is then difficult or impossible to go back to seeing the image as a less meaningful pattern.
Pareidolia can occur in other senses too: hearing Satanic messages in music played backward or ghostly voices in radio static.
Automatic pattern perception reflects tendencies similar to optical illusions, like flat images that appear three-dimensional. These are not just fun and games. Both pattern recognition and false perceptions can lead to false beliefs, and people can and do seek information to support them.
In summary, our brains are incredibly good at recognizing patterns yet poor at statistics [12] . We regularly
perceive meaning in random stimuli.
This phenomenon, regression to the mean, explains why a great year for a sports team is more likely to be followed by a worse year than by another great year. Performance improvements can and do occur, but we cannot judge a single great year as though it reflected a genuine improvement in the average. Excellent performance is a combination of baseline ability and random good luck [12].
Regression to the mean can have interesting effects in the real world. One scientist worked with military flight
instructors, one of whom reported that when he praised a cadet’s performance, they usually did worse the next
time [16]. The instructor thought that praise made people worse at flying airplanes. However, the cadet’s particularly good flight was an outlier, and they simply regressed to the mean on their next performance.
Similarly, extremely poor performance is more likely to be followed by an average performance than by another
dismal one.
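The arithmetic behind regression to the mean is easy to demonstrate for yourself. The sketch below (illustrative Python with made-up numbers, not drawn from any study cited here) models each player's score as stable skill plus random luck; the players who do best one year tend, on average, to do worse the next, even though nobody's underlying ability changed:

```python
import random

random.seed(42)

N = 10_000
mean_skill = 100

# Each player's score in a given year = stable skill + random luck.
skills = [random.gauss(mean_skill, 10) for _ in range(N)]
year1 = [s + random.gauss(0, 10) for s in skills]
year2 = [s + random.gauss(0, 10) for s in skills]

# Pick the players who had a "great year" (top 5% in year 1)...
cutoff = sorted(year1)[int(0.95 * N)]
top = [i for i in range(N) if year1[i] >= cutoff]

avg_year1 = sum(year1[i] for i in top) / len(top)
avg_year2 = sum(year2[i] for i in top) / len(top)

# ...and compare their average in year 2, when their luck resets.
print(f"Top performers, year 1 average: {avg_year1:.1f}")
print(f"Same players,   year 2 average: {avg_year2:.1f}")
```

Running this shows the top performers' average falling back toward the overall mean in the second year, purely because their lucky component is unlikely to repeat.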
Generally, we have low awareness of the regression to the mean effect because we fail to account for chance as
much as perhaps we should. Regression to the mean also feeds into some of the biases and heuristics already
discussed [17] . The next fallacy illustrates a similar point.
The hot hand fallacy is the belief that, following a good performance, subsequent attempts will also be successful. Commentary on team sports like basketball sometimes cites this fallacy [2].
It is related to regression to the mean: regression to the mean is the real situation, whereas the hot hand fallacy is
what people think will happen. It is also related to confirmation bias: people notice when the hot hand effect
happens but do not notice when it does not [2] .
The hot hand fallacy can also apply to casino games, which players sometimes perceive as non-random. Casino players also exhibit the opposite fallacy: the gambler’s fallacy, the belief that because they have lost many times, a win is due [39].
These gamblers’ fallacies suggest that even when random chance is the main factor affecting the outcome, people persist in perceiving patterns. Imagine what our brains might be doing when it is not so obvious that chance determines the outcome!
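A quick simulation makes the point concrete (again an illustrative Python sketch, not taken from the sources above): in a fair game, the chance of winning immediately after a long losing streak is no higher than at any other time.

```python
import random

random.seed(1)

# One million rounds of a fair game: True means a win.
flips = [random.random() < 0.5 for _ in range(1_000_000)]

# Find every point where a player has just lost five times in a row,
# then check how often the *next* outcome is a win.
wins_after_losing_streak = 0
streak_opportunities = 0
for i in range(5, len(flips)):
    if not any(flips[i - 5:i]):  # five losses in a row
        streak_opportunities += 1
        if flips[i]:
            wins_after_losing_streak += 1

rate = wins_after_losing_streak / streak_opportunities
print(f"Win rate after five straight losses: {rate:.3f}")  # stays around 0.5
```

Despite five straight losses, the next round is still a fair coin flip; no win is "due."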
Action Steps
Our brains do a great deal of information processing that we are not always aware of. We are quite fortunate to
have all these short-cuts making processing more efficient. Try these suggested exercises to explore these ideas
further before we move on.
1. Fantastic Fallacies, And Where To Find Them
Find a list of fallacies, biases, and heuristics in an online encyclopedia or psychology website. Notice how many there are! Read some of them and make a note of your thoughts. You could look at things like:
Summary
The story at the beginning of this chapter illustrates that sometimes we get it wrong, and that this can happen even when we exercise good critical thinking skills. Our cognitive processes may be sophisticated, but they are also economical. In the story, the parents believed they were benefiting their son by playing Mozart because they believed the high-profile research paper suggesting that Mozart’s music made people more intelligent.
The parents only read the initial research study on the Mozart Effect. They did not follow it up: hasty
generalization. They did not realize that other scientists had found it so hard to replicate the Mozart Effect. They
fell into confirmation bias by only noticing media reports praising (and confirming) the Mozart Effect.
The halo effect may have operated too because Mozart is generally accepted as one of the best classical
composers. If it had been an obscure composer, would the paper have gained such a high profile? The population
found it easy to fall in love with the idea that Mozart's music was special in yet another way.
Nor were the parents skeptical; if they had been, they would have researched the effect for themselves rather than
taking it at face value. Scientists aim to be skeptical at all stages of their workflow, from ideas to analyzing the
data from completed research. The next chapter elucidates scientific skepticism in greater detail.
Takeaways
1. Our minds abound with fallacies, beliefs, emotions, biases, and heuristics, all of which impact our perceptions
and how we process information.
2. These can have massive effects, so we need to remove their effects if we want to reach solid conclusions and
make good decisions.
3. It may not be possible to overcome these biases induced by our minds completely, but critical thinking can
help.
3
Fifteen-year-old Alanna Thomas burst into tears and buried her face in her hands.
“I’m so sorry,” she gasped. She looked up at the police officer standing over her. “I did it, I did… I pushed him off. I’m sorry...”
On the other side of town, local journalist Lin Rodriguez also buried her head in her hands. She needed to get this
article finished, but the story was so complex. It was hard to know what was real.
Two weeks prior, Mr. Gomez, a science teacher at Mildenhall High School, had been found floating face-up in a flooded, disused quarry. Lin remembered his classes. He was strict but somehow still inspiring. She would never have studied forensic sciences at college if it were not for Mr. Gomez.
Not everyone had liked him at high school, but Lin could not imagine why this local academic had fallen to such a
violent death. Events like this did not happen in their small town; the community was in shock. Naturally, the
rumors began as soon as the news broke. Murder? Suicide? Misadventure? Nobody knew, but everybody was
talking about it.
Lin’s boss sent her to the scene as soon as he heard, and she interviewed the forensics team as they painstakingly
collected evidence. They had covered the body, but the lead investigator told Lin that Mr. Gomez had some
suspicious bruises. They found two different sets of footprints around the top of the cliff too.
The next day, further evidence came to light. A local man told police he was walking his dog in the area the
previous night and had heard somebody making their way through the undergrowth not far from the cliff. The area
was overgrown with brambles, and he could hear they were having some difficulty. He reckoned this was not long
after Mr. Gomez had his lethal fall.
Lin asked around to find out who might know more. If it was suicide, perhaps Mr. Gomez had expressed sadness
or pain in the days and weeks before his death. She questioned colleagues at school and heard a few interesting
morsels of information.
Four separate people highlighted the same concern: a small group of students appeared to have a rather nasty
grudge against this particular teacher. They even reported social media threads detailing certain students’ fantasies
about playing nasty tricks on him, like keying his car or even harming him personally. The group consisted of students the other teachers agreed were outcasts. One of the students was Alanna Thomas, a shy girl who was a
local attorney’s daughter.
Lin investigated the social media posts and found several distressing threads. Sure enough, ‘let’s kill Mr. Gomez’
came up more than once.
The problem was, Lin just could not believe that any of these disaffected children would murder their teacher.
Priding herself on her skepticism, she looked for and found an alternative explanation.
Alanna Thomas’ father was aiming for a promotion: he wanted to become a district attorney. Furthermore, a
group of powerful local business owners was firmly against this idea. Mr. Thomas was a keen environmentalist,
and everybody expected his appointment to scupper their plans to build a large power plant on the edge of
Mildenhall. Instead of an angry schoolgirl, it was surely more likely that somebody had hired a professional killer
to neutralize Mr. Thomas by implicating his daughter in a murder case.
Besides, the child was the perfect stooge. She was known to hold a grudge against her teacher and to be a social misfit who would crumble under police questioning.
As we have seen, that is exactly what happened. Alanna’s tearful confession formed the backbone of the case
against her. She was easily tall and strong enough to have pushed Mr. Gomez off the cliff while he was out
walking his dog at night, a habit which the whole town knew about.
Lin published her investigation. Following her article, the police dropped the case. Officially, they concluded that Alanna’s confession was unreliable and that there was not enough evidence to proceed. Mr. Gomez had fallen into the quarry; it was a terrible accident.
On the same day, Lin received an anonymous email. It said:
“You should have listened. Gomez was murdered.”
The sender attached a high-resolution photograph taken from the top of the cliff. The time and date were exactly
right, and so was the location data. The image showed Alanna Thomas standing at the edge and down below the
body of a man face down in the water.
If Lin had been properly skeptical throughout her investigation, these dire consequences might have been avoided. She doubted the first explanation–that Alanna had killed her teacher–so much that she came up
with an even less plausible alternative. She convinced herself and others that it was true, even though her
conspiracy theory had less evidence to support it than the police’s theory. Ultimately, the truth eluded everybody.
Skepticism is not simple cynicism. Skeptics keep an open mind, doubting every explanation rather than believing
they have arrived at a final answer. Taking a skeptical approach based on scientific principles helps us get closer
to true conclusions rather than settling for what we want to be true [1] . Critical thinkers can guard themselves
against being misled into believing lies or mistaken information.
To some people, these ways of thinking may seem incompatible with skepticism. Certainly, the everyday
definition of skepticism focuses more on being critical and challenging ideas than being open to them. A skeptical
person may appear to be closed to new ideas until they obtain further evidence, in contrast to the open-minded
stance portrayed above. But recall that a key part of skepticism is being open to changing your position and seeing
other perspectives.
Opponents of a skeptical approach may argue that skepticism is paradoxical because skepticism itself is a belief system. These opponents argue that skeptics use ad hominem attacks, straw men, and similar techniques to discredit potential miracle discoveries [50,52]. However, true skeptics take a balanced view and are always open to the idea
they might be wrong. ‘Pseudo-skeptics’ are those people who almost exclusively disbelieve and deny claims; they
are similar to ‘debunkers’ whose mission is to try to disprove claims [52] .
To summarize, think of skepticism and open-mindedness as two complementary aspects of the same process:
critical thinking. They are not incompatible. Note that an open-minded attitude allows you to salvage the good
parts of bad ideas, whereas a strict skeptic would throw everything out. Open-mindedness is a key part of
creativity and innovation.
Lucidity And Metacognition
Lucidity is an open-minded state where we can see past our prior beliefs and perceive reality as it is. This is a
great state to aim for if you want to appreciate new information fully.
People have an inbuilt immunity to new ideas and prefer to stick with what they know. People may even perceive new ideas as threatening, automatically assuming that what they already believe must be better than the novel claim. That is why discovering the true cause or process of something is only the first step. Scientists must
then continue to investigate and try to convince others that their theory is correct [53] . At the same time, they must
maintain a skeptical viewpoint, acknowledging that they might be wrong.
People have trouble with new ideas, particularly new scientific ideas, for a few reasons. Firstly, the true causes of
phenomena are not usually simple or obvious. Secondly, people often get the causes wrong when considering
things they feel strongly about. Thirdly, it is difficult to discover how to get to the correct explanations (that is
why science is always seeking to improve scientific methods and knowledge). Fourthly, people need to be
continuously motivated to discover the real causes and, subsequently, to promote novel explanations [53] . You can
see why scientists are such busy people.
Evidence shows that high critical thinking skills are associated with high metacognitive awareness [54] .
Metacognitive awareness means having awareness and control over how you process information; it is a self-
reflective process, known as ‘thinking about thinking’ [55] .
We can use metacognition to guide our own learning, development, and decision-making. People with high
metacognitive awareness have excellent knowledge about their own cognitive processes and their outcomes. Like
critical thinking, metacognition is teachable and can improve with practice [56] . For example:
John knows he has a poor prospective memory - he always forgets to do things he has said he will do. His family
and colleagues often get irate about this. However, John has excellent metacognitive skills: he knows his memory
is poor, enabling him to do something about it. He trains his memory by setting reminders and writing task lists.
After a while, he does not need to set reminders anymore; he goes straight to the lists.
You can imagine that somebody with poor metacognitive skills might not have been as successful. John was not afraid to admit he had a minor memory problem and was able to solve it.
Interestingly, student teachers with more experience showed higher metacognitive awareness and critical thinking
skills (assessed by questionnaire) [54] . This was a correlation, so we do not know whether metacognition causes
critical thinking or the other way round; alternatively, they may draw on the same underlying skills and habits.
Since critical thinking means deliberately using sophisticated thinking skills to solve problems, going beyond
intuition, and using high-level analytical skills, it seems reasonable to suppose that it relates to metacognition.
Paul and Elder [57] describe nine intellectual standards that should help us think both lucidly and metacognitively
about ideas. These are standards that scientists strive to meet in their communications, and they give you a helpful
framework whether you are composing an argument or receiving one from another source:
Clarity: to reason about a claim, we must be clear about what it means. Therefore, when you are communicating, you need to aim for maximum clarity as well. This standard is a prerequisite for all the other standards.
Accuracy: you may not have access to resources to check the accuracy of every point made, but you can assess it by thinking about whether the claim is verifiable and whether the source is trustworthy.
Precision: information should be appropriately precise for the point under discussion. A claim could be accurate but imprecise; for example, ‘the company’s profits fell last year’ is less precise than saying they fell by 18% last financial year.
Relevance: we might reason clearly, accurately, and precisely, but this is pointless if we deviate from the core topic.
Depth: this means dealing with the complexities and relationships of the concept under discussion rather than over-simplifying it.
Breadth: this means taking in multiple (relevant) points of view and recognizing alternative perspectives on the issue. For example, business strategies often consider environmental, ethical, and social concerns, as well as economic factors.
Logic: this means ensuring that the arguments work logically: does the evidence lead to the conclusion, and does the argument have internal consistency?
Significance: this is related to relevance, but sometimes relevant points are trivial. We need to ensure that our reasoning focuses on the important aspects of the problem.
Fairness: our reasoning should be bias-free and honest. We should aim not to argue only for our own interests. Others may interpret unfair arguments as attempts to manipulate and deceive them.
Hopefully, you can see how these standards relate to scientific skepticism and communication. All of these standards apply not only to science but also to our everyday lives, to both work-related and personal problems. They are therefore useful to remember when composing or reading claims and other communications.
Action Steps
We have examined scientific skepticism in detail, with the aim of helping us get to the truth. Why not have a go at
these optional exercises and apply some of the ideas we have discussed?
1. Opening The Mind
Write a skeptical and open-minded proposition or theory of your own. It may be helpful to use something trivial
for this practice exercise. It can be as simple as ‘Why I should get my driveway resurfaced this summer,’ or ‘An
explanation of why I choose not to dye my hair.’ Use the following helpful habits of mind [2] :
a. Gather as much evidence as possible. For instance, what is the current state of your driveway, and what are the
risks of not getting it resurfaced?
b. Beware of false positives and false negatives in the evidence. For example, you might read that driveway
surfaces typically fail after five years, but check who wrote this and what they base it on, and see what other
sources say.
c. Think broadly: consider everything that might possibly impact the proposal or theory. This might include
personal finances, the broader economy, environmental concerns - whatever factors are most relevant to your
proposal.
d. Consider what somebody with the opposite opinion to yours would write: how they would explain it and/or what they might decide. This will help you maintain an objective perspective.
2. Metacognition Exercise
It is normal and natural to resist changing our minds, but as we learned here, reflecting on our own cognitive habits can help enhance them. Use this quick questionnaire as a self-reflection exercise, or rate somebody you know well. Adapted from Snelson [53].
a. How would you rate your ability to accept any new minor idea with a lot of evidence to support it?
I accept it immediately
I accept it after doing my own research
I need a few weeks to absorb the new idea
I do not change my mind over minor ideas
b. How would you rate your ability to accept any new major idea with a lot of evidence to support it?
I accept it immediately
I accept it after doing my own research
I need a few weeks to absorb the new idea
I do not change my mind over major ideas
c. How would you rate your ability to accept any new revolutionary idea with a lot of evidence to support it?
I accept it immediately
I accept it after doing my own research
I need a few weeks to absorb the new idea
I do not change my mind over revolutionary ideas
3. Standard Process
Analyze an article to check whether it meets the intellectual standards suggested by Paul & Elder [57] . Choose
something like an editorial discussing a controversial topic. Is it:
Clear?
Accurate?
Precise?
Relevant?
Deep?
Broad?
Logical?
Significant?
Fair?
Summary
The story that began this chapter showed us that people reach faulty conclusions even when they try to keep an
open mind and discover the truth: the police thought they had solved the crime, and Lin thought she had found a
better explanation. They were both wrong.
With a truly skeptical attitude, somebody would have doubted both explanations, put them to one side, and
investigated further. They would have been open to alternative explanations and would not have been averse to
changing their mind even once they thought they had the correct answer.
Scientific skepticism is not easy. It takes vigilance and discipline to learn, but like critical thinking and the other skills we discuss here, it can be honed. The processes become more automatic and less effortful as you develop your expertise.
Next, we will look at how to deal with claims you see in the media. That includes social media, so it should be a
great way to practice your skeptical attitude!
Takeaways
1. When assessing claims, act like a scientist: see whether the claim is verifiable and falsifiable. If not, perhaps
somebody is asking you to believe something without sufficient reason.
2. When making decisions and forming conclusions, keep a balance between skepticism and open-mindedness.
3. To reach the truth, aim for lucidity. Sweep your preconceptions out of the way and experience the world as it
really is, without your previous experience blinkering you to new facts and evidence.
4. Keep the postmodernist view in mind: perhaps we can never know the truth, and perhaps meaning is
completely relative. If that is the case, many things are possible.
4
Zion Davis was sitting in his corner office early on a Friday afternoon, signing off business expenses. So far, so normal. He had had a long, busy week, so the routine task appealed to him. As a manager, he was generally well-liked, not least because his civil engineering background gave him credibility with the office staff.
As he was nearing the end of the task, an email notification pinged onto his screen: “Interesting read.” The
message was from Alastair, a junior team member he worked closely with, so he decided to take a look. Almost
immediately, he wished he had left it until Monday.
The email contained an attachment: the environmental report he had been waiting for. Zion had proposed ‘rewilding’ a section of the development, alongside the approved construction of a visitor center and venue for various outdoor
sports activities. Alastair included a web link which he said Zion should take a look at.
Zion clicked through. He scrolled through lengthy, emotive paragraphs about the ‘failure’ of corporate
environmental endeavors like theirs. He sighed. This was going to take the weekend to sort out.
Later, Zion strolled into the open-plan office Alastair and the other juniors shared. Immediately, his team
bombarded him with questions:
“How can the company justify this?”
“This is outrageous! I can’t believe our company would do this!”
A few people argued back in favor of the company.
The conflict was not restricted to his immediate team, though: the MD, Milton Skelpie, phoned Zion, demanding to speak with him in private immediately.
“I thought we were clear. This project is over 80% ecological work specifically to support wildlife, so why are the
environmentalists up in arms about it? And why are half of your team on their side?”
“It’s this article, Sir. They’re saying that we should leave nature to take over by itself and that anything we do will
make it worse. The article’s bogus, Sir-”
The MD cut him off.
“Sort this out, Zion. I’m relying on you. ”
Milton hung up. Zion took a deep breath, stood tall, and re-entered the shared office. He looked around at his
team.
“Everybody relax. We can tackle this together. So, we have some problems. I’ve read this article suggesting that
our wildlife park project will harm the local environment up in Washbrook, and my boss told me that several of
you have been posting about it on social media today. I’m going to ignore that because we have bigger problems.
As I said, we can all work together to solve this.”
The next few days were some of the most hectic and challenging Zion had ever experienced. He delegated
research tasks to several different staff members. He asked them to research the article, examining the platform
that had published it, who had written it, where they got their information from, articles that cited this one, the
whole gamut. He received a diverse set of reports.
Two of the staff, Meredith and Marco, had picked up one interesting fact: the article used some of the same
phrases (and misspellings) as a blog post published towards the start of the project. A fringe group wrote the blog,
and their main purpose seemed to be to block any kind of development. Members encouraged each other to lie
and exaggerate to get their point across; they planted misinformation to stoke readers’ emotions and make them
angry.
Zion was ready to present his findings to his superiors when the MD visited in person, completely out of the blue.
“Zion, this is serious. The local county council is now concerned that we misled them in our planning application.
Local residents are protesting at the site, stopping construction vehicles from entering and chaining themselves to
trees. We’ve already lost thousands because of the delays to this project.”
Zion called Meredith and Marco into his meeting with the MD.
“Sir, I would like to introduce you to the only two people in this office who picked up that this article is
hogwash.”
They spent the next hour presenting their comprehensive, detailed findings of the article that had caused so much
trouble: the source was not credible; the story was a distorted mishmash of second-hand information and opinions;
it played on people’s emotions; it misrepresented the science. As Zion said, it was hogwash.
Happily, this convinced the MD. He was so impressed that he assigned Zion and his team to write a well-
researched article on the topic for the company’s website.
After a difficult week, the project was back on track, and Zion had gained even more respect from his team and
his superiors than he could ever have expected. Still, the fake news article had almost caused a catastrophe.
Have you ever had an experience like this? Perhaps, or perhaps not. The point of the story is that a lack of media
literacy can have huge potential consequences.
Several of Zion’s staff believed what they read without investigating where the story came from; they failed to
seek further information, which led to conflict. It could even have harmed the business. Zion’s quick, decisive
action averted a potential crisis. Even better, he used his critical thinking skills to produce a report and web article
discrediting the disparaging claims.
Let us also not forget the councilors and local residents who fell for the disinformation, or the protestors, who must have been ashamed when they realized they had disrupted something that fit with their values rather than opposed them. Surely they would rather have spent their time and energy protesting something genuinely objectionable?
Action Steps
1. Media Literacy Practice
Perform a general web search for a topic of interest and assess two of the resulting articles or webpages using
CRAAP and lateral reading. Notice whether the two approaches give you different impressions of the sites,
perhaps even leading to different conclusions.
2. Deep Dive
Choose a news story that interests you; perhaps it relates to your business or personal concerns. It could be one
that you found during Action Step 1. It should be sufficiently complex and mainstream for you to find at least four
different sources. Research the information these sources report, aiming to be as diverse as you can. For example,
look for left and right-wing sources from both the mass and social media. Chart on a piece of paper what they
agree on and what they disagree on. Can you see different 'facts' reported? What about word choices indicating
bias? You can repeat this exercise in the future if you want to assess another news story in depth.
Summary
This chapter’s story showed how fake news concocted by extremists snowballed and nearly spelled disaster for a
company and a community. This fictional story’s message was serious: many real-world fake news stories have
had terrible consequences. The few examples given here should give you an idea.
The mass media is not immune to getting things wrong. Still, even journalistic outlets vary in the quality standards they set for themselves, so it is important to apply your critical thinking skills here too. Three of the characters in the story displayed great analytical skills in picking apart the mess of blogs and social media posts that led to the misinformation problem. They presented their findings rationally and calmly, which defused the situation. In the end, this positive outcome could even have enhanced the company's reputation.
Next, we move on to look at how others try to deceive us to our faces and how we can sort the truth from the lies
in these everyday situations.
Takeaways
1. To separate sense from nonsense in mass media and social media, we need to apply the rules of logic and use
our own expertise.
2. We need to be alert to fake news, which is deliberately concocted to fool people, and not confuse it with real
news, satire, editorial opinion, propaganda, or advertisement [69] .
3. Take a skeptical approach even if the story feels true, and beware of ‘news’ that seems too extreme to be true.
5
After reading the fine print, Alicia decided she was happy with the terms of the business loan.
She had recently met Aaron Lowen, a business development consultant from Howsville, the next town along
from hers. He strolled into her ice-cream parlor and quickly persuaded her to open another cafe in Howsville. She
refused at first: she liked running a single-site business, and her customers found it charming to buy ice cream
from a family-run concern. The expansion was too risky.
However, Aaron insisted.
“They don’t even have real gelato in Howsville! With this artisan Italian ice-cream, you’ll make a fortune! I
promise you, there are no decent ice-cream cafes at all.”
A smile flitted across Aaron’s face. Quickly, he looked serious again.
“Is this really a good opportunity?” Alicia asked.
“Yes, definitely,” Aaron grinned.
Alicia noticed a strange wobble of the head, but thought no more of it.
So here she was: opening a new café. Once the loan was in place, it was all hands on deck.
However, Aaron had not been a hundred percent truthful. A local gourmet ice-cream company was running trucks
and pop-up cafes across town, and they had no qualms about targeting her new store. Sometimes the local kids
would even come in to criticize her product:
“Not as good as Toni’s.”
The trouble at the new branch rapidly damaged the entire business. It seemed time to cut her losses. Then,
vandals broke into the new café. They wrecked the displays and littered ice-cream and toppings everywhere.
Alicia closed for the day, and her employees cleaned up while she called the police and the insurance company.
This was almost the end of the whole company, but Alicia smiled and kept going. Her sister-in-law gifted her
some of the profit from her own business, which kept Alicia afloat for a while. Sadly, the new café was still not
viable, so Alicia decided to close down.
On the last day, they organized an ‘everything must go’ event, with half-price ice-creams for all the local high
school and college kids. Late in the afternoon, this turned into free ice-creams for all.
Alicia confided in a middle-aged lady who was enjoying a cookie and cream cone. The lady was sympathetic:
“It’s very sad, but Aaron from Toni’s has such a good grasp of the local business environment and so many
friends and contacts in the town. You were brave to compete with him.”
“Aaron who?”
“Aaron Lowen, our local entrepreneur. He’s involved in most of the businesses in town, and even wants to open
up in your town as well. Can I get some white chocolate sprinkles with this?”
In a flat tone, Alicia directed her to ask at the counter. So Aaron had lied.
Finally, she had found the missing piece of the puzzle: Aaron was deliberately trying to put her out of business,
and it had almost worked. He had almost cost her everything. If Aaron had succeeded, he would have been the
number one ice cream seller in both towns!
She had to applaud his audacity: pop-up ice cream cafes and trucks, rather than fixed premises, meant she had not
discovered that there was already a popular artisan ice cream maker in Howsville. So she was back to her initial
position, but it could have been much worse.
A few months later, things had improved. Sympathetic locals who heard about the diabolical deception flocked to
Alicia’s home town cafe. It was a warm spring, so she added two bicycle-based ice cream sellers. All this led to
record sales, as well as bad publicity for her rival Aaron.
Alicia was intelligent and successful, but she missed the signs of deception. Aaron gave away some clues: the
quick smile that flitted across his face when he claimed there were no ice-cream cafes in Howsville, and the head
wobble when he confirmed that it was a good opportunity, betrayed his real opinions. He promised something that
sounded too good to be true, and he appeared trustworthy, using his expertise as a business consultant to add
credence to his claims.
Alicia noticed these clues but did not know how to interpret them. She did not know that even accomplished liars
reveal themselves occasionally, as the human body and face express our emotions even when we work hard to
suppress them.
Most people are basically honest, but one deliberate deception could potentially cost us a lot. Therefore, as well as examining claims and evidence in detail and being skeptical about ideas, we need to look at other clues that can tell us if somebody is misleading us, whether unintentionally or deliberately.
There is no single clue that tells us somebody is lying. Instead, we must draw tentative conclusions based on as
much evidence as we can find [80] .
We can apply critical thinking to the content of what people say, and when we interact face to face, there are
several additional sources that can give us clues as to whether somebody may be lying. Liars can accidentally
reveal the truth by leaking information or emotions they are trying to hide. A few behaviors might clue us in (but
note that these behaviors rarely reveal the content of the lie).
Liars sometimes work hard to conceal strong emotions, and we can detect this cover-up by gathering evidence from their faces, bodies, and voices, noticing, for instance, when somebody seems panicky. People who tell the truth expect others to believe them, so they appear more relaxed [81] .
The words used provide the first set of clues. Three ways that somebody’s words can suggest deception are:
making errors in repeated facts; slips of the tongue (stating the real fact or situation by mistake); and saying far
too much. In the latter case, you might identify a liar because their explanation is overly elaborate and detailed,
suggesting they are trying desperately to convince you.
However, liars focus on faking their words and facial movements, whereas their voice and bodily movements are
less easy to falsify [79] . Scientists have studied interpersonal signals from faces, voices, and body language extensively in lie detection.
Many people believe that the eyes give away true feelings in terms of interpersonal signs, and liars often
deliberately try to appear truthful using the eyes, such as making plenty of eye contact. However, the impression
conveyed by eye gaze differs across cultures. Some regard direct gaze as disrespectful, which may affect
suspicion of guilt when police officers arrest or question people of different ethnicities [80] . Because eye gaze is
so deliberate, it is perhaps a poor indicator of somebody's inner feelings.
Blinking is more spontaneous than eye gaze, and pupil dilation is not under conscious control. Therefore, these provide more reliable signals of genuine feelings. Changes in blinking frequency and widened pupils could signal the emotional arousal associated with deception, but this is inconclusive, since they are signs of general emotional arousal. Pupil size was the best indicator of lying in a meta-analysis comparing various signs of tension in liars [81] . Additional bodily signs of tension include sweating, a pale face, and flushing, but these, too, are general signs of arousal rather than specific indicators of deception.
Nobody Is Immune
Everybody is susceptible to believing lies and half-truths [79,81] . Even the experts get it wrong sometimes.
People are overconfident in detecting lies from the face and voice, which acts as a barrier to finding the truth.
Faces, in particular, are used for communication and expression, so any expressions read from the face need not
reflect how somebody really feels. Further, communication via the face varies in different cultures and also among
individuals. We cannot extract any solid rules for detecting a liar from their face or voice with so much variability,
although they do provide some clues. We would be better off using evidence and trying to establish the facts [86] .
Interestingly, one study suggests that trained police officers may be no better or worse at detecting lies regardless of whether they focus on the content of the lies, the person’s face and voice, or their body language. Their accuracy was 50/50: they may as well have flipped a coin. This illustrates that even trained professionals might
be fairly poor at working out when somebody is lying, despite high confidence. A similar study showed that
practice in itself improved people’s performance, but instructions to attend to certain cues (face, voice, and body
language) had no effect [84] .
One reason people get it wrong is that they assume that others are honest [68,79,86] : we believe others by default.
Most of us tell the truth most of the time, so the bias towards belief normally leads to correct conclusions and
better cooperation. We rarely question others’ honesty unless something makes us suspicious; however, there is
evidence that many people are poor liars even when they attempt to deceive [86] .
Evidence suggests that lying is not particularly common. A study that asked people how many lies they told over a day found that most people reported no lies at all, the vast majority reported no more than one or two, and a small number of ‘prolific liars’ accounted for half of all lies reported. You might suspect that people lied to the researchers, but that would
not explain the distribution of the responses. Further, hundreds of people participated in three separate
experiments, adding credence to the conclusions [87] .
When people reflect on how they have detected lies successfully in the past, their answers point towards two
strategies: comparing the lie to the available evidence and persuading the deceiver to confess. Obtaining evidence
relies on getting contextual information around the lie, so you are likely to be better at detecting lies within your
own domain of expertise. Surmising that somebody has a motive to lie raises detection accuracy to almost 100%,
and it is useful to use probing questions. Again, experts are better at this [88] .
If you suspect somebody is lying to you, encourage them to talk. Get the person to repeat their story and listen out
for factual errors and inconsistency [80] .
We all need to be aware of our own motivations, emotions, and preconceptions and do our best to avoid letting
these color our perceptions of others. Overall, it is difficult to decode when somebody is lying to us. Luckily, in
the case of our social relationships, minor lies are often inconsequential or even positive. However, modern life is
full of scams and other deceptions, which could be very damaging.
Outside of straightforward scams, real businesses and organizations sometimes engage in trading practices that
may be illegal or at least dodgy [93] , for example:
Fake reviews and testimonials: these are common on online marketplaces. Sometimes celebrity testimonials are used, and it is difficult to tell whether the celebrity gave their permission.
Unfounded predictions and promises: this may be illegal if the company knew a specific claim was untrue, but fanciful advertising claims are usually allowed.
Bait advertising: this is when a company advertises a product for sale but does not have a 'reasonable supply.' The bait product lures people in; then the seller persuades them to buy something else.
Misleading guarantees, conditions, or warranties: for example, a seller cannot make you take an extended warranty, but the salesperson might try to imply this; this con relies on customers not knowing the details of the business' legal obligations.
With so many companies and individuals trying to make money from us, it is sensible to keep in mind that if
something seems too good to be true, it probably is. However, it would be cynical and destructive to apply this
attitude to our everyday interactions and relationships. Remember, there are only a few prolific liars around, and
they are probably busy running online scams.
Action Steps
Now that we have looked at how to use critical thinking and evidence to spot lies and deception in everyday life,
it is time to apply some of this knowledge. Try the following action steps.
1. The Lying Game
Play a game of lie detection with somebody close to you. Each of you can prepare a handful of lies and truths that
you will try to convince the other person are true. Remember this is a fun learning exercise, so use humorous or
innocuous facts about yourselves that the other person does not necessarily know. Use some of the techniques
covered in the chapter to convince them and try to detect the lies correctly, and have a conversation afterward
about how it went.
2. Proof Of Lies
Try some of the techniques for spotting a liar. Find an online video from a few years ago of somebody you know
is lying because someone else exposed them or they confessed. This could be from politics, an interview with a
public figure, or a televised court case. Watch the video in slow motion and look out for some of the signals we have examined in this chapter, such as fleeting micro-expressions, emblem gestures, and bodily signs of tension.
You could then do the same but listen for any acoustic signals, such as raised pitch and frequent hesitation,
perhaps comparing their verbal behavior to an example when you know they are not lying.
Summary
In the story at the start of this chapter, it turned out that the business consultant had seriously misled the business
owner: the rival company was a serious threat to her business’ expansion after all. How could she have picked up
on this?
Unfortunately, there is no surefire way to tell if somebody is deceiving you, especially if it is somebody you do
not know well. However, Alicia could have checked the facts: did the other neighborhood have real gelato? Was
the promise that there were no decent ice cream cafes in that town too good to be true? The deceiver also showed
a possible micro-expression (a fleeting smile at an odd time) and an emblem gesture when he slightly shook his head, possibly revealing that he was saying the opposite of the truth. She might have been able to figure it out, but
perhaps assumed that this man was telling the truth because most people are honest.
In the next chapter, we will explore what some people might call a special category of scam. We look at
pseudoscience and how to distinguish it from real science and technology.
Takeaways
1. Tune into the visible and audible signs of potential deception: you can learn them through careful observation and practice. However, you need to apply critical thinking to what people say and pair it with keen observation to get closer to the truth.
2. There is no sure-fire way to detect lies, but knowing the person or establishing a baseline will help. Even a host
of behavioral clues cannot prove that somebody is lying.
3. People believe others by default, and research suggests this is warranted as most people are honest.
4. Selling products and ideas is perhaps the exception; scams and frauds are sadly very common, but you can
detect them and overcome them using a skeptical, analytical approach.
6
When Marlon’s Mom moved to her retirement apartment, he noticed something he found strange. The apartment was spotlessly clean, but they found the same small object in every corner of every room and
window recess.
Marlon assumed that the previous resident must have gone crackers. He or she had stashed a horse chestnut in
every corner they could find. The removal men carried on moving his mom’s possessions in, whistling happily as
they wedged the large couch into the small living room. Marlon heard a tiny wood-like object roll along the floor
underneath the couch.
“Excuse me, guys,” he said. “I don’t think Mom wants those chestnuts everywhere. Can you put them in the trash,
please? ”
The two assistants put down an oak dresser and looked to their foreman for guidance, but Marlon’s Mom
interjected before he could say a word.
“They’re fine, gentlemen. Please carry on,” she said to the removal men, giving Marlon a pointed look.
As the removal men carried on, Marlon looked to his Mom in confusion.
“Isn’t it odd that they just left these chestnuts everywhere? Why don’t you want them thrown out?” he asked.
His Mom gave him a superior look.
“They keep the bugs away, Marlon. It’s a tried and tested natural remedy. I would have thought you would
approve.”
Marlon could not help but burst out laughing, but his mother was clearly serious.
“Proven? Who proved it?” he asked once he had his breath back.
“Not your new-fashioned scientists. Housewives have known about it forever. Spiders are scared of the fumes
they give off or something like that. My Grandmother taught my mother, and my mother taught me. Did you ever
see a spider in my house? I thought not.”
Marlon was sensible enough to mumble in agreement and then drop the conversation. He had to admit he had
never seen a spider in his Mom’s house, but she spent an awful lot of time dusting .
In fact, science has found no evidence for Marlon’s Mom’s belief that horse chestnuts deter spiders [94] . This
particular erroneous belief is benign, but it illustrates the point that sometimes people simply believe in received
wisdom. Marlon’s mother believed her home was spider-free because of the chestnuts, exhibiting confirmation
bias. Still, as Marlon’s inner voice hinted to him, the lack of spiders was more likely due to her constant cleaning.
You might conclude that the mother’s belief in the chestnut deterrent was a harmless superstition, but are all
superstitions harmless? Where it gets more debatable is the case of pseudosciences. These are more complex and
far-reaching than superstitions; they involve entire belief systems.
A pseudoscience is a collection of beliefs or practices that people mistakenly regard as scientific. Sciences
challenge their own claims and look for evidence that might prove these claims false through systematic
observation and experiment. In contrast, pseudosciences aim to look for evidence that supports their claims,
seeking confirmation rather than falsification.
What Is Pseudoscience?
Now that we have a clear definition of what science is and its methods, we need to define pseudoscience. As the
prefix ‘pseudo’ implies, pseudoscience refers to beliefs and activities that might resemble science but are not
science [12]. We call them ‘not science’ because they diverge from mainstream scientific theory, and in some
cases, scientific methods cannot test them either [97].
The line between science and pseudoscience is not always clear, though. Investigators working in pseudoscience
are free to employ hypothesis-testing and scientific techniques to examine evidence and conclusions. But they
sometimes commit mistakes and produce misinformation in the process, and end up presenting incorrect
conclusions.
Examples of pseudoscience:
Alternative medicine: alternative therapists sometimes fail to specify how the therapy works, or they make general references to things like the energy that the practitioner harnesses or directs into the client's body. It would be difficult to devise an adequate control situation to compare to these therapies. Pseudoscientific therapies often rely on hearsay rather than clinical trials, and this can be subject to confirmation bias and the hasty generalization fallacy [1] .
Psychic powers: many people across the world believe in supernatural powers like extra-sensory perception and clairvoyance. Believers and scientists alike find it difficult to test these ideas, and although many have tried, the evidence is inconclusive [1] .
Astrology: predicting people’s personality traits and future events from the positions of the stars, the Moon, and the planets is another ancient practice that appeals to people across the world. The predictions are vague and often not falsifiable, and therefore have not been tested in a rigorous way like scientific theories [98] . Investigators have found no correlation between star signs and personality traits [99] .
We should not confuse folk remedies and young sciences with pseudosciences. Be skeptical of ancient traditions:
they might work and might not, but age alone does not imply efficacy [95] . We should also be open-minded about
young sciences while establishing their methods and world views, although further scientific investigation may
falsify them. One example is germ theory, which the scientific establishment thought was implausible at first, but
further investigations confirmed that microbes, not foul air, caused diseases [61] .
People and communities hold biases, but so do scientists. The history of science shows that socio-cultural contexts
affect how scientists work, despite their drive to be bias-free. For example, the mistaken idea that the surface of
the human brain looked like the gut influenced early scientific drawings of the brain, even though the artist could
see a real brain in front of them [95] .
Action Steps
1. Detective Work
Make a brief list of possible pseudosciences and use your skills to gather evidence and decide whether you think
they are real science or pseudoscience. If you need ideas, choose a couple of these:
Iridology
Mycology
Homeopathy
Neurochemistry
Geomorphology
Macrobiotics
2. Study Skills
Devise a scientific theory within your field of expertise, and plan an investigation. This could be something work-
related, within a leisure pastime (such as sports or creative work), or something silly and fun. Whatever you
choose, aim to be thorough. It is fine if you cannot conduct the study for real, for example, if it would demand time or resources you do not have, or raise ethical issues.
Work through the general scientific method to hone your idea and generate something you can test. Make casual
observations, formulate a research question, narrow this to a testable hypothesis, and consider how you would
analyze the data. If you are not a statistical expert, never fear - you can always draw a graph and compare the data
visually.
Finally, consider what valid conclusions you could draw from different results. Congratulations, you have just
proved you are not a pseudoscientist!
Summary
In the anecdote at the start of the chapter, we met Marlon, who was confused by his Mom’s insistence on keeping
horse chestnuts in the corners of her apartment. She said this was a well-known way of keeping spiders out of the
home, but she could not explain why horse chestnuts put them off. This vague explanation is similar to
pseudoscience: people might believe something works, but they do not know why.
Marlon’s Mom believed the practice worked because it was traditional, and she also exhibited confirmation bias.
Even Marlon succumbed to it slightly when he reflected that he had never seen a spider in his Mom’s house.
However, such anecdotal impressions are less reliable than objective evidence.
A scientific approach to any idea requires observation, followed by defining a solid research question that you can
test in the real world. This kind of study does not always get done for pseudoscience. In many instances, it cannot
be done because there is no adequate control condition to compare to pseudoscientific practice. Overall, science
and pseudoscience alike provide us with ample opportunities to exercise our critical faculties.
Takeaways
1. Scientific methods and processes are the most reliable ways to explore and find out about the world.
2. However, not everything that resembles science is actual science. Mistakes and misrepresentations in the form
of pseudoscience can tempt people towards incorrect conclusions.
3. Pseudosciences persist for many reasons, including inherent biases, wishful thinking, tradition, and certain
personality traits.
4. Keep an open mind about novel ideas, but remember that some ideas are more useful than others because they
help us understand and predict the world.
AFTERWORD
Marvin lazed on the decking at his lake house, watching the fish whirling around in the clear water. His work
phone vibrated on the kitchen counter, but he let it ring. He knew the call spelled no good for his summer retreat.
Hours later, the evening drew in, and Marvin finally got around to checking his missed calls. He was surprised to
see that his bank had called him and left a message. That was unexpected. He managed to call them on the
number they left and got through to an operator straight away. The line was terrible, but the voice on the other end
sounded urgent.
“I can’t hear you,” said Marvin.
“Give me your password, Mr. Keller,” the crackly voice said.
“Of course…” Marvin duly gave the operator his password and further security details.
Apparently, there was a problem with his account, which meant he had to wire his money to a different account
urgently.
Did you guess? Scammers targeted Marvin and managed to get him to transfer all his funds to them. He did not
even notice until he returned from his lake house to double trouble: his work colleagues had realized he had
ripped them off, and he realized he had gained nothing because he had fallen for a telephone banking scam.
In this example, the protagonist was lax about checking the credentials of the person calling him. The signs were
there, particularly the unscheduled contact and urgency of transferring the funds. His lack of skepticism about the
call ended up costing him a lot.
Separating sense from nonsense is a massively difficult task, not least because potential deceptions bombard us all
the time, almost as if they were waiting in line for us to drop our guards. However, we can get closer to the truth
by applying critical thinking techniques to information we encounter each day. In summary:
Critical thinking approach : this means reasoning logically, using evidence rather than working to justify
conclusions we desire. Gathering information to argue for a predetermined conclusion is easy but wrong. With
critical thinking, we can be sure that our decisions are conscious, deliberate, and based on facts. We must be
clear about the difference between facts, opinions, and claims. We must know about the role emotions play in
human cognition. Lastly, we must seek evidence relating to purported facts, including researching the source of
and reason for any message.
Our complex minds : how our brains work can lead to blurred boundaries between truth and non-truth, or even
getting things completely wrong without being aware of it. Humans are emotional creatures with a drive to
learn from and believe others, so, unsurprisingly, misinformation spreads. Furthermore, biases, fallacies, and
heuristics all have a significant influence on our thinking, often operating entirely outside our awareness.
Scientific skepticism : this is an attitude that can help gauge the truth of claims. Be like a scientist and question
whether a claim you hear can be verified or falsified. Scientific skepticism means overcoming our natural
inclination to process information quickly and automatically, and instead stepping back, slowing down, and really
analyzing what we encounter. Skepticism means doubt, not necessarily disbelief, and it works best with an open-
minded outlook.
The media : social media and the mass media are the major sources of information for the vast majority these
days, but they vary in reporting accuracy. Some information can even be completely false, designed to lure people
in to spend money and/or time on websites run by shysters. Use media literacy techniques like lateral reading to
get a deeper understanding of the information you see in the media, rather than taking it at face value.
Deception : dishonesty is fairly widespread outside of the media, too. Most people are honest enough about the
things that matter, but we would all be wise to stay alert for the signs that people are lying to us. Faces, voices,
and body language all provide clues, but we should pay attention to what they say as well. Similarly, be alert to
the signs of fraudsters using scams like advance payment schemes.
Pseudosciences : these are explanations or techniques that claim a scientific basis or approach, but they are
distinct from sciences in several ways. Science uses a cycle of observation, testing, and refinement of theories and
methods, aiming to advance knowledge in a specific area. In contrast, pseudosciences are often difficult to
test in a truly scientific manner. However, cynics sometimes mislabel emerging science as pseudoscience, so we
should do our best to assess new ideas in an open-minded and skeptical manner.
In conclusion, now that you have the tools required to separate fact from fiction, apply your critical thinking as
well as you can and keep working to develop it. Critical thinking helps you recognize and avoid harmful and
useless thought patterns. It helps you reach better conclusions. It improves the quality of your thinking, raising
your chances of achieving your goals. Good luck!
ONE FINAL WORD FROM US
If this book has helped you in any way, we’d appreciate it if you left a review on Amazon. Reviews are the lifeblood of our business. We read every
single one and incorporate your feedback in developing future book projects.
The most successful people in life are those who enjoy learning and asking questions, understanding themselves and the world around them.
We created the Thinknetic Community so that like-minded individuals could share ideas and learn from each other.
It’s 100% free.
Besides, you’ll hear first about new releases and get the chance to receive discounts and freebies.
You can join us on Facebook by clicking the link below:
1. Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills . Chantilly, Va.:
The Teaching Company.
2. Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human Reasoning In Everyday
Life . New York: The Free Press.
3. The Foundation For Critical Thinking (2019). Critical Thinking: Where To Begin . Available at:
https://www.criticalthinking.org/pages/critical-thinking-where-to-begin/796 (Accessed: 14th December
2020)
4. Taber, C.S. & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American
Journal Of Political Science , 50(3), pp755-769. doi: 10.1111/j.1540-5907.2006.00214.x
5. Stanovich, K.E., West, R.F., Toplak, M.E. (2013). Myside Bias, Rational Thinking, and Intelligence.
Current Directions in Psychological Science , 22(4) pp. 259–264. doi: 10.1177/0963721413480174
6. Russell, J. A. (2003). Core affect and the psychological construction of emotion. Psychological Review ,
110 (1), 145–172. doi: 10.1037/0033-295X.110.1.145
7. Kozlowski, D., Hutchinson, M., Hurley, J., Rowley, J., Sutherland, J. (2017). The role of
emotion in clinical decision making: an integrative literature review. BMC Medical Education , 17(1),
p255. doi: 10.1186/s12909-017-1089-7
8. Rauscher, F.H., Shaw, G.L. & Ky, K.N. (1993). Music and spatial task performance. Nature , 365, p611.
doi: 10.1038/365611a0
9. Nantais, K. & Schellenberg, G.E. (1999). The Mozart effect: an artifact of preference. Psychological
Science 10(4), pp370-373. doi: 10.1111/1467-9280.00170
10. Pietschnig, J., Voracek, M., & Formann, A. K. (2010). Mozart effect–Shmozart effect: A meta-analysis.
Intelligence , 38(3), pp314–323. doi: 10.1016/j.intell.2010.03.001
11. Kahneman, D. (2011). Thinking, Fast And Slow . New York: Farrar, Straus and Giroux.
12. Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense (2nd ed.).
Belmont, CA.: Thomson/Wadsworth.
13. Frank M.C., Vul E., Johnson S.P. (2009). Development of infants' attention to faces during the first year.
Cognition , 110 (2), pp160-170. doi: 10.1016/j.cognition.2008.11.010.
14. Bower, G. H., Monteiro, K. P., & Gilligan, S. G. (1978). Emotional mood as a context for learning and
recall. Journal of Verbal Learning & Verbal Behavior , 17(5), pp573–585. doi: 10.1016/S0022-
5371(78)90348-1.
15. Bower, G. H. (1981). Mood and memory. American Psychologist, 36 (2), pp129–148. doi:
10.1037/0003-066X.36.2.129
16. Heck, P.R., Simons, D.J., Chabris, C.F. (2018) 65% of Americans believe they are above average in
intelligence: Results of two nationally representative surveys. PLoS ONE , 13 (7), e0200103. doi:
10.1371/journal.pone.0200103
17. Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science , 185 ,
pp1124-1130. doi: 10.1126/science.185.4157.1124
18. Russell, J.A. (2003) Core affect and the psychological construction of emotion. Psychological Review ,
110 (1), pp145-172 doi: 10.1037/0033-295x.110.1.145
19. Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in
probability judgment. Psychological Review , 90 , 293-315. doi:10.1037/0033-295X.90.4.293
20. Yap, A. (2013) Ad Hominem Fallacies, Bias, and Testimony. Argumentation, 27 (2), pp97-109. doi:
10.1007/s10503-011-9260-5
21. Walton, D.N. (1987) The ad Hominem argument as an informal fallacy. Argumentation, 1 , pp317–331.
doi: 10.1007/BF00136781
22. Walton, D. (1999) Rethinking the Fallacy of Hasty Generalization. Argumentation, 13 , pp161–182. doi:
10.1023/A:1026497207240
23. Law, S (2006) Thinking tools: The bandwagon fallacy. Think , 4 (12), pp. 111. doi:
10.1017/S1477175600001792
24. Asch, S. E. (1956). Studies of independence and conformity: I. A minority of one against a unanimous
majority. Psychological Monographs: General and Applied , 70 (9), 1–70. doi: 10.1037/h0093718
25. Sternberg, R.J. & Halpern, D.F. (Eds.) (2020) Critical Thinking In Psychology (2nd Ed.). Cambridge,
UK: Cambridge University Press.
26. Fiol, C.M. & O'Connor, E.J. (2003). Waking up! Mindfulness in the face of bandwagons. Academy of
Management Review , 28 , pp 54-70. doi: 10.5465/AMR.2003.8925227.
27. Rosenkopf, L., Abrahamson, E. (1999). Modeling Reputational and Informational Influences in
Threshold Models of Bandwagon Innovation Diffusion. Computational & Mathematical Organization
Theory , 5 , pp361–384 doi: 10.1023/A:1009620618662.
28. Nickerson, R. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General
Psychology , 2 , pp175-220. doi: 10.1037/1089-2680.2.2.175
29. Wilson, T. D., Houston, C. E., Etling, K. M., & Brekke, N. (1996). A new look at anchoring effects:
Basic anchoring and its antecedents. Journal of Experimental Psychology: General , 125 , pp387-402.
doi: 10.1037/0096-3445.125.4.387
30. Ross, L., Greene, D., House, P. (1977) The “false consensus effect”: An egocentric bias in social
perception and attribution processes. Journal of Experimental Social Psychology , 13 (3), pp279-301.
doi: 10.1016/0022-1031(77)90049-X.
31. Marks, G., & Miller, N. (1987). Ten years of research on the false-consensus effect: An empirical and
theoretical review. Psychological Bulletin , 102 (1), 72–90. doi: 10.1037/0033-2909.102.1.72
32. Gilovich, T. (1990). Differential construal and the false consensus effect. Journal of Personality and
Social Psychology , 59 (4), pp623–634. doi: 10.1037/0022-3514.59.4.623
33. Nisbett, R. E., & Wilson, T. D. (1977). The halo effect: Evidence for unconscious alteration of
judgments. Journal of Personality and Social Psychology , 35 (4), pp250–256. doi: 10.1037/0022-
3514.35.4.250
34. Baddeley, A (1997). Human Memory: Theory And Practice . (Revised Ed.). Hove, UK: Psychology
Press.
35. Tulving, E. (1983). Elements Of Episodic Memory . New York: Oxford University Press.
36. Festinger, L. (1957). A Theory Of Cognitive Dissonance . Stanford, CA: Stanford University Press.
37. Miller, M.K., Clark, J.D., Jehle, A. (2015) Cognitive Dissonance Theory (Festinger). In: The Blackwell
Encyclopaedia Of Sociology. doi.org/10.1002/9781405165518.wbeosc058.pub2
38. Little, R.J., D'Agostino, R., Cohen, M.L., Dickersin, K., Emerson, S.S., Farrar, J.T., Frangakis, C.,
Hogan, J.W., Molenberghs, G., Murphy, S.A., Neaton, J.D., Rotnitzky, A., Scharfstein, D., Shih, W.J.,
Siegel, J.P., Stern, H. (2012) The prevention and treatment of missing data in clinical trials. New
England Journal Of Medicine , 367 (14), pp1355-60. doi: 10.1056/NEJMsr1203730
39. Ayton, P., & Fischer, I. (2004) The hot hand fallacy and the gambler’s fallacy: Two faces of subjective
randomness? Memory & Cognition , 32 , pp1369–1378. doi: 10.3758/BF03206327
40. The Editors of Encyclopaedia Britannica (2016). Verifiability Principle . Encyclopædia Britannica.
Available at https://www.britannica.com/topic/verifiability-principle (Accessed January 15, 2021)
41. American Institute Of Physics (2018). Science Strategies Chart Course for Detecting Life on Other
Worlds . https://www.aip.org/fyi/2018/science-strategies-chart-course-detecting-life-other-worlds
(Accessed 1 February 2021)
42. Ayer, A. J. (1936). Language, Truth, And Logic . London, UK: V. Gollancz.
43. Shankar, S. (2017) Verifiability And Falsifiability As Parameters For Scientific Methodology.
International Journal of Education & Multidisciplinary Studies , 7 (2), pp130-137. doi:
10.21013/jems.v7.n2.p10
44. Popper, K. (1963) Conjectures And Refutations: The Growth Of Scientific Knowledge . London, UK:
Routledge & Kegan Paul.
45. Neyman, J.; Pearson, E. S. (1933). The testing of statistical hypotheses in relation to probabilities a
priori. Mathematical Proceedings of the Cambridge Philosophical Society , 29 (4), pp492–510. doi:
10.1017/s030500410001152x.
46. Schupbach, J., & Sprenger, J. (2011). The Logic of Explanatory Power. Philosophy of Science , 78 (1),
pp105-127. doi:10.1086/658111
47. Arditti, J., Elliott, J., Kitching, I. & Wasserthal, L. (2012). ‘Good Heavens what insect can suck it’–
Charles Darwin, Angraecum sesquipedale and Xanthopan morganii praedicta. Botanical Journal of the
Linnean Society , 169 , pp403–432. doi: 10.1111/j.1095-8339.2012.01250.x.
48. Grafman, J. (2000) Conceptualizing functional neuroplasticity. Journal of Communication Disorders , 33
(4), 345-356, doi: 10.1016/S0021-9924(00)00030-7.
49. Liu, D.W.C. (2012) Science Denial and the Science Classroom. CBE - Life Sciences Education , 11 (2)
pp129-134.
50. Sagan C. (1987) The Burden Of Skepticism. Skeptical Inquirer , 12 (1)
https://skepticalinquirer.org/1987/10/the-burden-of-skepticism/
51. Dwyer, C. (2017). Critical Thinking: Conceptual Perspectives and Practical Guidelines . Cambridge:
Cambridge University Press. doi:10.1017/9781316537411
52. Truzzi, M. (1987) On Pseudo-Skepticism. Zetetic Scholar , 12/13 , pp3-4.
53. Snelson, J.S. (1992) The Ideological Immune System: Resistance to New Ideas in Science. Skeptic , 1
(4).
54. Çakici, D., Metacognitive Awareness and Critical Thinking Abilities of Pre-Service EFL Teachers,
Journal of Education and Learning , 7 (5) pp116-129. doi: 10.5539/jel.v7n5p116
55. Flavell, J. (1979). Metacognition and Cognitive Monitoring: A New Area of Cognitive-Developmental
Inquiry. American Psychologist , 34 , 906-911.
56. Schraw, G. (1998) Promoting general metacognitive awareness. Instructional Science, 26 , pp113–125.
doi: 10.1023/A:1003044231033
57. Paul, R. & Elder, L. (2013) Critical Thinking: Intellectual Standards Essential to Reasoning Well
Within Every Domain of Human Thought, Part Two. Journal Of Developmental Education , 37 (1).
58. Wilterdink, N. (2002). The sociogenesis of postmodernism. European Journal of Sociology , 43 (2),
pp190-216.
59. Duignan, B. (2020) Postmodernism. Encyclopedia Britannica ,
https://www.britannica.com/topic/postmodernism-philosophy (Accessed 22 January 2021).
60. Dennett, D.C. (2013). On Wieseltier V. Pinker in The New Republic: Let's Start With A Respect For
Truth. Edge , https://www.edge.org/conversation/daniel_c_dennett-dennett-on-wieseltier-v-pinker-in-the-
new-republic (Accessed 22 January 2021).
61. Kuhn, T. S. (1970). The structure of scientific revolutions (2nd Ed.). University of Chicago Press:
Chicago.
62. Watson, C.A. (2018) Information Literacy in a Fake/False News World: An Overview of the
Characteristics of Fake News and its Historical Development. International Journal of Legal
Information , 46 (2), pp. 93-96.
63. McDermott, R. (2019) Psychological Underpinnings of Post-Truth in Political Beliefs. Political Science
& Politics , 52 (2), pp218-222.
64. Vosoughi, S., Roy, D., Aral, S. (2018) The spread of true and false news online. Science , 359 , pp1146-
1151. doi: 10.1126/science.aap9559
65. Aral, S. & Van Alstyne, M.W. (2011). The Diversity-Bandwidth Tradeoff. American Journal of
Sociology , 117 (1). doi: 10.2139/ssrn.958158
66. Itti, L. & Baldi, P. (2009). Bayesian surprise attracts human attention, Vision Research , 49 (10), pp1295-
1306. doi: 10.1016/j.visres.2008.09.007.
67. Vuilleumier P. (2005). How brains beware: neural mechanisms of emotional attention. Trends In
Cognitive Science , 9 (12), pp585-94. doi: 10.1016/j.tics.2005.10.011
68. Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and
Its Correction: Continued Influence and Successful Debiasing. Psychological Science in the Public
Interest, 13 (3), 106–131. doi: 10.1177/1529100612451018
69. Osborne, C.L. (2018). Programming to Promote Information Literacy in the Era of Fake News.
International Journal of Legal Information , 46 (2), pp101-109. doi: 10.1017/jli.2018.21
70. LaGarde, J. & Hudgins, D. (2018) Fact Vs. Fiction: Teaching Critical Thinking Skills in the Age of Fake
News . International Society for Technology in Education.
71. Niedringhaus, K.L (2018). Information Literacy in a Fake/False News World: Why Does it Matter and
How Does it Spread? International Journal of Legal Information , 46 (2), pp97-100.
doi:10.1017/jli.2018.26
72. Murch, S.H., Anthony, A., Casson, D.H., Malik, M., Berelowitz, M., Dhillon, A.P., Thomson, M.A.,
Valentine, A., Davies, S.E., Walker-Smith, J.A. (2004) Retraction of an interpretation. Lancet . 363
(9411):750. doi: 10.1016/S0140-6736(04)15715-2. Erratum for: Lancet . 1998 Feb 28;351(9103):637-
41.
73. Shearer, E. & Gottfried, J. (2017). News Use Across Social Media Platforms 2017, Pew Research Center
. https://www.journalism.org/2017/09/07/news-use-across-social-media-platforms-2017/
74. Blakeslee, S. (2004) The CRAAP Test. LOEX Quarterly , 31 (3). Available at:
commons.emich.edu/loexquarterly/vol31/iss3/4
75. Fielding, J.A. (2019) Rethinking CRAAP: Getting students thinking like fact-checkers in evaluating web
sources. College & Research Libraries News, 80 (11), pp.620-622. doi: 10.5860/crln.80.11.620
76. Wineburg, S. & Mcgrew, S. (2017) Lateral Reading: Reading Less and Learning More When Evaluating
Digital Information. Stanford History Education Group Working Paper No. 2017-A1 , Available at
http://dx.doi.org/10.2139/ssrn.3048994
77. Edelman trust barometer 2021. Available at https://www.edelman.com/sites/g/files/aatuss191/files/2021-
01/2021-edelman-trust-barometer.pdf
78. Society of Professional Journalists (2014). SPJ Code Of Ethics . https://www.spj.org/ethicscode.asp
[accessed 12 Feb 2021]
79. Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And Marriage . New York:
W.W. Norton.
80. Vrij, A. (2004). Guidelines to catch a liar. In P. Granhag & L. Strömwall (Eds.), The Detection of
Deception in Forensic Contexts (pp. 287-314). Cambridge: Cambridge University Press.
doi:10.1017/CBO9780511490071.013
81. DePaulo, B., & Morris, W. (2004). Discerning lies from truths: Behavioural cues to deception and the
indirect pathway of intuition. In P. Granhag & L. Strömwall (Eds.), The Detection of Deception in
Forensic Contexts (pp. 15-40). Cambridge: Cambridge University Press.
doi:10.1017/CBO9780511490071.002
82. Arciuli, J., Mallard, D., & Villar, G. (2010). “Um, I can tell you're lying”: Linguistic markers of
deception versus truth-telling in speech. Applied Psycholinguistics , 31 (3), pp397-411.
doi:10.1017/S0142716410000044
83. Rockwell, P., Buller, D., & Burgoon, J. (1997). Measurement of deceptive voices: Comparing acoustic
and perceptual data. Applied Psycholinguistics, 18 (4), 471-484. doi:10.1017/S0142716400010948
84. Bull, R. (2004). Training to detect deception from behavioural cues: Attempts and problems. In P.
Granhag & L. Strömwall (Eds.), The Detection of Deception in Forensic Contexts (pp. 251-268).
Cambridge: Cambridge University Press. doi:10.1017/CBO9780511490071.011
85. Knapp, M. (2006). Lying and Deception in Close Relationships. In A. Vangelisti & D. Perlman (Eds.),
The Cambridge Handbook of Personal Relationships (pp. 517-532). Cambridge: Cambridge University
Press. doi:10.1017/CBO9780511606632.029
86. Levine, T.R. (2014). Truth-default Theory (TDT): A Theory of Human Deception and Deception
Detection. Journal of Language and Social Psychology, 33 , pp378-92. doi:
10.1177/0261927X14535916
87. Serota, K.B., Levine, T. & Boster, F.J. (2010). The Prevalence of Lying in America: Three Studies of
Self-Reported Lies. Human Communication Research, 36 , pp2-25
88. Levine, T.R. (2015). New and Improved Accuracy Findings in Deception Detection Research. Current
Opinion in Psychology , 6 , pp1-5 doi: 10.1016/j.copsyc.2015.03.003.
89. Clough, J. (2010). Fraud. In Principles of Cybercrime (pp. 183-220). Cambridge: Cambridge University
Press. doi:10.1017/CBO9780511845123.008
90. Hancock, P. (2015). The Psychology of Deception. In Hoax Springs Eternal: The Psychology of
Cognitive Deception (pp. 61-71). Cambridge: Cambridge University Press.
91. Federal Trade Commission (2020). How To Avoid A Scam . https://www.consumer.ftc.gov/articles/how-
avoid-scam [Accessed 7 February 2021]
92. Citizens Advice (2019) Check If Something Might Be A Scam.
https://www.citizensadvice.org.uk/consumer/scams/check-if-something-might-be-a-scam/ [Accessed 7
February 2021]
93. NSW Government. Misleading Representations And Deceptive Conduct .
https://www.fairtrading.nsw.gov.au/buying-products-and-services/advertising-and-pricing/misleading-or-
deceptive-conduct [Accessed 13 February 2021]
94. Evon, D. (2015) Natural repellent for spiders? Snopes.com. Available at https://www.snopes.com/fact-
check/walnut-and-spiders/ [Accessed 6 February 2021]
95. Harrington, M. (2020). The Varieties of Scientific Experience . In The Design of Experiments in
Neuroscience (pp. 1-12). Cambridge: Cambridge University Press. doi:10.1017/9781108592468.002
96. Gauch, H. (2012). Scientific Method in Brief . Cambridge: Cambridge University Press.
doi:10.1017/CBO9781139095082
97. Bensley, D. (2020). Critical Thinking and the Rejection of Unsubstantiated Claims . In R. Sternberg &
D. Halpern (Eds.), Critical Thinking in Psychology (pp. 68-102). Cambridge: Cambridge University
Press. doi:10.1017/9781108684354.005
98. Narlikar, J. (2005). Astronomy, Pseudoscience, and Rational Thinking. Highlights of Astronomy , 13 ,
1052-1054. doi:10.1017/S1539299600018116
99. Percy, J., & Pasachoff, J. (2005). Astronomical pseudosciences in North America . In J. Pasachoff & J.
Percy (Eds.), Teaching and Learning Astronomy: Effective Strategies for Educators Worldwide (pp. 172-
176). Cambridge: Cambridge University Press. doi:10.1017/CBO9780511614880.026
100. Landrum, A.R. & Olshansky, A. (2019) The role of conspiracy mentality in denial of science and
susceptibility to viral deception about science. Politics and the Life Sciences , 38 (2), pp193-209
101. Lakatos, I. (1978). Introduction: Science and pseudoscience. In J. Worrall & G. Currie (Eds.), The
Methodology of Scientific Research Programmes: Philosophical Papers (pp. 1-7). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511621123.002
102. Bridgstock, M. (2009). Modern skepticism . In Beyond Belief: Skepticism, Science and the Paranormal
(pp. 86-110). Cambridge: Cambridge University Press. doi:10.1017/CBO9780511691676.006
103. Sagan, C. (1997). The Demon-Haunted World . London: Headline.
DISCLAIMER
The information contained in this book and its components is meant to serve as a comprehensive collection of
strategies that the author of this book has researched. Summaries, strategies, tips and tricks are only
recommendations by the author, and reading this book will not guarantee that one’s results will exactly mirror the
author’s results.
The author of this book has made all reasonable efforts to provide current and accurate information for the readers
of this book. The author and its associates will not be held liable for any unintentional errors or omissions that
may be found.
The material in the book may include information by third parties. Third party materials comprise opinions
expressed by their owners. As such, the author of this book does not assume responsibility or liability for any
third party material or opinions.
The publication of third party material does not constitute the author’s guarantee of any information, products,
services, or opinions contained within third party material. Use of third party material does not guarantee that
your results will mirror our results. Publication of such third party material is simply a recommendation and
expression of the author’s own opinion of that material.
Whether because of the progression of the Internet, or the unforeseen changes in company policy and editorial
submission guidelines, what is stated as fact at the time of this writing may become outdated or inapplicable later.
This book is copyright ©2021 by Thinknetic with all rights reserved. It is illegal to redistribute, copy, or create
derivative works from this book in whole or in part. No parts of this report may be reproduced or retransmitted in
any form whatsoever without the written, expressed, and signed permission from the author.