Critical Thinking Skills for Engineers
BOOK 1: ANALYTICAL SKILLS
by Sridhar Ramanathan
Copying this material in any form is not permitted without prior written approval from IEEE/
IEEE-USA.
This IEEE-USA publication is made possible through funding by a special dues assessment
of IEEE members residing in the United States.
TABLE OF CONTENTS
Introduction
Data Analysis
Information Seeking
Interpretation
Judgment
Questioning Evidence
Recognizing Similarities and Differences
Skepticism
Conclusion
Bibliography
INTRODUCTION

We live in times when facts, claims, opinions, and even data are vying for our attention—ultimately aiming to drive us to some desired action: to purchase something, go somewhere, vote for someone, experience something, build something. If we’re not careful, we end up acting in someone else’s best interests—not necessarily our own.
And if you are an engineer, or in a technical field, then critical thinking is all the more important for delivering the most effective, and potentially novel, breakthrough solution you can. In this book series, we explore the key
aspects of critical thinking: analytical skills, data analysis, interpretation,
judgment, questioning evidence, recognizing similarities and differences,
creativity, communication, and skepticism.
Charles Kettering, the head of research for GM, once said: “A problem well-stated is half-solved.”1 Therefore, it’s critical to ask the right questions, so you focus engineers and staff on solving the right problem(s). Let’s use an example to explore how to frame thoughtful questions. Suppose a VP of Engineering asks: “How do we increase our phone’s battery capacity by 25 percent?”
Before engineers leap to answer that question, it’s helpful first to ask why it’s
important to increase battery capacity. You’ll soon find that the real need is
actually reduced time between charges—and increasing battery capacity is certainly one way to achieve that. Instead of framing the question so
narrowly on battery capacity, the VP might have asked “How do we extend the
time between battery re-charges by 25 percent?”
Engineers and their managers should generate a few more “how” and “why”
questions that point to possible solutions, or even unexpected breakthroughs.
For example:
• How can we encourage users to make small tweaks (e.g., turning down
the brightness, deleting unused apps, etc.) that save power?
• How can we improve the efficiency of the battery itself to hold charges longer?
1 https://www.brainyquote.com/quotes/charles_kettering_181210
• How can our battery supplier squeeze out longer storage times?
• Why couldn’t we use the phone itself to generate power (e.g., maybe a
thin solar cell on the back)?
DATA ANALYSIS

InvestorWords offers a good working definition of data analysis: “The process of extracting, compiling and modeling raw data—
for purposes of obtaining constructive information that can be applied
to formulating conclusions, predicting outcomes, or supporting decisions
in business, scientific and social science settings.”2
What do you conclude from this data? It’s clear the two data sets are highly correlated, but can we prove that the factor in Data Set 1 is actually causing the figures in Data Set 2? No. We cannot.
It’s entirely possible that a third underlying factor is causing Data Sets
1 and 2 to have this response curve. A real-world example of this factor
might be that Data Set 1 is blood pressure and Data Set 2 is blood sugar
level. Does high blood pressure cause blood sugar to go up, or vice versa?
Maybe. What if it’s actually a cup of coffee that is the underlying cause of
both blood pressure and blood sugar going up?
2 http://www.investorwords.com/19279/data_analysis.html
As you can see, the factor behind Data Set 1 is causing an impact in Data Set 2, with a lag time of about four time periods. The shape of the curve is the same, but offset by several time periods. Continuing with the blood pressure
example, Data Set 1 could be the rise of caffeine in the blood system, causing
a rise in heart rate and blood pressure, shortly after drinking the coffee.
As with the previous chapter, the key here is to ask critical questions to test
the strength of your conclusions.
Tip: Test your conclusion by first asking: How do I know this conclusion
isn’t just correlation versus a causal relationship? What other factors
could be behind the data causing the observed results?
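The correlation-versus-causation caution above can be sketched in code. The example below is purely illustrative, with made-up numbers echoing the coffee example: a hypothetical “caffeine” series drives a “heart rate” series with a four-period delay, so the same-time correlation is weak while the correlation at the right lag is essentially perfect.

```python
# Illustrative only: made-up numbers echoing the chapter's coffee example.
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length number lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def lagged_correlation(xs, ys, lag):
    """Correlate xs[t] against ys[t + lag], i.e. ys responding `lag` periods later."""
    return pearson(xs[:len(xs) - lag], ys[lag:])

# Hypothetical caffeine level over 14 time periods...
caffeine = [0, 1, 3, 6, 8, 7, 5, 3, 2, 1, 0, 0, 0, 0]
# ...and a heart-rate series that responds four periods later.
heart_rate = [60] * 4 + [60 + 2 * c for c in caffeine[:-4]]

print(pearson(caffeine, heart_rate))                # same-time: weak
print(lagged_correlation(caffeine, heart_rate, 4))  # at lag 4: essentially 1.0
```

A weak same-time correlation does not rule out causation any more than a strong one proves it; the point of the sketch is simply that checking lags and hunting for a third factor are cheap tests to run before drawing a conclusion.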
INFORMATION SEEKING

There’s a famous passage in “Alice in Wonderland,” in a dialogue between the Cheshire Cat and Alice, which goes as follows:

“Would you tell me, please, which way I ought to go from here?”
“That depends a good deal on where you want to get to,” said the Cat.
“I don’t much care where—” said Alice.
“Then it doesn’t matter which way you go,” said the Cat.
• Who?—Which users seem most frustrated with battery life, and are most
vocal about it (e.g., mobile professionals)?
The third step in information seeking is to build a few stories, or use cases
that bring these data points together into compelling narratives that point
to an action or decision. We won’t create the story here, but the questions
above have already begun to paint a picture in your mind about why and
how extending battery life might be a big win for mobile professionals.
Take a look again at the questions we generated in Chapter 1. Note how
useful they are in focusing our information collection process to solving the
battery life challenge. Remember, we’re not yet ready to interpret these
stories, vet them, and come to a final decision. Those steps come later in
our critical thinking journey.
INTERPRETATION

Once you have a data set—whether it’s qualitative or quantitative data—you
now have information on which to begin formulating some hypotheses, and
even a conclusion. This phase is very delicate in decision-making, because
it’s tempting to leap to conclusions that may not, in fact, be valid. One such trap
is confirmation bias—the conscious or unconscious preference placed on data that
supports a belief, hypothesis, or conclusion you already have in mind.
The VP of Marketing might genuinely believe the rapid growth in revenue was
due to aggressive and wise investment in digital social media advertising.
This conclusion would certainly make the executive look savvy in the eyes of
fellow executives. But what if the company had other data that contradicted
this conclusion? What if, on closer inspection of many more variables affecting revenue, the biggest driver of revenue turned out to be a large technology partner that was sending leads directly to the software company’s sales force, and very few of the deals actually came from the Marketing Team’s
digital ad campaigns? A data set breaking revenue out by lead source could cast doubt on the VP of Marketing’s claim.
• Are there outliers, or data points that I shouldn’t dismiss too quickly, but rather should investigate a little further—for a possible surprising insight?
Tip: To avoid confirmation bias, interpret data with the aim of generating
rich hypotheses, rather than defending a pre-existing belief.
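One way to act on this tip is to let the data attribute outcomes before anyone argues a conclusion. Here is a small sketch; the deal values and lead sources are entirely hypothetical, standing in for the software company’s (unseen) data:

```python
from collections import Counter

# Hypothetical closed deals: (value in $k, lead source). Invented for illustration.
closed_deals = [
    (250, "partner referral"), (400, "partner referral"),
    (120, "partner referral"), (300, "partner referral"),
    (30, "digital ads"), (45, "digital ads"),
    (80, "outbound sales"),
]

# Tally revenue by lead source instead of assuming the answer up front.
revenue_by_source = Counter()
for value, source in closed_deals:
    revenue_by_source[source] += value

total = sum(revenue_by_source.values())
for source, value in revenue_by_source.most_common():
    print(f"{source:>16}: ${value}k ({100 * value / total:.0f}% of revenue)")
```

With these invented numbers, partner referrals dominate revenue, which is exactly the kind of result that should make an interpreter pause before crediting the ad campaigns.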
JUDGMENT

We take for granted that engineers are using their best judgment
when they build airplanes, bridges, dams, buildings, electric grids,
and surgical equipment. Poor judgment could result in loss of life,
so it’s critical that engineers hone their judgment skills. How does one develop
good judgment? To begin with, judgment is making a considered choice.
By “considered,” we mean weighing a number of factors—some of which
may even be in conflict. Let’s look at an example of an engineer in charge of designing an extensive solar panel installation for a large public high school. What
factors should go into the final design? Here are eight to consider:
• Costs—Both the initial purchase price and the ongoing maintenance costs
• Aesthetics—Will the panel fit well with the architecture of the school?
• Future proof—Is the solar panel based on technology that will very soon
be rendered obsolete by a next-generation capability?
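A “considered choice” among factors like these can be made explicit with a simple weighted scoring matrix. The sketch below uses only three of the factors above; the candidate designs, weights, and 1-to-5 scores are invented purely for illustration, where a real design review would derive them from cost estimates and stakeholder input:

```python
# Hypothetical criteria weights (summing to 1) and 1-5 scores per design.
criteria_weights = {"cost": 0.40, "aesthetics": 0.25, "future_proofing": 0.35}

options = {
    "Design A (rooftop)": {"cost": 4, "aesthetics": 3, "future_proofing": 3},
    "Design B (carport)": {"cost": 2, "aesthetics": 4, "future_proofing": 4},
    "Design C (ground)":  {"cost": 5, "aesthetics": 1, "future_proofing": 2},
}

def weighted_score(scores):
    """Sum of weight * score across all criteria (higher is better)."""
    return sum(criteria_weights[c] * s for c, s in scores.items())

# Rank the designs by their weighted totals.
ranked = sorted(options.items(), key=lambda kv: weighted_score(kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: {weighted_score(scores):.2f}")
```

Making the weights explicit also forces the conversation this chapter calls for: stakeholders can debate whether cost really deserves 40 percent of the decision before anyone argues about the winner.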
QUESTIONING EVIDENCE

One of the important dimensions of critical thinking is questioning the
evidence upon which to base an engineering decision or action. Let’s
take a look at an example from astrophysics—where Cecilia Payne-
Gaposchkin, a Harvard Ph.D. candidate, questioned the conventional wisdom
that the sun and earth had similar elemental composition (e.g., carbon, oxygen,
nitrogen, iron, nickel, etc.). In her 1925 doctoral thesis, Payne-Gaposchkin
took a more critical look at the spectral data and the relationship between the sun’s temperature and its spectral class, to arrive at a startling conclusion:
the sun was a million times more abundant in hydrogen and helium than other
elements. While it’s not a surprise to us today to know that the sun is a burning
ball of mostly hydrogen gas, it was absolutely not the prevailing wisdom at the
time. In fact, her advisor and peers summarily rejected her findings. She was
ultimately vindicated five years later, when another astronomer revisited her
conclusions, and endorsed them in his own journal article. It took years for the
history books to give Payne-Gaposchkin the proper credit she deserved. What we see here is that there’s always room for a fresh look at seemingly “conventional wisdom,” based on well-established evidence.
RECOGNIZING SIMILARITIES AND DIFFERENCES

Some of the best science and engineering accomplishments come from
recognizing something that is different from, or out of, the ordinary.
Let’s take a look at another historical example. Stephanie Kwolek was
an American chemist. Kwolek worked at DuPont in 1964, when she discovered
Kevlar—a polymer that was heat resistant, five times stronger than steel, and
lighter than fiberglass.3 At first, Kwolek and her research co-workers thought
the polymer was a mistake, because they had been looking for a substance that would make tires stronger and lighter. The cloudy solution
she created was normally thrown away—because it was different from all the
other possible tire chemicals. But Stephanie saw that this difference could
have very intriguing, important applications. Thankfully, her department head
concurred with Stephanie’s realization that this substance was important.
Soon, Kevlar was commercialized—to great success—in products ranging from bulletproof vests, to tennis racket strings, to parachute cables—and more.
What can we learn from Stephanie Kwolek and from Atlassian co-founders Scott Farquhar and Mike Cannon-Brookes?4 The main lesson is that there is power in differences.
As an engineer, you will be called on many times to build something that
might be similar to what is already on the market. The challenge is to
press hard to build something that is unique, different, and individualized
from competing solutions.
3 https://www.sciencehistory.org/historical-profile/stephanie-l-kwolek
4 Source: Wikipedia on Atlassian co-founders: https://en.wikipedia.org/wiki/Atlassian
SKEPTICISM

We covered the importance of questioning evidence in Chapter Six, but let’s look at the underlying quality that engineers
must embrace when critically examining data and preparing
conclusions: skepticism. Here, we’ll look at the story of Abraham Wald, a Hungarian mathematician who challenged the conclusions of World War II Navy statisticians. The statisticians argued that the pattern of bullet holes in returning aircraft pointed to a need to reinforce metal in those heavily hit parts of the craft—to improve survivability from gunfire.5 Wald was skeptical that this conclusion was the right one. He argued the statisticians were only looking at bullet-hole distributions on aircraft that actually survived their missions—rather than the ones that were shot down—and therefore not available for inspection. So, he proposed a radical conclusion to the Navy: Shore up the metal on parts of the aircraft where there were no bullet holes. Wald reasoned that the surviving planes showed where an aircraft could take a hit and still fly home; planes hit in the unmarked spots were the ones that never returned. As it turned out, he was right. The survival rate of military aircraft went up, and Abraham Wald was heralded as a genius.
• What is the intrinsic bias or limitation of the data that might skew my
conclusions (e.g., note how Abraham Wald was the first to notice that the
military data set did not include the aircraft that were shot down)?
• Does the conclusion seem “too good to be true,” or perhaps based more
on beliefs—rather than hard evidence?
5 Source: Wikipedia on Abraham Wald; https://en.wikipedia.org/wiki/Abraham_Wald
6 Source: Richard Feynman and the Challenger disaster: http://www.feynman.com/science/the-challenger-disaster/
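Wald’s survivorship-bias argument can be illustrated with a toy simulation. Everything here is invented for illustration (four aircraft sections, four hits per sortie, and per-section downing probabilities): hits land uniformly, but engine hits are far more likely to down a plane, so the survivors that analysts get to inspect show comparatively few engine holes.

```python
import random
from collections import Counter

random.seed(42)  # reproducible illustration
SECTIONS = ["fuselage", "wings", "tail", "engine"]
# Invented per-hit probability that a hit to this section downs the plane.
DOWNING_PROB = {"fuselage": 0.05, "wings": 0.05, "tail": 0.05, "engine": 0.60}

observed_holes = Counter()  # bullet holes visible on planes that returned
for _ in range(10_000):  # one sortie per plane
    hits = [random.choice(SECTIONS) for _ in range(4)]  # four uniform hits
    survived = all(random.random() > DOWNING_PROB[s] for s in hits)
    if survived:
        observed_holes.update(hits)  # only survivors can be inspected

# The engine was hit as often as any other section, yet survivors show
# relatively few engine holes -- Wald's reason to armor the "clean" spots.
print(observed_holes.most_common())
```

Reading the survivors’ hole counts at face value would suggest armoring the fuselage and wings; knowing how the sample was selected points the other way.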
CONCLUSION

I hope you’ve found this e-book useful in honing your critical thinking skills—not only for engineering, but for life in general. Practice these skills in everyday life, whether it’s reading a news article, discussing
technical issues at work, or making important life decisions. To recap, I
have reviewed a number of key elements to critical thinking, including:
• Analyzing data, and avoiding the trap of assuming causation when only correlation has been shown
ABOUT THE AUTHOR

Sridhar Ramanathan has thirty years of experience in technology companies, from startups to blue-chip firms. As Managing Director and Co-founder of Aventi Group, a product marketing agency, he
and Co-founder of Aventi Group, a product marketing agency, he
has been instrumental in leading many tech firms through high-growth
phases. Prior to Aventi Group, Ramanathan was the marketing executive for
Hewlett-Packard’s Managed Services business, where he was responsible for worldwide marketing and for managing the portfolio of HP’s $1.1 billion services unit. He also held profit-and-loss responsibility for the electronic messaging outsourcing and e-service business units.
BIBLIOGRAPHY

• http://www.investorwords.com/19279/data_analysis.html
• https://en.wikipedia.org/wiki/Cecilia_Payne-Gaposchkin
• https://www.sciencehistory.org/historical-profile/stephanie-l-kwolek
• https://en.wikipedia.org/wiki/Abraham_Wald
• http://www.feynman.com/science/the-challenger-disaster/