02 of ten
Critical Thinking
The art of not being fooled
In 1938, a radio drama based on H. G. Wells’s The War of the Worlds aired in the United States. It described, in news-bulletin style, an alien invasion. It was clearly labelled as fiction. Newspapers the next morning reported widespread panic — listeners fleeing their homes, jamming switchboards, convinced the world was ending.
The interesting part is not that some people were fooled. The interesting part is that the scale of that panic has itself since been shown to be largely a myth. Newspapers exaggerated it to attack the new medium of radio. So the lesson goes deeper than “people were gullible in 1938.” The lesson is that even the story about how gullible people were turned out to be a story we should have questioned.
This is the world you have been born into. Claims travel faster than checks. The same image is shared a million times before anyone asks where it came from. AI now generates words and images and voices that look real. The skill of pausing — of asking is this actually true? — has gone from useful to essential.
The core principles
Distinguish between the claim, the reasoning, and the evidence. When someone tells you something, three different things have to hold for it to be reliable. The claim has to be clearly stated. The reasoning has to be valid. And the evidence has to actually support it. People often hide weak evidence behind strong claims, or weak reasoning behind a confident tone. Train yourself to separate the three.
Notice the difference between explanation and proof. I had a fever and took herbal tea, then the fever went away — therefore the tea cured me. This is an explanation. It is not a proof. Most fevers go away. Whether the tea did anything is a different question entirely, one that requires a comparison group: people with fevers who didn’t take the tea. The mind generates explanations effortlessly; proofs are harder and rarer.
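The comparison-group point can be made concrete with a toy simulation in Python. All the numbers here are invented for illustration — a 40% chance each day that a fever ends on its own, and 10,000 people per group — and the tea is modelled as doing nothing at all:

```python
import random

random.seed(42)

def fraction_recovered_by_day_3(n, daily_recovery_prob=0.4):
    """Each day, a fever has a fixed chance of ending on its own."""
    recovered = 0
    for _ in range(n):
        day = 1
        while random.random() > daily_recovery_prob:
            day += 1
        if day <= 3:
            recovered += 1
    return recovered / n

# Tea is modelled as doing nothing: both groups follow the same process.
tea_group = fraction_recovered_by_day_3(10_000)
no_tea_group = fraction_recovered_by_day_3(10_000)

print(f"Recovered within 3 days (tea):    {tea_group:.1%}")
print(f"Recovered within 3 days (no tea): {no_tea_group:.1%}")
```

Both groups recover at essentially the same rate, so “my fever went away after the tea” is true for most people whether or not the tea does anything — which is exactly why only the comparison group tells you something.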
Watch your motivated reasoning. Your brain has a strong tendency to find arguments for whatever you already wanted to believe. This is not a flaw to be ashamed of; it is how all human brains work. The way to handle it is not to fight it in the moment, but to install a check: when you notice that a piece of evidence is very convenient for what you wanted to do anyway, that is the moment to look extra hard at it.
Steelman before you criticize. Before disagreeing with someone, can you state their position so well that they would say yes, that is exactly what I believe? If you cannot, you do not yet understand it. Most arguments are between two people fighting strawmen. Real progress happens when you can attack the strongest version of the other side.
The first principle is that you must not fool yourself, and you are the easiest person to fool. — Richard Feynman
Common patterns of bad reasoning
A handful of failure modes account for most flawed thinking. Knowing their names helps you spot them.
- Survivorship bias — you study the winners and copy their habits, not noticing that the losers had the same habits and lost anyway.
- Confirmation bias — you remember the evidence that supports your view and forget the evidence against.
- Availability bias — you overweight whatever you can easily picture (plane crashes feel common; car crashes feel mundane, though the latter kill far more).
- Authority bias — a confident expert in one field gets believed about every field, including ones they know nothing about.
- Sunk-cost trap — you keep investing in a bad path because you’ve already invested a lot, instead of asking would I start this today, fresh?
Naming the pattern is half the cure. Once you can say that is survivorship bias, you have created a small distance between you and the trap.
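Survivorship bias, in particular, can be seen in a few lines of simulation. In this made-up model (the 5 a.m. habit, the 5% success rate, and the population size are all invented), success is assigned by pure luck, independent of the habit — yet a study of winners alone would still find half of them sharing it:

```python
import random

random.seed(0)

# Each founder either has the "wakes at 5 a.m." habit or not; success is
# pure luck, completely independent of the habit in this toy model.
founders = [
    {"habit": random.random() < 0.5, "succeeded": random.random() < 0.05}
    for _ in range(100_000)
]

winners = [f for f in founders if f["succeeded"]]
losers = [f for f in founders if not f["succeeded"]]

habit_among_winners = sum(f["habit"] for f in winners) / len(winners)
habit_among_losers = sum(f["habit"] for f in losers) / len(losers)

print(f"Habit rate among winners: {habit_among_winners:.1%}")
print(f"Habit rate among losers:  {habit_among_losers:.1%}")
```

Interview only the winners and you will find plenty of early risers to credit; the losers you never met rise at 5 a.m. at exactly the same rate.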
Practices
- Practice the phrase "How would I know if I were wrong?" Apply it to your own beliefs. If you cannot name what evidence would change your mind, you are not actually believing — you are committing.
- When something circulates online, find the source. Trace it back. Is the original source a study, a news report, a single tweet, a screenshot of a screenshot? Most viral claims dissolve within two clicks of effort.
- Read one piece a week from a perspective you disagree with. Steelman it. Write a one-paragraph summary of their best argument in their own framing. This exercises a muscle that polarized media lets atrophy.
- Keep a "Things I was confidently wrong about" list. Add to it when it happens. Over years it teaches you a kind of humility that no lecture can.
Recommended resources
- Book: Thinking, Fast and Slow by Daniel Kahneman — the foundational map of how minds make and unmake mistakes.
- Book: The Demon-Haunted World by Carl Sagan — a passionate defense of skepticism and the scientific habit of mind.
- Book: Calling Bullshit by Carl Bergstrom & Jevin West — a modern guide to recognizing flawed claims, especially data-flavored ones.
- Course: "Calling Bullshit" on Coursera, by the same authors — free, in video format.
- Site: RationalWiki and the LessWrong sequences — uneven but rich sources on cognitive biases.
Start here
For the next claim you find yourself ready to share, pause and find its original source. If you cannot find one in two minutes, do not share it. Notice the discomfort that pause creates. That discomfort is the skill being built.