September 17, 2025 · 8 min read

Rationalization vs Rationality: The P-Hacking Problem

Smart people don't automatically think better; often they just rationalize more skillfully. Here's how to recognize when you're P-hacking your own beliefs.

critical-thinking · rationalization · cognitive-bias · scientific-method · education

Scientists know about P-hacking: when researchers run multiple statistical tests on their data until one shows significance, then present that single result as if it were the only test they ran. It's intellectually dishonest because the appearance of following the scientific method masks the reality of manipulating the analysis until you get the result you wanted.
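To make the mechanics concrete, here is a minimal simulation of that process. It's a sketch of my own, not from any study, and it assumes numpy and scipy are available: measure twenty outcomes that are pure noise, test each against zero, and keep only the "hits."

```python
# A minimal sketch of P-hacking: test 20 outcome measures of pure noise
# and report only the "significant" ones. There is no real effect anywhere.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_subjects, n_outcomes, alpha = 30, 20, 0.05

hits = []
for i in range(n_outcomes):
    sample = rng.normal(loc=0.0, scale=1.0, size=n_subjects)  # pure noise
    t_stat, p_value = stats.ttest_1samp(sample, popmean=0.0)
    if p_value < alpha:
        hits.append((i, round(p_value, 4)))

print(f"'Significant' findings from pure noise: {hits}")
print(f"Chance of at least one false positive: {1 - (1 - alpha) ** n_outcomes:.0%}")
```

With twenty independent tests and no real effect anywhere, the odds of at least one false positive are 1 - 0.95^20, roughly 64%. Report only the winner and you look rigorous.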

Most people recognize P-hacking as bad science. What they don't recognize is that they do the exact same thing with their own reasoning every single day. This is basically confirmation bias with more steps.

The Parallel

Bad Science (P-hacking)

  1. Start with desired conclusion
  2. Run multiple tests until one shows statistical significance
  3. Ignore the tests that didn't support your hypothesis
  4. Present the successful result as if it were your only test
  5. Claim you "followed the evidence"

Bad Thinking (Rationalization)

  1. Start with desired conclusion (usually from emotion or tribal identity)
  2. Search for evidence and arguments that support it
  3. Ignore or dismiss contradicting evidence
  4. Present your conclusion as if you reasoned your way there objectively
  5. Claim you're "thinking critically"

The structure is identical. The dishonesty is identical. The only difference is that with P-hacking, we can sometimes catch it by checking the researcher's methodology. With rationalization, the process happens inside your own head where nobody else can audit it—and where you've become very good at hiding it from yourself.

The Smart Person Problem

Do you know what makes this particularly insidious? Intelligence and education don't protect against rationalization. They make it worse.

Research by Dan Kahan at Yale shows that on politically charged issues, people with higher cognitive skills and scientific literacy are more polarized, not less. Smarter people aren't better at reaching accurate conclusions; they're better at constructing sophisticated justifications for the conclusions their tribe already holds.

This is the exact opposite of what we assume education does. We think teaching people to be smart makes them more rational. But what it often does is make them more skilled at rationalization—better at finding evidence for what they want to believe, better at dismissing counterarguments, better at constructing complex defenses of tribal positions.

A person with a PhD can rationalize more elegantly than someone without formal education, but they're still rationalizing. The complexity of the argument doesn't make it valid; it just makes the rationalization harder to spot.

⚠️ The Paradox: Greater cognitive ability can lead to worse thinking when deployed in service of defending rather than examining beliefs.

How Critical Literacy Made This Worse

In my research for The Perfect Storm, I traced how critical literacy education—well-intentioned efforts to teach students to "question everything" and "find the bias"—inadvertently weaponized skepticism without teaching intellectual discipline.

Students learned to:

  • Question sources that contradict their worldview, while accepting sources that confirm it
  • Find bias in arguments they disagree with, while missing bias in arguments they agree with
  • Deconstruct texts critically, without ever learning to construct their own arguments rigorously

The result is a generation that thinks they're thinking critically because they can tear down arguments. But tearing down isn't the same as building sound reasoning. Spotting others' bias isn't the same as recognizing your own. Being skeptical of opposing views isn't the same as being willing to examine your own position honestly.

Critical literacy taught the weapons of critical thinking without the discipline. It taught people to be critical of everything except their own reasoning process.

The Scientific Method vs Your Method

The scientific method has a built-in correction for rationalization: you must design experiments that could falsify your hypothesis. You have to actively look for ways you might be wrong.

Good scientists start with a hypothesis they're prepared to abandon. They design tests that would prove them wrong if they're wrong. They accept results even when those results contradict what they hoped to find. They actively seek the strongest counterarguments and engage with them honestly.

This is the discipline critical literacy forgot to teach: the requirement to try to prove yourself wrong.
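The statistical expression of that discipline is committing to your analysis in advance. As a sketch (same assumed numpy/scipy setup as above; Bonferroni is just one of several standard corrections): declare all twenty tests up front and tighten the per-test threshold accordingly.

```python
# The disciplined counterpart: commit to all 20 tests up front and apply
# a Bonferroni correction, keeping the familywise error rate near 0.05
# instead of letting it balloon to ~64%.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_subjects, n_outcomes, alpha = 30, 20, 0.05
corrected_alpha = alpha / n_outcomes  # Bonferroni: 0.0025 per test

p_values = [
    stats.ttest_1samp(rng.normal(size=n_subjects), popmean=0.0).pvalue
    for _ in range(n_outcomes)
]
survivors = [round(p, 4) for p in p_values if p < corrected_alpha]
print(f"Findings that survive correction: {survivors}")  # almost always empty
```

Pre-registration does for the analysis what the correction does for the threshold: it removes the freedom to keep searching until something confirms you.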

Real rationality looks like this:

  • Acknowledge your starting intuition or preference
  • Actively seek evidence that might prove you wrong
  • Engage with the strongest counterarguments, not the weakest strawmen
  • Hold positions proportional to the strength of evidence
  • Update beliefs when evidence changes
  • Distinguish between what you know and what you want to be true

Most people do the opposite. They start with a conclusion (often adopted from emotion or tribal identity), search for supporting evidence, dismiss contradicting evidence, engage only with weak versions of opposing arguments, and never update their beliefs no matter what evidence appears.

That's rationalization. It looks like reasoning from the outside, but it's actually working backward from the conclusion you wanted to reach.

How to Catch Yourself

The hardest part of recognizing rationalization is that it feels exactly like rational thought from the inside. Your brain doesn't wave a flag saying "Hey, you're being intellectually dishonest right now!" It feels like you're carefully considering the evidence and reaching a sound conclusion.

You're probably rationalizing if:

  • You can't articulate the strongest version of the opposing view
  • You never change your mind on important issues despite new information
  • You immediately know your position on complex topics without needing to think
  • You feel defensive or angry when your position is challenged
  • You search for evidence to support your view rather than to test it
  • You dismiss contradicting evidence quickly while scrutinizing supporting evidence minimally
  • You focus on attacking the person making an argument rather than the argument itself
  • You can't specify what evidence would change your mind

You're probably reasoning if:

  • You can explain opposing views so well that advocates would recognize them
  • You've changed your mind on at least some important issues when evidence warranted it
  • You sometimes need time to think through your position on complex questions
  • You feel curious rather than defensive when challenged
  • You look for the strongest evidence against your current view
  • You apply the same skepticism to evidence you like and evidence you dislike
  • You engage with arguments on their merits regardless of who makes them
  • You can clearly state what evidence would change your mind

The difference isn't about being smart or educated. It's about honesty—being willing to follow the analysis wherever it leads, even when it's uncomfortable.

ℹ️ The Test: Can you argue the opposing position so well that its advocates would say "Yes, that's exactly what we believe"? If not, you probably don't understand it well enough to dismiss it.

The Credential Trap

One particularly dangerous form of rationalization is what I call the credential trap: smart, educated people believing they can determine truth through pure reasoning because they're smart and educated.

I see this constantly: people with advanced degrees who think their intelligence means they don't need to examine evidence carefully, engage with opposing views honestly, or acknowledge where they might be wrong. They've trained their whole lives to be right, to have the answers, to be the smartest person in the room. Admitting uncertainty feels like failure.

But intelligence without humility is just sophisticated rationalization. You can construct brilliant arguments for false conclusions. You can build elaborate theoretical frameworks that sound compelling but disconnect from reality. You can use your verbal facility to win arguments while being fundamentally wrong.

The credential doesn't make you right. It just makes your rationalization more convincing—to yourself and sometimes to others.

Humility and Honesty as the Cure

The antidote to rationalization isn't trying harder to be smart. It's intellectual humility: accepting that you might be wrong, that your perspective is limited, that smart people disagree with you for reasons that might be valid.

This doesn't mean abandoning your positions or pretending all views are equally valid. It means holding your positions appropriately—with confidence proportional to the evidence, with awareness of counterarguments, with willingness to update when circumstances change.

Intellectual humility looks like:

  • "Here's what I think and why, but I could be wrong about X and Y"
  • "The strongest argument against my position is Z, and here's why I still find my view more compelling"
  • "I used to think A, but when I learned B, I changed my mind to C"
  • "I don't know enough about this topic to have a strong position"
  • "That's a good point I hadn't considered; let me think about it"

This isn't weakness. It's the mark of someone actually thinking rather than performing thought.

And honesty: being willing to admit (at least to yourself) when you're defending a position because it's your tribe's position, because it's comfortable, because changing would be costly—not because the evidence actually supports it.

Sometimes the honest thing is: "I know the evidence points toward X, but I'm not ready to accept X because it would require changing too much about how I see the world." That's at least truthful rationalization rather than disguised rationalization.

Why This Matters

We live in a time of unprecedented access to information and unprecedented tribalization of belief. People can find sophisticated arguments for virtually any position. The question isn't whether you can find smart people who agree with you—you can, for almost anything. The question is whether you're using your intelligence to discover truth or to confirm your beliefs.

The frameworks I focus on—incentive analysis, systems thinking, rational dysfunction—are powerful analytical tools. But they can be used for rationalization just as easily as for genuine inquiry. You can deploy the Incentive Quadrant to justify your existing position rather than to examine whether your position makes sense. You can use systems thinking to construct clever arguments for your predetermined conclusion.

The frameworks are necessary but not sufficient. What makes them valuable is honest application: actually following the analysis wherever it leads, even when it contradicts what you want to believe.

I can't force that honesty. I can provide tools, demonstrate rigorous application, and create space for genuine inquiry. But whether you use those tools honestly or as instruments of sophisticated rationalization is up to you.

The difference between rationalization and rationality isn't intelligence or education. It's the willingness to be wrong, the courage to examine your own reasoning, and the honesty to admit when you're defending rather than discovering.

That's not easy. It's not natural. But it's the only path to actually thinking clearly rather than just thinking you're thinking clearly.


Related in This Series:
Understanding the difference between rationalization and rationality is crucial to applying the frameworks from How to Think vs What to Think: Why I Focus on Frameworks, Not Conclusions. See how these challenges shape my approach in Why I Try to Stay Non-Partisan (And Why That's Harder Than It Sounds).
