"Even if they're right, they're wrong"
I use this expression a lot. I think I got it from Penn & Teller’s show Bullshit!, specifically the episode on anti-vaccine activists. It’s a very useful mental tool that I’ve found most people don’t have. The basic tool goes something like:
- You hear some really bad advice based on bad information
- Now instead of just dismissing it, think of how the world would look if the underlying bad information were actually true
- Find that the bad information, in fact, isn’t anywhere near enough to warrant the bad advice

Let’s take some concrete examples:
Vaccines cause autism?
Obviously they don’t, because we have huge studies showing they don’t. But let’s assume for a moment that there is a link, as claimed by disgraced former doctor Wakefield. Should you skip vaccines for your child? No. The risk suggested by anti-vaccine activists is too small to be taken seriously. Vaccines protect against death, brain damage, blindness, and more, to a degree orders of magnitude bigger than the proposed risk. So not vaccinating your child removes a small risk (again: one that doesn’t exist) and adds a risk hundreds or thousands of times bigger. It’s just a bad bet, like putting your life savings on green in roulette. Even if they’re right, they’re wrong.
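The "bad bet" here is just an expected-value comparison. A minimal sketch, with made-up numbers purely for illustration (the real figures vary by disease and vaccine):

```python
# Toy risk comparison with purely illustrative, made-up numbers.
# Suppose the claimed (nonexistent) vaccine risk were 1 in 1,000,000,
# and the risk of serious harm from the diseases vaccines prevent
# were 1 in 1,000 for an unvaccinated child.
claimed_vaccine_risk = 1 / 1_000_000   # hypothetical, for argument's sake
disease_risk_unvaccinated = 1 / 1_000  # hypothetical, for argument's sake

# Skipping the vaccine trades away the tiny claimed risk but takes on
# the much larger disease risk: a net increase in risk.
net_risk_change = disease_risk_unvaccinated - claimed_vaccine_risk
print(net_risk_change > 0)  # skipping vaccination adds net risk

# Ratio of the two risks: roughly three orders of magnitude apart.
print(disease_risk_unvaccinated / claimed_vaccine_risk)
```

Even granting the claimed risk, the arithmetic says to vaccinate; that is the whole point of the tool.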
Chemtrails are a conspiracy to depopulate the planet?
Chemtrail conspiracy theories are funny in that their proponents march together while holding wildly different ideas that directly contradict each other. It’s like seeing ISIS and the Catholic Church march together because they believe in God. It makes sense only when you don’t think about it for a microsecond.
But let’s go beyond that and focus on one version of the conspiracy theory in particular: that chemtrails exist to depopulate the planet. Proponents say chemtrails have been going on since the 50s, yet the population of Earth has risen dramatically in that time. So let’s assume the hypothesis is correct: what is the conclusion? It has to be that there’s a huge global conspiracy that is blatantly failing to get anything done! Maybe we shouldn’t worry about it so much then. They are wrong even if they are right.
Mobile phones cause cancer?
The cancer risks proposed for mobile phones are tiny. And I mean crazy tiny. Crossing-a-road-once-in-your-life tiny. So even if the risk is real, you shouldn’t act on it at all. Having a mobile phone on your person and switched on at all times, so you can quickly call 911, lowers your risk of dying by more than the proposed cancer risk raises it. The debate is already lost because they are wrong even if they are right.
The House version
The character Gregory House from the TV show House uses similar logic. When diagnosing a patient he immediately discards all hypotheses under which the patient would die. Why? Because it doesn’t matter if a hypothesis where the patient dies is correct or not: the outcome is the same as if no treatment were given. It’s similar in that even if it’s right, it’s wrong (or in this case, useless).
As a programmer I’ve often used this kind of thinking to great effect. If a bug looks like it’s caused by some problem you can’t do anything about, discard that hypothesis without spending another second on it. We only need to care about hypotheses that give us an avenue of attack on the problem.
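The debugging heuristic above can be sketched as a simple filter. The hypothesis names here are made up for illustration:

```python
# A sketch of the "discard unactionable hypotheses" heuristic.
# Each candidate explanation for a bug is tagged with whether we
# could actually do anything about it if it were true.
hypotheses = [
    {"name": "cosmic ray flipped a bit", "actionable": False},
    {"name": "compiler bug in vendor toolchain", "actionable": False},
    {"name": "off-by-one in our loop bounds", "actionable": True},
    {"name": "stale cache not invalidated", "actionable": True},
]

# Keep only hypotheses that give us an avenue of attack. Even if an
# unactionable hypothesis is right, it's useless to us, so it's "wrong".
worth_investigating = [h["name"] for h in hypotheses if h["actionable"]]
print(worth_investigating)
```

Like House discarding fatal diagnoses, the filter isn’t claiming the discarded hypotheses are false, only that they can’t guide what you do next.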
Next time you hear what you think is bad advice, try to see how deep the rabbit hole goes. How flawed is the reasoning? If it’s only one level deep, i.e. the advice makes sense given the proposed data, maybe look into the data and check that it really is incorrect. You might learn something!
But if its logic is flawed to the core, point that out. Maybe someone on the fence will hear the crazy for the first time and change their mind.