In our age of information, knowledge is more accessible than it has ever been in human history. But not all sources hold to the same standard of accuracy. To the skeptic, it may seem easy to discern between truth and fiction, but anyone can be misled. Too often we find loved ones, friends, and perhaps even ourselves believing false information, stubbornly holding fast to claims that seem beyond reason.
But why are we so profoundly susceptible to believing falsehoods? To understand the answer, I dove into the world of science. Here’s what I discovered:
To understand the origins of conspiracy theories, I turned to biology for answers.
It turns out that our bodies are simply imperfect. If you’ve ever torn your ACL, had hiccups, or had a bad back, you’re already familiar with this concept. ACL tears happen because the ligament’s design doesn’t hold up to the strain we put on it. Hiccups occur when your nervous system accidentally triggers a spasm in your diaphragm. Back pain is one of the most common medical problems because the spine is shaped in a way that puts too much pressure on the lower back!
We also have a wide variety of cognitive imperfections. They’re the reason we see optical illusions, why we let emotion override logic, and why we can fall into believing false information.
With our collection of brain bugs, there’s a common pattern: they serve a purpose but also have an unfortunate tendency to get activated in the wrong situations. According to biology, our instincts don’t always act in our best interest.
Our species is pretty darn good at drawing conclusions from our experiences. We can create and find patterns with ease, and we even enjoy seeing and hearing them! One pattern we have expertise with is the causality pattern.
The causality pattern is handy for survival in our wild world. When we see actions that have consequences, we store the pattern and use it as a guideline. If you’ve gotten food poisoning, you’ve experienced a great example of this. You might have had, or still have, an aversion to whatever food caused that terrible day or two of stomach pain. This skill we have is instrumental. It’s our brain’s way of telling us, “Hey, don’t do that! When you did that, bad stuff happened!”
The trait doesn’t always fire in the right situation, though. Sometimes, our brain shifts into causality hyperdrive and sees patterns where there aren’t any. That’s why superstitions emerge, like fearing broken mirrors or knocking on wood. The pattern overdrive can also cause people to create and accept patterns that are entirely false.
Illusory Mind Reading
I’d bet that you think you’re great at reading others’ intentions. At times, this motive-detection skill feels like a superpower. It gives us an advantage when dealing with others; we can often tell when someone is hiding ulterior motives! It also helps in moments of danger. When someone is behaving aggressively, our "up-to-something" instinct kicks in.
This skill isn’t always accurate, though. For example, you might have thought that someone was mad at you because they were acting strange, only to find out later it had nothing to do with you! You might have been certain, and then shocked, that it wasn’t about that sorta-rude thing you said 20 minutes before. We tend to overestimate this skill, and that overestimation grows every time we happen to be right.
Like many other species on this planet, we’re hardwired to continually look for and prevent threats. It’s the reason we feel scared every once in a while. We have the instinct to want to survive and, when situations put survival into question, our brains try to nudge us away from them. If you’ve ever felt a tinge of worry when you’re in an unfamiliar dark room, alone on a hiking trail, or speaking in front of a crowd, you know this feeling well.
This skill often saves us, but it can also trigger in unnecessary situations. It’s a symptom of the most common mental illness in the United States: anxiety. Instead of only firing in life-or-death cases, it might fire when there’s a potential for discomfort or even when the outcome is not easily predictable. This glitch, combined with others, can lead people to feel they are in danger, even when the stimuli around them are wholly benign.
Creatures of Category
The “Us Versus Them” mentality is pervasive in our societies. We instinctively like to sort people, places, and things into neat little boxes. It makes complexity much easier to understand. Because of our love of abstraction, we’ve gotten quite good at distinguishing our in-group from the out-group. This can be useful; you’re more likely to find help or acceptance from someone who agrees with you or empathizes with your experience.
This mentality can, and often does, swing to extremes. It’s the reason why citizens of one country might think less of another country, one religion might hate another, and one political party might think the other side is full of morons.
When this glitch goes too far, it results in people thinking, “Anyone who isn’t in my group is actively acting against my best interests” or “Anyone in that particular out-group must be scheming together”.
To discover the appeal of conspiracy theories, I turned to psychology.
Psychologists have a name for these brain glitches: cognitive biases. The list of these biases is expansive, and many are directly tied to accepting false information! According to psychology, our brains are bad at finding unbiased facts and sometimes even prefer falsehoods.
Fear of the unknown is common. It’s why some people are afraid of the dark, why others don’t trust people outside of their towns or countries, and why others hesitate to try new things! When some new terrifying unknown enters our experience, especially in times of crisis, we panic because our epistemic needs are going unmet. The solution many people employ in this situation is a strategy I like to call “the info-snatch”: under duress, we take the gap of unknowns and shove in whatever information we can find to make it whole again.
When we’re panicked, our biases shift into lightspeed. We info-snatch whatever content fits our worldview and relax our sense of logical discretion. Plus, once this period is over, our brains end up stuck with that maybe-questionable information.
We like to feel like we’re in control. When something threatens that, when we suddenly face more rules or feel a problem is beyond our abilities, we push back. Our brains would rather we feel existentially safe than acknowledge the earth-shattering lack of autonomy we are subject to.
A typical result of this bias is mentally downplaying the threat or dismissing it as nonexistent. This reduction might ring a bell for you. In moments where social control is diminished, the effect can look even more intense. When people feel they have lost their position or power, this bias can lead them to wholly reject narratives they dislike, even irrefutable ones. This phenomenon is probably also familiar to you.
We’re a highly social species. There’s a reason belongingness is a level in Maslow’s Hierarchy of Needs! When we feel we finally belong somewhere, we look towards the next level: esteem. The niche we fit into defines us, and for that reason, we deeply desire that our group brings us a sense of prestige. It’s comforting to believe that we’re better than something or someone. When our tribe is in disagreement with some other group, our brains glitch out and make assumptions. We think, “Well, my group is better! The other group must be wrong”. Even when the in-group’s claims are ridiculous to the average outsider, our brains will hold fast to keep our psychological needs intact.
We can see a similar bias when someone experiences ostracism. If you’ve been excluded or reprimanded before, you’ve probably thought that you’re better, cooler, or smarter than the group that wouldn’t let you belong. It’s a defense mechanism to protect your sense of self-esteem. When someone is expelled for their beliefs, it can often harden those beliefs, even if they’re baseless.
The Bandwagon Effect
It’s nearly impossible to be an expert about everything. For that reason, many turn to friends, respected experts, and in-group dogma to fill in the gaps of their understanding. It’s useful to do! Most of us don’t have the time to learn every bit of knowledge out there.
This strategy, though usually helpful, can become a bias. A deep respect for a person or group can cause us to lower our guard when they make new claims.
In a similar vein, we tend to put too much stock in anecdotes. If our experience contradicts the statistics, it’s natural to throw out the data first, because we trust ourselves more than some random study! This bias is heightened when our group’s experiences also contradict the data. It’s easier to believe that a number is wrong than to believe that you or your favorite people are outliers.
We are also predisposed to copy the people around us. Just as seeing someone yawn can make you yawn, seeing others in emotional states causes your brain to empathize. That’s why we cry during sad movies or laugh when we see someone else laugh. If you’re an empath, this affects you even more than the average bear.
We copy strategies and reasoning in the same way. We subconsciously mimic people we see as mentors and advisers, even when theirs might not be the best way to do something! Our brains are subject to a monkey-see-monkey-do mentality.
Veritas Liberabit Vos
Misinformation is a tricky challenge to overcome. Even with the utmost diligence, our brains can still end up sabotaging us. So, how can we fight against false information? The first step is to acknowledge our glitches. Only when we are aware of our own imperfections can we start to determine what we’ve gotten wrong.
“All truths are easy to understand once they are discovered; the point is to discover them.” — Galileo Galilei
Interested in learning more? Here are my sources & inspirations:
Human Glitches & Biology
- Human Errors by Nathan H. Lents
- Scientific Articles by Nathan H. Lents
- Your Inner Fish by Neil Shubin
- Conspiracy Theories: Evolved Functions & Psychological Mechanisms
Psychology & Memetics
- Scientific Articles by Karen M. Douglas
- The Selfish Gene by Richard Dawkins
- Lecture by Laurie Santos: Cognitive Glitches
- Why People Believe Weird Things by Michael Shermer