Some thoughts on reason, falsehood, and emotional need

When David and I arrived at work last Wednesday, our HR manager was in a pretty foul mood. When David asked how she was, she answered “scared. We’ve just voted in a Muslim terrorist as President.”

Now, Barack Obama is neither a terrorist nor a Muslim; in point of fact, he’s a Christian and a long-time member of a Christian church affiliated with the United Church of Christ. But that’s not really what I came here to talk about; in fact, I’m not really here to talk politics at all. I’m here to talk about what makes people believe outlandish things.

There’s a really interesting two-part essay over on Slacktivist about an enduring urban legend surrounding Procter & Gamble, the company that makes laundry detergent and soap and whatnot. According to the urban legend, an unnamed officer of Procter & Gamble appeared on some television talk show some years ago and announced that the company donates part of its profits every year to the Church of Satan.

As with all urban legends, the details are fuzzy and change over time. Sometimes, it was the president of the company; in other tellings, it was the CEO. Sometimes it was Oprah; sometimes, Phil Donahue. The name of the person who appeared on the show and the date the show aired are, of course, never given.

The interesting thing about this urban legend is its total absurdity. It’s trivial to disprove; it can be demonstrated, beyond even a single atom of doubt, that it just plain never happened. Moreover, its utter absurdity would seem to suggest that no reasonable person could believe it.

The two-part essay is worth reading; you can find part one here and part two here.

The essay asserts that, in a nutshell, the folks who repeat this tale, which has been circulating since at least 1980 and possibly before, don’t believe it’s true; instead, they willingly pass on a story they know to be false, and only pretend to believe it’s true. The author asserts:

Those spreading this rumor can be divided into two categories: Those who know it to be false, but spread it anyway, and those who suspect it might be false, but spread it anyway. The latter may be dupes, but they are not innocent. We might think of them as complicit dupes. The former group, the deliberate liars, are making an explicit choice to spread what they know to be lies. The complicit dupes are making a subtler choice — choosing to ignore their suspicion that this story just doesn’t add up and then choosing to pass it along anyway because confirming that it’s not true would be somehow disappointing and would prevent them from passing it along without explicitly becoming deliberate liars, which would make them uncomfortable.

What I want to explore here is why anyone would make either of those choices. In both cases, the spreading of this rumor seems less an attempt to deceive others than a kind of invitation to participate in deception. The enduring popularity of this rumor shows that many people see this invitation as something attractive and choose to accept it, so I also want to explore why anyone would choose to do that.

I think this is a very interesting argument–that those who pass on the story know it to be false, since it would seem that the story is so prima facie ridiculous that nobody could really believe it.

But I don’t think that’s the case. I don’t believe that the people who pass on this story know it to be false but pass it on anyway. Instead, I think the real answer can be found in a comment posted after the end of the first part of the essay, one I’ve been chewing on for weeks now. It offers, I think, a very useful insight into fallacies of all sorts.

The important part, which caused something of an epiphany in my own understanding of the human condition, is this bit:

When an untrue story circulates, it’s generally because it expresses some kind of social unease. There may not be razors in the Halloween apples, but it’s a way of expressing the concern that your precious children are going out knocking on the doors of people who may not wish them well. There may not be rat poison in the Mars bars, but it’s a way of expressing the sense that they’re definitely not good for you. Not every Bridezilla story may be true, but it’s a way of expressing the sense that the wedding industry is too high-pressured and perfectionistic. There may not be Satanic abuse going on at day care centres, but it’s a way of expressing a sense of discomfort at women going to work and leaving their children in the care of others. And so on.

Human beings are inherently irrational. We carry around with us a kind of internal model of the way we make decisions: we are posed with a question, we think about the question, we evaluate evidence to support or refute each of the available options, and then we come to a conclusion. But that isn’t how it works at all.

More often, the decision is made emotionally, on a subconscious level, long before we ever start thinking about it. After the decision has been made deep within the bowels of our emotional lizard brains, our higher-order, monkey-brain reason is invoked–not to evaluate the decision, but to justify it.

One consequence of this emotion-first decision-making process is confirmation bias, a process of selective evaluation where we tend to exaggerate the value of anything which seems to support our ideas, and to devalue or discard anything which contradicts our ideas. It’s a powerful process that ends up making the decisions we’ve already made and the things we already believe all but immune to the light of disproof, no matter how compelling the disproof may be.

I’ve written before about how the brain is really not an organ of thought so much as an apparatus for forming beliefs, and how it has been shaped by adaptive pressure to be remarkably resilient at forming, and holding on to, beliefs about the physical world.

The adaptive pressures that gave rise to the belief engine within our heads don’t necessarily select in favor of organisms that generate correct beliefs, for reasons I talked about there. But the beliefs that we form do serve a purpose, and sometimes, it’s an emotional purpose.

The things a person believes can reveal a lot about that person’s underlying emotional processes. Beliefs often reflect, in a garbled and twisted way, the buried perceptions and the emotional landscape of those who profess them. Even the most outrageous, clearly absurd beliefs can be quite sincere, and otherwise sane, rational people will adopt insane, irrational beliefs if those beliefs serve some emotional function.

Looked at in this way, a lot of patently absurd beliefs begin to make a kind of sense. They’re distorted funhouse mirror projections of an underlying emotion, twisted out of all rational shape, and clung to through a powerful set of mental processes that make them seem attractive, even obvious.

The idea that Obama is a Muslim terrorist, ridiculous as it is, is an expression of an emotional state: “I do not identify with this person, he seems alien and strange to me, and I am afraid that he will not make decisions that reflect my needs.”

The notion, sometimes seen in a few of the more extremist corners of radical feminism, that all heterosexual sex is always rape becomes an expression of an emotional state: “When I have had sex, I have felt disempowered and violated by the experience.”

The idea that the government staged the attacks on the World Trade Center is a twisted-up, garbled expression of an emotional state: “I am afraid that my nation’s government is corrupt and evil, and is willing to resort to any means, however extreme, to achieve its own ends.”

This is why these beliefs are so vigorously resistant to debunking, even when the evidence against them is overwhelming. They are not assertions of fact in the way that many other statements are; they are assertions of emotional identity.

And they cannot be treated as assertions of fact, even though on their face that’s what they look like.

If someone says “New York City is the capital of New York State,” that’s an assertion of fact. It’s easily countered; you can easily show him a map, or point him to Wikipedia, and say “No, the capital of New York State is Albany.” And, if he’s not mentally ill in some way, he’ll probably say “Really? I didn’t know that. Cool!”

But if someone asserts that Procter & Gamble donates money to the Church of Satan, and you contradict him, he’s likely to respond with anger–in a way that he won’t if you tell him that Albany is New York’s capital. That’s because you’re not contradicting his assertion of fact; you’re telling him that his emotional identity is wrong.

And it’s important to understand that even if a particular belief is wrong, the emotional landscape supporting that belief might not be. Procter & Gamble does not donate money to the Church of Satan, but what is that belief an expression of? One likely possibility is that the emotional state beneath it looks something like “I do not trust large, faceless corporations to have my interests at heart, and I am afraid that a society dominated by large, faceless corporations may not be responsive to my needs and my values.”

And you know what? That is a perfectly reasonable feeling to have. There may very well be some truth to that idea, even though the specific beliefs that grow from this soil are twisted and misshapen.

Any attempt to debunk these ideas will never succeed if the debunking does not separate the idea from its emotional foundation. Furthermore, the fact that an idea grows from and is nourished by some kind of underlying emotional reality means that even the most otherwise skeptical, rational person can become attached to ideas that are patently false, and that person’s own tools of rational skepticism may not be able to evaluate, or even see, those ideas.

The challenges this notion poses to skeptics and rationalists are worthy of a post in their own right, and will be the subject of Part 2 of this essay.