Will you bite the hand that feeds?
Will you chew until it bleeds?
Can you get up off your knees?
Are you brave enough to see?
Do you want to change it?
What is the purpose of the human brain? What function does it serve? Be careful; this is a trick question!
If you say “The brain is an organ of thought” or “The brain is an instrument of knowledge” or “The brain is the way we understand the world,” that’s the wrong answer. The correct answer is that the brain is an organ of survival. We have these big brains because they enabled our ancestors to survive; in that sense, they are no different from claws or fur or fangs.
And like all organs of survival, the brain was shaped by natural selection, sculpted by evolutionary pressures that favored the traits that helped our ancestors survive. The big brains we have now were molded and shaped to one purpose: to help small bands of hunter-gatherers survive.
Back in the day, when we rarely lived longer than 20 or 25 years and starvation battled with predation by other large carnivores for the number one spot in “things that killed human beings,” our brains gave us a competitive advantage. They did this in part by acting as engines of belief, allowing us to form models of the world and create beliefs about the world that gave us an advantage.
For example, an early human who observed that if he was upwind of his prey, the prey got away, but if he was downwind of his prey, he could more easily kill it formed a belief: “Staying downwind from the prey makes it more likely that the prey will not escape.”
Of course, other animals know these things instinctively. But the advantage of our big monkey brains is that we do not have to rely on instinct; we can form beliefs on the fly, as we go along, which means we can function in environments our instincts are not prepared to deal with. The brain as an organ of survival allows us to make observations and draw beliefs from these observations, and these beliefs give us a competitive advantage.
These beliefs can be immediate and concrete, such as “If I stick my hand in the fire, it will hurt.” They can make predictions about the future, such as “The sun will rise tomorrow” or “If the days grow longer and the weather grows colder, then winter is coming, and food is about to become less plentiful.” A belief can be negative, such as “If I leap from the top of this tree, I will not be able to fly.”
Having a brain optimized for forming beliefs is important if forming beliefs is your survival schtick. If you think of the brain as a belief engine, which can either believe something or disbelieve it, and if you think of a particular belief as being true or false, it is easy to construct a game theory matrix describing all the possibilities, with two success modes and two failure modes.
Ideally, our brains lead us to believe things that are true, such as “A large leopard is a dangerous adversary,” and to disbelieve things that are not true, such as “I can eat rocks.” But there are two failure conditions as well: rejecting beliefs that are true, and accepting beliefs that are not.
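The two-by-two matrix described above can be sketched directly: a claim is either true or false, and the belief engine either accepts it or rejects it, giving two success cells and two failure cells.

```python
# A minimal sketch of the essay's two-by-two belief matrix: a claim is
# either true or false, and the belief engine either accepts or rejects
# it. Agreement between belief and reality is success; mismatch is one
# of the two failure modes.
def outcome(claim_is_true: bool, believed: bool) -> str:
    return "success" if believed == claim_is_true else "failure"

for claim_is_true in (True, False):
    for believed in (True, False):
        print(f"true={claim_is_true!s:5} believed={believed!s:5} -> "
              f"{outcome(claim_is_true, believed)}")
```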
The failure conditions have survival implications. Believing untrue things and not believing true things can both lead to disaster.
Of the two, though, believing untrue things will, in a small group of hunter-gatherers, usually cause fewer problems than not believing true things. Believing that dancing in circles three times and carrying a magic stone around with you will increase the chances of a successful hunt doesn’t really hurt anything; not believing that staying downwind from your prey is important has a significant survival penalty attached to it.
There’s a strong survival imperative, in other words, to prefer failure by believing something untrue over failure by not believing something that is true. Believing is less expensive than not believing. If a primitive hunter-gatherer eats an unfamiliar food, then becomes sick, it might not be the food that made him sick. But if he believes the food made him sick, and he’s wrong, the consequences are not too great; if he does not believe the food made him sick, and he’s wrong, the consequences can be deadly. The guy who ate some food, got sick, and believed the food made him sick is the guy who survived; today, his descendants give their kids a measles vaccination, and when coincidentally their kids are diagnosed with autism, believe that the measles vaccination caused the autism.
From a survival standpoint, the consequences of not believing something true are worse than the consequences of believing something that is not true. Natural selection, therefore, tends to select in favor of people whose default state is to believe something rather than in favor of people whose default state is to disbelieve something.
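The asymmetry described above can be put in rough expected-cost terms. The numbers below are purely illustrative assumptions, not figures from the text: as long as the cost of rejecting a true belief dwarfs the cost of accepting a false one, a believe-by-default policy loses less on average.

```python
# Illustrative, made-up costs for the two failure modes:
# believing something untrue (a harmless ritual) vs. rejecting
# something true (ignoring the downwind rule and losing the prey).
cost_believe_false = 1     # assumed: minor wasted effort
cost_reject_true = 100     # assumed: serious survival penalty

p_wrong = 0.1  # assumed chance the belief engine errs, either way

# Expected cost of each default policy when an error occurs:
# a default believer risks only the cheap failure mode, a default
# doubter risks the expensive one.
avg_cost_default_believe = p_wrong * cost_believe_false
avg_cost_default_doubt = p_wrong * cost_reject_true

print(avg_cost_default_believe < avg_cost_default_doubt)  # True
```

Under these assumptions, selection favoring the default believer falls out of the arithmetic, which is the essay’s point about why credulity was cheap and skepticism expensive.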
And to confound matters further, humans are social animals. In our earliest days, when our social groups tended to number fifty or a hundred people and leopards were a serious and ongoing threat, to live alone was a death sentence. We depended on the support of others to survive.
But that support had a price. Groups, like individuals, form beliefs. To reject the beliefs of your group was to risk ostracism and death. People who questioned and challenged the beliefs of their tribe often did not survive to pass on their genes to future generations; the ones that were most likely to pass along their genes were the ones who learned to believe what the group believed, even if it was contradicted by clear and available evidence.
And those who were adept at manipulating the belief engines of others–shamans, tribal rulers who convinced others of their divine right to rule–tended to be disproportionately successful at mating and tended to control a disproportionate amount of resources, meaning they tended to pass on their genes most successfully.
The greatest invention of the human mind is not fire, or agriculture, or iron, or the steam engine, or even the splitting of the atom. From the perspective of understanding the physical world, the greatest invention of the human mind is the scientific method–the systematic, skeptical approach to claims about the way the world works.
When a scientist has an idea, he does not believe it, and he does not seek to prove it. Instead, he approaches it skeptically, and he seeks to disprove it. The more the idea resists increasingly sophisticated and vigorous attempts to disprove it, the more faith he begins to put in it. This is why any idea that is not falsifiable is not science.
A corollary of this idea is the notion that physical reality behaves the same way everywhere, for everyone. If a brick falls when it is dropped in Kansas, it also falls when it is dropped in Salt Lake City–and, importantly, it falls no matter who drops it, whether the person who drops it believes that it will fall or not. The physical world does not change itself to conform to human wishes and expectations. A claim that is made about some process that must be believed in order to be seen, such as ESP, is not science.
But skepticism is not innate. It is learned. The human brain has been shaped by natural selection not to be skeptical. It has been shaped by evolutionary pressure into a belief engine that believes things more easily than it disbelieves things. For our ancestors, the penalty for skepticism was very high; those early hominids for whom skepticism came naturally did not live long enough to pass on their genes to us. Our brains evolved to be gullible, not skeptical.
Today, we live in a cognitive and physical environment very different from that of our ancestors. But the machinery of natural selection is slow.
In the modern world, the same four states of our belief engines still apply. We are still predisposed to believe things rather than disbelieve them; and we can still believe things that are true, disbelieve things that are true, believe things that aren’t true, or disbelieve things that aren’t true:
Believing things that are true (success)
Believing things that are not true (failure)
Not believing things that are true (failure)
Not believing things that are untrue (success)
What does this mean in practical terms? Simple. It means that your brain has been hard-wired over hundreds of thousands of years of natural selection to make you credulous. Look at the brain as an instrument of survival, look at natural selection creating pressures to prefer the failure mode of believing that which isn’t true over the failure mode of not believing that which is true, and you end up with people hard-wired from the ground up to be gullible.
Your brain is a tool of survival that works by acting as an engine for creating beliefs. When you form a belief, you get a little squirt of pleasure that lights up the reward circuit of your brain. You’re emotionally rewarded every time you believe something.
At the same time, skepticism and rational, analytical thought do not come naturally. They are not what your brain was optimized for; they are skills that must be learned, not innate reflexes. In fact, they feel unnatural and uncomfortable. Your brain rewards you for accepting beliefs, not for challenging them.
There is good news, however. When you introduce sapience into the mix, things change. Biology is not destiny. Your brain is optimized to make you gullible, but you do not need to be. You can train yourself to recognize that little squirt of pleasure you get when you believe something for what it is–a biological holdover from a time when adopting beliefs quickly and without skepticism had survival advantage. You can train yourself to be skeptical, even though it’s not natural for you.
And the rewards for doing so are great. In a modern world, where people want you to believe that they will transfer THE SUM OF $25,000,000 (TWENTY-FIVE MILLION US$) into your bank account from Nigeria if you give them your bank account information, where emails tell you that you need to update your credit card information or PayPal will shut you down, and where people tell you that viruses and bacteria don’t cause disease and that if you just order magic “balancing powder” ($360 for a 6-month supply) from their Web site you’ll never get sick, credulity is a survival disadvantage, and skepticism is an advantage.
But it doesn’t come naturally. You have to work at it.