At the end of a row of bookshelves in the basement of the library, I found a decrepit book with a decaying cover.
In this book I read an account of a distant and inaccessible land called Erewhon, a place no one has ever visited except the author of the narrative.
According to the author, the people there are peculiar and have many unusual traditions, but one in particular stands out from their history.
At one time, the entire nation was completely convinced by the arguments of a great philosopher, by which he proved that it was morally unacceptable to eat meat. But not only that, he also proved that it was unacceptable to eat vegetables.
His reasoning went like this: since we now recognize that the barbaric habit of eating other humans is unacceptable, we should extend the same prohibition to other animals. Animals are clearly alive and display intelligence; therefore it is wrong to eat them.
Furthermore, for the same reason, it is inappropriate to eat plants of any kind, since they are also alive and display a kind of intelligence. You see, plants have adapted to their environment, and have developed various creative mechanisms to continue to live and reproduce. Therefore they are intelligent. Therefore it is wrong to eat them.
The people of Erewhon were altogether convinced by this impregnable logic, and as a result they universally agreed to abide by these new principles.
The author of the history, seeing as we do the absurdity of this situation, wrote that "young people were told that it was a sin to do what their fathers had done unhurt for centuries." Even though the Erewhonians had eaten both meat and vegetables as long as anyone could remember, they quickly gave up this practice due to the enlightened reasoning of the philosopher.
Unfortunately, they quickly became hungry.
At first, the citizens tried to find loopholes to get around the principles. For example, they agreed that it was OK to eat an animal or a plant that was already dead. Oddly, it was soon found that many of the animals allegedly committed suicide, in numbers previously unheard of.
Still, the Erewhonians were hungry. So another new technicality was developed: it was OK to kill an animal or a plant in self-defense. As you might imagine, there was a proliferation of incidents of unusually aggressive cows and chickens, and even stories of violent corn and tomatoes.
But they were still starving. Finally, at the brink of collapse, the people of Erewhon could take it no longer. They caved on the entire set of principles, realizing that their strict adherence to logic had betrayed them. They needed meat, or at the very least vegetables, in order to survive. Mind over matter could only take them so far—their bodies simply required calories. In hindsight, they regretted that they had been so contemptuous of the customs of their ancestors, and of what their bodies instinctively told them was necessary.
As I mentioned before, this was not the only incident of such unusual events in the history of the Erewhonians. I read of many more such mishaps during my perusal of the old book. I believe the author summarized them aptly in this closing quotation:
"The Erewhonians are a meek and long-suffering people, easily led by the nose, and quick to offer up common sense at the shrine of logic, when a philosopher arises among them...
Indeed I can see no hope for the Erewhonians till they have got to understand that reason uncorrected by instinct is as bad as instinct uncorrected by reason."
How it all came about
Believe it or not, we do the same thing—we consistently fail to properly balance instinct and reason, which causes us to make all kinds of mistakes, both minor and massive.
But first we have to understand the relationship between instinct and reason, and how it all came about:
For most of human history, we relied almost exclusively on instinct and convention. Although the Greek philosophers developed rudimentary systems of reasoning and logic around 400 BC, those methods didn't truly take hold until the scientific revolution of the 1500s.
The rest of the time, we simply followed whatever our bodies told us to do and whatever we saw the people around us doing.
For example, if members of our tribe ate blue berries and not red berries, then it was probably safe for us to do so as well. Mimicry—not creativity—was the key to survival. Only when forced to deviate from the norm, e.g. when all the blue berries were frozen, did we experiment.
And the body automatically reinforced whatever behaviors we saw among others. Grilled meat, as you know, has an extremely appealing smell (unless your body has developed an intolerance for it). Likewise, rotting meat is extremely repulsive to us, as are spoiled milk, human waste, mold, mildew, and most other things that are harmful to us. As they say, "the nose knows."
So the two worked in tandem. Our bodies instinctively desired sex, but our cultural conventions required that we be married first. Our bodies wanted to greedily feed themselves, but our conventions required that we share the food with elders and children first. Our bodies were terrified during the night and wanted to run and hide, but our conventions required that we do the honorable thing and guard our camp.
Where the body was ignorant, convention provided the answer. It was the solution to all kinds of questions: what kind of food to eat, and where to find it, and where to build a home, and how, and who to marry, and when. It also helped us explain the patterns of the weather, and the movements of the stars, and why people get sick, and what happens after they die, and so on.
Some of these answers were incredibly helpful, but others were just plain wrong. For example, we now know that people do not get sick because of an imbalance between their phlegmatic and melancholic humors. And we now know that the earth is not the center of the universe, and that mental illness is not the same as demonic possession.
Nevertheless, reliance on instinct and convention worked pretty well for us, for a very long time.
But it wasn't until we started questioning things that we truly began to flourish. With the advent of the scientific method in the 1500s, all kinds of technical disciplines developed, from agriculture and economics to politics, engineering, and psychology. And human productivity exploded.
Good servant, bad master
But like every other aspect of life, too much of a good thing can become a bad thing. Reason can go too far.
The most obvious example of this imbalance is the story of the Erewhonians above, whose logic led them to conclusions so out of alignment with their instincts and conventions that they were nearly devastated.
But we see it in our culture too. We see it in corrupt politicians and executives and other people in power, who have used rationalizations and justifications to land themselves in morally depraved situations. We see it in unsafe relationships (whether a first date or a marriage), when people have tried to convince themselves that everything is fine although their guts tell them otherwise. We see it in the mass rejection of long-standing traditions and institutions, in favor of the promotion of independence and autonomy for the individual, which has only led to widespread disconnection, confusion, isolation, loneliness, depression, and despair.
All of these instances, and many more, are the result of reason uncorrected by instinct, the exact mistake made by the Erewhonians.
And I am not the only one to comment on this trend. In the past few months, my good friend Taylor Foreman (who is a phenomenal writer) wrote two essays about this very subject.
But it was the latter of the two that piqued my interest, in which Taylor argued that cognitive biases are bunk, so we ought to ignore reason and trust our instincts. While I agree with his sentiment, especially in light of the issues addressed above, I don't think we need to throw the baby out with the bathwater. Reason is still useful. Otherwise we risk regressing to the days when we believed hysteria was caused by the woman's womb.
So, the question arises: how do we balance instinct and logic, mimicry and creativity, tradition and progress?
Cognitive biases are not the problem, but rather the solution. The key is a proper understanding of how to use them as a mechanism through which we apply logic to our instincts. Allow me to explain.
Classic mix-up
Cognitive biases are the common "irrational" mistakes that humans are prone to make. They have been demonstrated time and again, and are so recurrent that we can predict when they will happen.
In short, these are perfect illustrations of when our instincts fail us. And we used logic to find them and name them.
Some are merely fun, while others are fundamental.
For example, "The Cocktail Party Effect" describes how, in a loud room with lots of people talking, you are more likely to notice when someone says your name, even if you are deeply engaged in another conversation. Somehow, our brains automatically tune out all the other words as "noise," but immediately recognize those precious sounds that signal someone is talking about us.
On the surface, this isn't really that practical, so it's easy to discard as pointless trivia, a simple quirk of human nature. But it becomes interesting once you realize its implications:
That we are perceiving with all of our senses all the time, even if we cannot fully register the information in consciousness.
That something inside of us determines which information is relevant or not, filtering out most of it, without us even knowing it.
Therefore, "The Cocktail Party Effect" reveals that we are prone to missing important information while mistakenly focusing on what pleases our egos. This has all sorts of implications for how we make decisions.
And here's another one: "The Strawman" describes how, in a debate or discussion, we are more likely to attack the weak link in another's argument, even if it's not really relevant to their main point. For example, in his essay, Taylor focuses on examples of cognitive biases like "The Ebbinghaus Illusion" that are merely fun, and not really practical for our everyday lives. As a result, Taylor concludes that all cognitive biases are pointless. But that's missing the point. That's a Strawman argument.
Moreover, Taylor makes another common mistake: "Confirmation Bias," which describes how we are more likely to seek out and remember information that confirms our existing beliefs, and to ignore information that does not. In analyzing cognitive biases as a topic, Taylor focuses on only one book, Thinking, Fast and Slow, and seems to draw conclusions based on a superficial reading of it (though he claims to have read it multiple times).
He writes, "The book suggests that because ... your intuition makes all these errors ... it shouldn't be trusted." He also writes, "[The author] laments this in the book: most people can’t overcome biases." The implication being that there’s nothing we can do about it.
But those are not at all the conclusions from the book. The author merely makes the point that our intuitions and instincts can lead us into mistakes. Not that our intuitions should never be trusted, but rather that they should not always be trusted. The distinction is subtle, but important.
In the actual conclusion of the book, the author explains that the solution to avoiding cognitive biases, and in fact the whole point of the book, is simply to understand them and recognize them by name.1 If you are aware of the Strawman argument, and you notice yourself doing it, that's all it takes to correct it.
After reading Taylor's essay, I tried to think of some examples of cognitive biases that were relevant and practical. I was surprised to find that I was able to list over 40 examples of cognitive biases that I use every single day. Far from being just mere trivia, these principles have drastically transformed my interactions with the world and other people. And most importantly, I believe they are the key to properly balancing logic and instinct.
I'm not going to share all of them here, but I plan to in another essay. For now, I'll just explain the biggest and most important one. If you take nothing else away from this summary of cognitive biases, take this one. It can change your life.
The Biggest Bias
It's called "The Fundamental Attribution Error." It is so named because it is fundamental to our behavior, and in fact "forms the conceptual bedrock for the field of social psychology."2
Here's how it works.
When I am driving, and I make a mistake, it's never my fault. This could be for any number of reasons.
I ran a red light-- it's because I'm late! Or that yellow light was too short!
I waited too long after the green light-- I have to answer this text, it’s important!
I cut someone off-- they wouldn't let me in!
And so on. However, when someone else makes a mistake, it's because they are a terrible person. They are either dumb, or selfish, or just pure evil.
They ran a red light-- what a selfish jerk! People like that are dangerous.
They waited too long after the green light-- get off your stupid phone!
They cut me off-- what an asshole!
Can you relate?
In short, "The Fundamental Attribution Error" is this— when we make an error, we attribute the cause of the error to forces outside our control. When others make an error, we attribute it to a character defect. We fail to give them the same mercy and benefit of the doubt that we give to ourselves.
This doesn't just happen in traffic, but also in more intimate relationships—with our coworkers (peers, bosses, and subordinates), our families, partners, friends, and everyone in between. We are always innocent, and they are always at fault. And what's even more insidious is that we don't even realize we are doing it, because it is instinctual. Can you imagine how poisonous this can be to our relationships?
But all is not lost. Our logic has enabled us to identify and recognize this error in our instincts. Once we notice it happening, we can correct it.
To combat this bias, whenever we make a mistake, we admit the ways in which we contributed—we are selfish, we are hasty, we are entitled. And whenever others make a mistake, we consider the ways in which outside forces probably had an influence—perhaps they are late, or they are having a terrible day and everything is going wrong, or maybe they just didn't notice what happened. It's ok. Not everyone in this world is evil and stupid. In fact, most are not.
Conclusion
So finally, what is the solution to balancing reason and instinct?
In fact, Taylor is on the right path, because he writes, "Trusting [your intuitions] is your best bet."
At the end of the day, our instincts are remarkably reliable. They have been honed over hundreds of thousands of years of successful living. And the conventions which lie on top of them are also excellent guides, for they have been honed over thousands of years of civilization and culture. The two work in tandem to correct one another.
Are they perfect all the time? No.
But our cognitive biases reveal the way they consistently fail. When we use our logic in this way to reprove instinct, we get the best outcomes. This is how you “check yourself before you wreck yourself.”
Logic can also be used in other ways to correct instinct and convention, by developing science-based disciplines that reveal heretofore undiscovered truths. But these new findings are rarely the solution to all of our problems. At best, they are incremental adjustments. And often, logic and reasoning can go too far and lead us astray from our instincts. So we can still use them, but we should exercise great caution whenever they seem to contradict instinct and convention.
For example, consider two people who are trying to be healthy, John and Sarah. John relies on the science of nutrition, which tells him that the best way to be healthy is to eat a very specific diet in a very specific regimen. This requires a great deal of time, effort, and sacrifice, and usually makes John an unpopular dinner guest, so he prefers to eat all of his meals at home, meals he has carefully weighed, measured, and scientifically calculated to provide the optimal nutrients.
Then there’s Sarah, who follows her family’s tradition of always eating her meals with other people: her friends, family, and coworkers. At every meal, she engages in conversation, shares about her day, listens to the others, and experiences love and laughter and life. Sarah still pays attention to what she eats, but isn’t obsessed with it like John. Her meals aren’t as efficient, but that’s not really the point either.
In the long run, who do you expect will be healthier?
What does your instinct tell you?
Footnotes:
1. From the conclusion of Thinking, Fast and Slow:
“What can be done about biases? How can we improve judgments and decisions, both our own and those of the institutions that we serve and that serve us? The short answer is that little can be achieved without a considerable investment of effort.”
It seems that there’s nothing we can do… on the surface. But a little later he says:
“The way to block errors that originate [from intuition] is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement.”
So, stop and recognize. And lastly:
“Ultimately, a richer language is essential to the skill of constructive criticism. Much like medicine, the identification of judgment errors is a diagnostic task, which requires a precise vocabulary. The name of a disease is a hook to which all that is known about the disease is attached, including vulnerabilities, environmental factors, symptoms, prognosis, and care. Similarly, labels such as “anchoring effects,” “narrow framing,” or “excessive coherence” bring together in memory everything we know about a bias, its causes, its effects, and what can be done about it.”
And so his conclusion is for decision-makers to frame their actions in the context of being evaluated by an informed group of gossipers—those who recognize cognitive biases by name. This helps the decision-maker enter the mindset of an outside observer, to better see his own biases.
2. In recent years, some doubts have been raised about the scientific validity of the experiments behind The Fundamental Attribution Error. But I think you'll agree that it rings so true that it's impossible to doubt. We’ve all experienced this in our lives.