People seem more divided than ever—and more confident in their beliefs. Psychologists have identified a promising antidote that seems capable of easing both problems: intellectual humility. But new work finds that intellectual humility might not always help and, in fact, might even make some of our divides worse. The problem is not with humility itself, but with what happens to our minds when we see ourselves as intellectually humble. We call this problem "The Enlightenment Trap," because when people feel like they epitomize the spirit of intellectual humility, they get stuck in self-conceited thought loops. These thought loops start with pride but eventually feed into an insidious form of prejudice. In this article, we'll explore this phenomenon and outline some evidence-backed strategies to sidestep this all-too-common intellectual pitfall.
The Promise of Intellectual Humility
Historically, intellectual humility has featured prominently in the teachings of philosophers and spiritual leaders. Socrates famously noted, "I neither know nor think that I know." In a similar vein, the Buddha stated, "The fool who knows he is a fool is that much wiser." Not only did the wise ancients model the spirit of intellectual humility, we also attribute modern progress to institutions organized around that principle: In his book "Enlightenment Now," Steven Pinker emphasized that recognizing the fallibility and limits of our own perspective was a central feature of the 17th- and 18th-century Enlightenment.
Psychologists recognize the promise of intellectual humility, and rightly so: given how biased our thinking often is, people benefit from accepting that they could be wrong every once in a while. We're prone to a wide array of cognitive biases, including a strong inclination toward overconfidence. This tendency is associated with many undesirable behaviors, like falling for fake news, making inaccurate forecasts about the future, and harboring partisan hatred. Intellectual humility is good because it helps people think more clearly. As people accept that their beliefs could be wrong, they may also develop the curiosity to learn more. Intellectually humble people seem to forgive more easily, are more open to opposing views, and are more intellectually resilient. But over the course of studying intellectual humility, we may have overlooked two fatal flaws: pride and prejudice.
The Enlightenment Trap: Pride and Prejudice
Humility is at the core of many stories of enlightenment, and enlightenment seems to bring people together, catalyzing communities of collective compassion and thoughtfulness. Humble people seem to accept the flaws and foibles of other people within their community, those who are also striving for enlightenment. But what about people who sit outside those communities, those who seem less “enlightened,” who aren’t striving for humility? Having intellectual humility should make us more accepting of everyone, but some work suggests that it might not. Humility might even serve as a barrier to appreciating others; enlightenment might serve to trap us.
The enlightenment trap is the idea that humility, although initially reducing your biases, can trap you in another kind of bias—self-conceit. And while a bit of self-flattery might not be the end of the world, pride and prejudice are connected, as Jane Austen long ago recognized. Let's explore how. Why would reducing your certainty about your beliefs make you dismissive of others? And how can humility lead to pride when humility is exactly the opposite of pride?
The first step in the enlightenment trap is a good one: you begin to recognize just how much you don't know, and you become more humble. As time goes on, you qualify more of your statements with "I could be totally wrong about this," but you also begin to think, "Wow, I am so intellectually humble!" When other people fail to make these admissions of humility, you start to see them as not humble, as flawed thinkers. You see other people as having overly certain political opinions about issues they know nothing about. And because it seems like everyone else is overconfident, dogmatic, and arrogant, your own admission of humility becomes less sincere. Saying "I could be wrong" becomes a smug way of saying "I'm very right." Ironically, your intellectual humility has swelled your pride, making you more prejudiced as a result. And just like in Jane Austen's novel, this pride and prejudice can stand in the way of flourishing relationships.
We all have personal experiences with the enlightenment trap. Rachel grew up in an Orthodox Jewish home where the main sources of knowledge were the Torah and Talmud. The religious texts had little room for ambiguity—although rabbis debate the minutiae of Jewish law, most of the commandments are set in stone (literally and figuratively). The requirements to keep kosher, observe the Sabbath, and dress modestly were commanded by God, and there was no questioning that. But as Rachel became more educated, she began to adopt a scientific epistemological framework for understanding the world. She also developed more humility—she became acutely aware of just how much she didn't know. In a sense, she had become enlightened. But she also fell into the enlightenment trap: rejecting her upbringing led her to ridicule the family members who continue to hold dogmatically to various nonscientific beliefs. Will used to work in political campaigns, where the main objective was to convince others you were right. Learning about how prone we are to motivated reasoning made him less certain about his political convictions. Perhaps this was a good thing, but he didn't stop there. Compared to his impressive humility, anyone with political enthusiasm looked like an arrogant tribalist.
Our enlightenments caused problems in our relationships, as our humility-induced hubris drove us to view others with condescension. And it's not just us, either. Psychological research backs up the idea of an enlightenment trap, and such traps come in a variety of forms, be they intellectual or spiritual. Compared to a control group, people who participated in yoga or meditation classes were more likely to endorse communal narcissism (the belief that one alone will save the world and that one is the most helpful person of all).
A more convincing illustration of the enlightenment trap is a recent study revealing that intellectually humble people are prejudiced toward more groups than their more intellectually pompous counterparts. Let's reiterate: the people who are apparently more humble—and more accepting of other people's beliefs—may actually be prejudiced against more people. The authors of the paper speculate that intellectually humble people are so prejudiced because they develop a group identity around their humility, which leads them to dislike anyone who isn't in their ingroup—i.e., anyone who is not intellectually humble. So it seems there's at least one more component to the enlightenment trap: group dynamics.
The Humble Group: How Intergroup Dynamics Make It Easier to Fall into the Trap
Do people actually rally around the elusive "identity" of being intellectually humble? Indeed they do, and they have been doing so for some time now. Going back to the French Enlightenment, you'd find these kinds of folks in cafes discussing the power of reason to create progress. They referred to themselves as the "Philosophes," the French word for "philosophers." This group of public intellectuals used reason to undermine the dogmas of their day, from the divine right of kings to religious intolerance. The Philosophes held that a more critical, scientifically oriented populace would be better able to tackle societal dilemmas.
"Once fanaticism has corrupted a mind, the malady is almost incurable" and "the only remedy for this epidemic malady is the philosophical spirit"
- Voltaire, French Philosophe, in Mahomet
The ideas of the Philosophes were taken to extremes by French revolutionaries, producing the Cult of Reason, the Reign of Terror, and the guillotine, and fostering intense hatred toward the enemies of reason. While most enlightenment traps don't turn quite so hateful and violent, the example of the French Revolution shows just how contentious epistemological divisions can become.
Today, you'll find a similar mindset espoused not over croissants in European cafes, but in their modern-day equivalents—Reddit forums and Substack comment sections—where members organize under the umbrella of "The Rationalist Community." While the Philosophes believed strongly in the power of individual reason, the rationalist community (informed by modern psychology) takes intellectual humility a step further. They contend that individually, humans are flawed reasoners, but that if you arrange them in a community whose norms promote self-critical, earnest truth-seeking, people can behave more rationally. Their bonding value is not pure reason, but rather an awareness of cognitive biases. But just as the Philosophes fell into the enlightenment trap, despising the dogmatists of their day, the rationalist community faces a similar challenge today.
Scott Alexander, author of the blog Astral Codex Ten (formerly Slate Star Codex) and one of the most popular members of the rationalist community, wrote about his awareness of this epistemic divide in his blog post "I Can Tolerate Anything Except the Outgroup." In this post, which criticizes many aspects of the political left, Alexander realizes that although his politics are liberal, he identifies more strongly with a different group. The rationalist community, which, as described above, holds intellectual humility as one of its core values, is his real ingroup. Anyone who claims absolute certainty about their beliefs (as many partisans do) is a member of the outgroup.
When our coalitional psychology dovetails with smugness about intellectual humility, we become more divided, and it feels justified. As people hang out with other intellectually humble people more, they might justify their prejudice against people who are not intellectually humble. After all, it’s not like they’re being racist, sexist, or homophobic. It’s not even partisan animosity. They might say to themselves, “I don’t really care what people think, instead I care how people think.” Similar to the way tolerant people justify intolerance of the intolerant, the humble group might justify dogmatically despising the dogmatic.
The enlightenment trap—the tendency to become intolerant toward dogmatic people as one becomes intellectually humble—is easy to fall into and tough to climb out of. And when people begin to identify as part of a group that's defined by its enlightenment, overcoming outgroup hostility becomes even more difficult. We can't stop people from forming groups, nor should we want to. Ingroups provide people with a sense of meaning and purpose. They provide material help and emotional comfort. We have evolved to live in groups, and finding a good community is one of the best things you can do. But when even the humble group resorts to prejudice and animosity toward its outgroup, we might want to reconsider the way we think about the outgroup.
How To Climb Out of the Enlightenment Trap
The enlightenment trap is tough to climb out of because feeling like we're better than other people is psychologically rewarding. But there are concrete steps we can take to deconstruct this narrative. First, it's important to recognize the role of luck and external circumstances in determining whether someone becomes "enlightened." Perhaps we were fortunate enough to be surrounded by thoughtful people. Maybe we stumbled upon some enlightening books. Maybe something about the environment we grew up in induced more humility in us. It could be some facet of our personalities that we had no control over. In a recent survey (manuscript in preparation), we asked members of the rationalist community what they think led them to approach opposing viewpoints with curiosity, and most participants identified factors outside their locus of control. Thus, it might be good to acknowledge that, if circumstances were different, you and your dogmatic friend might find yourselves in each other's camps.
We also want to encourage our enlightened readers to maintain curiosity about others, even those they view as members of the dogmatic outgroup. To use Julia Galef's terminology, we encourage people to adopt a scout mindset. In her book, Galef contrasts open-minded and closed-minded approaches to knowledge as the "scout mindset" and the "soldier mindset," respectively. When it comes to what we believe, soldiers see what they want to see. They believe what's comfortable, what their ingroup believes, or what they wish were true. Scouts, on the other hand, don't have a stake in defending beliefs one way or the other; their primary concern is to survey the territory and return with as accurate a map of reality as possible. Galef encourages us to be more like scouts, which is an admirable aim. But what Galef doesn't mention is that in a community of scouts, the natural thing to do is to demean the soldiers. However, we believe people can overcome this tendency if they make an effort to have a scout mindset about people with a soldier mindset. This could lead us to overcome our pride and prejudice, helping us find a solution to intolerance through intellectual humility.
Enlightenment—gaining new knowledge and developing intellectual humility about all that one does not know—is a good thing. It can lead to a better understanding of reality and create a society that is more oriented toward the truth. But on the journey toward enlightenment lies a hidden trap—the tendency to develop animosity toward those who are not on the same path. We've prompted you to think about the circumstances that make enlightenment an easier path for some, and encouraged you to approach even the most dogmatic people in your life with the mindset of a scout. Though the trap is easy to fall into, we hope that these tools can help you climb out of it and continue on your journey. And should you dogmatically choose to ignore everything we've said, we won't hold it against you.
From Intellectual Humility to Moral Curiosity
We've discussed the promises—and pitfalls—of intellectual humility for bridging political divides. But many of the divides we see between the Left and Right are not intellectual divides; they are moral divides. We aren't disagreeing as much about facts as we are about right and wrong (though there are certainly factual disagreements as well). We disagree about how much say parents should have in their children's education, and about how to balance concerns of justice and fairness. We disagree about whether fetuses deserve moral rights, whether the death penalty is a suitable punishment, and whether people should have the right to terminate their own lives. These are all moral questions. And they might require something like moral humility, as opposed to intellectual humility. Several researchers have begun exploring this direction, with constructs like Moral Tolerance and Receptiveness to Opposing Views. But it might also be fruitful to explore a more active approach to bridging divides—rather than the mere willingness to encounter and tolerate opposing views, we might want to explore methods for stimulating people's curiosity about others' morality. Our work in this area is still in its early stages, but we've seen some promising results—people higher in moral curiosity tend to be less prejudiced toward their political opponents.
Ultimately, we believe both humility and curiosity are likely important for bridging divides, and that it is probably a good idea to focus both on intellectual disagreements as well as moral differences. So when you come across someone who disagrees with you, whether it’s about the facts or their moral implications, consider what you might be able to learn from that person. Try to cultivate humility about your own knowledge and values, as well as a curiosity to learn about the ideas of your interlocutor. And if you do successfully become enlightened, don’t let it go to your head.
I think the problem is confusion arising from word choices: "humility," "prejudice," and "enlightenment." Before I explain that, it would be useful to explain the perspective I'm coming from.
I'm a Pyrrhonist. Pyrrhonism is one of the philosophies of life that arose in ancient Greece. I'm the author of "Pyrrho's Way: The Ancient Greek Version of Buddhism."
Most of what we know about ancient Pyrrhonism is through the surviving works of the Pyrrhonist philosopher, Sextus Empiricus. Like Stoicism and Epicureanism, Pyrrhonism is a philosophy of life, and it involves certain practices. The core practices of Pyrrhonism are about cultivating epistemic humility and an attitude of investigation.
In Sextus' works one can see the kind of outgroup issue described in your article. Most of his attention goes to criticizing the doctrines of competing philosophies and of certain schools of thought. These doctrines are "dogmas" - the transliterated Greek term that is the source for the English term "dogma." The Greek term has a somewhat different meaning than the English term, and Sextus uses it in a technical sense to mean a firm belief in something non-empirical, such as the Stoic dogma that virtue is the only good. So when I say "dogma" I mean it in that sense.
While Sextus usually criticizes dogmas as rash conclusions, when those dogmas are contradicted by empirical evidence, his criticism becomes harsher. The harshest examples are in his book, "Against the Astrologers." Unlike his criticisms of competing philosophies, where he usually argues that the dogmas are rash and disputable, he argues that the astrological dogmas are demonstrably false.
With regard to his own beliefs, Sextus could be said to be a model of intellectual humility. The problem is that "humility" does not describe his attitude about dogmas. While it's true that he is interested in investigating dogmas, and in that sense has a scout mentality, and he's willing to concede that some dogmas may be correct, he's insistent on proving that dogmas are at best unproven or at worst contrary to empirical evidence.
It's not that Sextus is doing something wrong or paradoxical here. It's that "intellectual humility" should not be considered a form of humility, and it should not be called by that term. Sextus doesn't describe what he does in any term related to humility. Rather, he describes what the dogmatists do in terms opposite of humility. Hence Pyrrhonists are not humble; they simply lack the conceit of the dogmatists. In other words, it's not "intellectual humility" but "intellectual non-conceitedness." The absence of conceit is not the same as the presence of humility. There's no self-abasement, no sense of unworthiness, no modest self-assessment.
"Prejudice" isn't correct because that term means coming to judgment without investigation. That's the opposite of what Pyrrhonists do. I think that all that's going on here is simple in-group/out-group issues, and a preference for associating with the like-minded. If someone says they are a Stoic, then that gives one a lot of information that can be used to judge that person. This is not mere prejudice.
The term "enlightenment" is also a problem. Pyrrhonism doesn't use the term "enlightenment," but we do talk about there being an ah-ha experience associated with seeing the world from the Pyrrhonist perspective. Yet to use the word "enlightenment" would imply a hierarchical relationship such as the kind Plato implies with his allegory of the cave. But we have no such enlightenment to give. All we offer is a shattering of dogmatic illusions.
One of the few positions that Sextus Empiricus could be said to take is that good and evil do not exist by nature - meaning that they arise from the judgments of humans. This is a key part of the Pyrrhonist psycho-therapeutic approach. I don't think it should be called "moral humility," though. It's more of a realistic assessment of the flimsiness of all moral judgments.
Great piece.