354. questioning bias

Most of us are searching for the right answers because being right helps us orient our lives. Certainty allows us to believe we are making the best possible decisions with the information we have. We want to know the right way to eat, the right way to think, the right way to live. We want to believe that if we gather enough information, follow the right people, or listen to the right authorities, we can arrive at conclusions solid enough to stand on. In a world that feels increasingly unstable, certainty functions like a handrail. It gives us something to hold onto so we can keep moving forward without the constant feeling that everything might fall apart.

The problem isn't that we want answers. The problem is how quickly we forget where those answers come from.

We grow up in a culture that teaches us there are right and wrong answers. In school, in work, in law, and in medicine, the world is presented as something absolute, something that can be mastered through accumulation. Learn enough. Study hard enough. And eventually, you will know enough to be right. That framework works well in certain domains. For example, take a stop sign. If we don't collectively agree to stop, people get hurt. The exact duration of the stop can be debated, but the rule itself must be shared, enforced, and largely unquestioned for society to function. In those cases, certainty isn't just helpful. It's necessary.

But once we move beyond basic coordination, certainty becomes harder to justify and easier to misuse. Questions about health, politics, morality, identity, or meaning don't submit to the same kind of clean resolution. They are shaped by experience, environment, incentives, and the information ecosystems we inhabit, most of which we didn't choose. Yet we often hold our personal conclusions in these areas as if they carry the same weight as collectively agreed-upon rules. We defend them with the same confidence and moral force, as though disagreement can only mean ignorance, irrationality, or bad faith.

That assumption starts to break down when you spend time with the ideas in How Minds Change. One of the book’s most useful contributions is not telling us what to think, but showing us how much of what we think is formed before we ever become aware of it. The world we experience isn't a direct, one-to-one copy of reality. It's a working model built by the brain, shaped by past experiences and used to make sense of incomplete information. Our expectations quietly guide what we notice and how we interpret it, long before we form conscious beliefs about what we're seeing.

The example that makes this hard to ignore is the image known as “the dress,” where some people saw white and gold while others saw black and blue. What made the moment unsettling wasn't simply that people disagreed. It was that each side felt certain. There was no sense of ambiguity. No awareness that interpretation was happening at all. Just the strong impression that the truth was obvious, and that anyone who saw it differently must be mistaken. The same image entered everyone’s eyes, yet different brains arrived at different conclusions based on past experience with lighting conditions, all without conscious awareness. What felt like objective perception was, in reality, interpretation doing its work quietly and convincingly.

Follow-up research showed that people who spent more time under artificial lighting were more likely to see the dress as black and blue, while those more accustomed to natural daylight were more likely to see it as white and gold. The brain wasn't simply receiving information. It was adjusting what it saw based on what it expected the lighting to be, filling in gaps automatically. No one experienced this as guessing. The uncertainty never reached conscious thought. By the time awareness showed up, the conclusion already felt settled.

This example matters because it reveals something uncomfortable. If our brains routinely resolve uncertainty for us without letting us know, then confidence isn't evidence that we are right. It's often just evidence that our mind has landed on an answer and moved on. Once that happens, we stop looking. We stop asking what else might be true. We stop noticing the assumptions built into our conclusions. What feels like clarity is often just the relief of having an answer.

This is where ideas like these are often misunderstood and reduced to the phrase “everyone has their own reality.” On the surface, it sounds compassionate. But taken too far, it becomes a way to excuse intellectual laziness or justify disruption without responsibility. Societies can't function without shared structures. We can't all decide independently what a red light means. We can't coordinate complex systems if every disagreement is treated as equally valid simply because it feels sincere. Subjectivity explains why people disagree. It doesn't erase the consequences of what we choose to challenge or uphold.

The more honest position lives in the tension between these two truths. We need shared agreements in order to live together, and we also need humility about how our personal convictions were formed. Bias itself is not the problem. Bias is unavoidable. It's how we orient ourselves in a complex world. The problem is forgetting that our bias is doing that work at all, and then mistaking that orientation for absolute truth.

This is where the ideas in the book become unintentionally revealing. A framework designed to show how flexible belief can be may quietly create the illusion that once you understand it, you have somehow stepped outside of it. When we learn how beliefs form, it becomes tempting to think those forces explain other people more than they explain us. But recognizing a tendency doesn't remove it. Understanding how beliefs take shape doesn't place us above the process. In some cases, it simply gives us better language to defend what we already believe.

That doesn't make the framework useless. It makes it human. The moment we try to explain anything, we adopt a perspective. And perspective always carries authority. This isn't a personal failure. It's structural. But noticing it matters, because it prevents insight from quietly hardening into another unquestioned certainty.

What all of this leaves us with isn't the idea that nothing is true, but the responsibility to examine how we arrived at what we believe. You're allowed to have opinions. You're allowed to believe some explanations are better than others. You're allowed to act on those beliefs and advocate for them. What you can't do, if you take this seriously, is pretend that your certainty arrived untouched. There is a difference between a belief shaped through lived experience and reflection, and a belief held simply because it was repeated often enough or delivered by an authority and never questioned. A belief may still be useful, but usefulness is not the same thing as understanding.

The harder work isn't eliminating bias. It's staying curious about it. Asking why a belief feels non-negotiable. Asking what would feel threatened if it turned out to be incomplete. Asking whether a disagreement is really about facts, or about identity, safety, and the need to feel oriented in the world. Some disagreements should be challenged because they undermine shared structure or cause harm. Others should be tolerated because they represent different ways of making sense of the same uncertainty. Knowing the difference requires more than information. It requires restraint.

The goal isn't to live without bias. That is impossible. The goal is to hold our bias lightly enough that it doesn't harden into dogma, and firmly enough that it still allows us to act. Certainty will always be tempting because it makes life feel manageable. But if there is one thing worth carrying forward, it's this: feeling right is not the same as being right, and the moment our certainty feels most obvious is often the moment it deserves the most scrutiny.
