The Boston Globe recently published an essay on how people’s reluctance to adjust their beliefs in light of new information affects democratic political discourse.
Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.
Actually, the link between thinking like a scientist and democracy is quite strong. After all, an election is a type of measurement. Amid all the complexities of the real world, we are asked to figure out what is actually true and make sound decisions that will affect everyone in our communities. But what we discover is that we aren’t very good at doing this. We often reach the wrong conclusions because our mental wiring isn’t up to the task, and too many of us are unfamiliar with the tools for checking whether our reasoning has actually arrived at the right answers.
Traditional education is not a good solution to this problem; simply being told what’s true is ineffective. Instead, a certain kind of intellectual integrity needs to become part of our culture. Thinking like a scientist needs to be a value for everyone who votes, not just for scientists.