Think about the last time you had a discussion with someone with whom you fundamentally disagreed. Maybe the person was misinformed, or just plain ignorant, about a topic you have researched and studied and hold strong convictions about. Take climate change, for example. Many of the most vocal climate change deniers will freely admit they aren’t “experts” before launching into a litany of reasons why the science is wrong.
A classic example of misinformed ignorance posing as expertise is Senator James Inhofe (R-OK), and not just because his “proof” of the global warming hoax was the snowball he brought for Senate show-and-tell. Inhofe has repeatedly maintained that “man-made global warming is the greatest hoax ever perpetrated on the American people.” It’s worth noting that, according to Oil Change International, Inhofe has received over $2 million in donations from the fossil fuel industry. His ignorance is astounding, as is his refusal to hear any data, research, or information that conflicts with his position.
Don’t get me wrong. Scientific skepticism is healthy – necessary, even. It forces scientists to examine claims (their own and those of others) and systematically question all information in search of flaws and fallacies. But deniers like Inhofe vigorously criticize any evidence that substantiates climate change and embrace any argument that refutes it. Presenting actual facts and data that challenge their thinking just makes them dig in their heels and become even more certain of their position.
Skepticism is healthy both for science and society. Denial is irresponsible and dangerous.
There are several related cognitive processes that explain what’s going on here.
Confirmation Bias
Confirmation bias is that unconscious force that nudges us to seek out information that aligns with our existing belief system. Once we have formed a view, we embrace information that confirms that view while ignoring information to the contrary. We pick out the bits of data that confirm our prejudices because it makes us feel good to be “right.” When we want to be right badly enough, we become prisoners of inaccurate assumptions.
The Backfire Effect
A second cousin to confirmation bias is the backfire effect. Not only do we seek out information consistent with our beliefs, but we instinctively and unconsciously protect those beliefs when confronted with information that conflicts with them. It’s a defense mechanism that kicks in when someone presents information – even statistically sound research or data – that disputes our position.
A 2006 study examined why sound evidence fails to correct misperceptions. Subjects read mock news articles about polarizing political issues; each article contained either a misleading claim from a politician or the same claim followed by a correction. People on opposing sides of the political spectrum read the same articles and the same corrections, and when the new evidence threatened their existing beliefs, they doubled down. The corrections backfired: the evidence made them more certain that their original beliefs were correct.
The Dunning-Kruger Effect
The Dunning-Kruger effect is based on the notion that we all have pockets of incompetence, with an inverse correlation between knowledge or skill and confidence. People who are ignorant or unskilled in a particular subject area tend to believe they are much more competent than they are. Bad drivers believe they’re good drivers, cheapskates think they’re generous, and people with no leadership skills think they can rule the world. How hard can it be?
“It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” – attributed to Mark Twain
In Moral Disagreements, Experiences Beat Facts
New research suggests that people respect those they disagree with more when their position comes from personal experience rather than facts and figures. The truth is that we often feel, rather than think, our way toward a particular position or viewpoint. People form opinions based on emotions – such as fear, contempt, and anger – rather than on facts.
A wealth of experiments involving thousands of participants and focused on hot-button issues such as gun control, the economy, and abortion found that arguments expressing relevant personal experiences won out over fact-based strategies. Personal stories in which people share experiences of vulnerability, personal challenges, or suffering are often the most compelling.
However, facts aren’t entirely useless. Often the most productive conversations between people with opposing viewpoints involve a combination of personal experiences and facts. But facts alone don’t win arguments.
The human brain is neither a rational thinking machine nor a feeling machine. It’s a feeling machine that sometimes thinks rationally.