Have you ever noticed that when you present people with facts that contradict their most deeply held beliefs, they always change their minds? Me neither. In fact, people seem to double down on their beliefs in the teeth of overwhelming evidence against them. The reason has to do with the worldview that is perceived to be under threat by the conflicting data.
Creationists, for example, dispute the evidence for evolution in fossils and DNA because they are concerned about secular forces encroaching on religious faith. Antivaxxers believe that big pharma cares only about profit and refuse to look at the evidence that being unvaccinated causes far more death and disease than vaccination does; they insist on a correlation between the number of vaccines administered and the rise in autism diagnoses. Statists believe the state is the only bulwark against chaos and havoc, and when shown how many people states have killed, a toll that, the argument goes, could never have occurred in an anarchist society, they dismiss the point in total disbelief. Many people cannot believe that a government would use false-flag events such as 9/11; presented with claims that World Trade Center Building 7 was brought down by explosives rather than by an office fire, they call the idea complete nonsense and ask why the government would ever do such a thing. People who believe that climate change is man-made have no answer when told that methane is also one of the gases said to cause global warming, that most of it is generated by cattle raised for meat consumption, and that nothing is being done to reduce that consumption, even though doing so would also reverse the clearing of rain forest and so benefit the climate. Obama birthers desperately dissected the president’s long-form birth certificate in search of fraud, claiming the released copy had been photoshopped.
In these examples, proponents’ most deeply held worldviews were perceived to be threatened by skeptics, making facts the enemy to be slain. This power of belief over evidence is the result of two factors: cognitive dissonance and the backfire effect. In the classic 1956 book When Prophecy Fails, psychologist Leon Festinger and his co-authors described what happened to a UFO cult when the mother ship failed to arrive at the appointed time. Instead of admitting error, “members of the group sought frantically to convince the world of their beliefs,” and they made “a series of desperate attempts to erase their rankling dissonance by making prediction after prediction in the hope that one would come true.” Festinger called this cognitive dissonance, the uncomfortable tension that comes from holding two conflicting thoughts simultaneously.
Two social psychologists, Carol Tavris and Elliot Aronson (a former student of Festinger), document in their 2007 book Mistakes Were Made (But Not by Me) thousands of experiments demonstrating how people spin-doctor facts to fit preconceived beliefs in order to reduce dissonance. Their metaphor of the “pyramid of choice” places two individuals side by side at the apex of the pyramid and shows how quickly they diverge and end up at opposite corners of the base as each stakes out a position to defend.
In a series of experiments by Dartmouth College professor Brendan Nyhan and University of Exeter professor Jason Reifler, the researchers identify a related factor they call the backfire effect “in which corrections actually increase misperceptions among the group in question.” Why? “Because it threatens their worldview or self-concept.” For example, subjects were given fake newspaper articles that confirmed widespread misconceptions, such as that there were weapons of mass destruction in Iraq. When subjects were then given a corrective article that WMD were never found, liberals who opposed the war accepted the new article and rejected the old, whereas conservatives who supported the war did the opposite … and more: they reported being even more convinced there were WMD after the correction, arguing that this only proved that Saddam Hussein hid or destroyed them. In fact, Nyhan and Reifler note, among many conservatives “the belief that Iraq possessed WMD immediately before the U.S. invasion persisted long after the Bush administration itself concluded otherwise.”
If corrective facts only make matters worse, what can we do to convince people of the error of their beliefs? From my experience, 1. keep emotions out of the exchange, 2. discuss, don’t attack (no ad hominem and no ad Hitlerum), 3. listen carefully and try to articulate the other position accurately, 4. show respect, 5. acknowledge that you understand why someone might hold that opinion, and 6. try to show how changing facts does not necessarily mean changing worldviews. These strategies may not always work to change people’s minds, but now that the nation has just been put through a political fact-check wringer, they may help reduce unnecessary divisiveness.