In the current polarized climate, it’s easy to find yourself in the midst of a political disagreement that morphs into a religious argument.
The seeming futility of arguing about politics and religion may arise partly because people misunderstand the nature of these beliefs. Many people approach an ideological disagreement the same way they would a disagreement about facts.
If you disagree with someone about when water freezes, facts are convincing. It’s easy to think that if you disagree with someone about immigration, facts will be similarly persuasive.
This might work if people’s ideological beliefs worked the same way as their factual beliefs – but they don’t. As psychologists who focus on religious and moral cognition, my colleagues and I are investigating how people understand that these are two separate classes of belief.
Our work suggests that an effective strategy for disagreement involves approaching ideological beliefs as a combination of fact and opinion.
Identifying a difference
To investigate whether people distinguish between facts and religious beliefs, my colleagues and I examined a database containing more than 520 million words from speeches, novels, newspapers and other sources.
Religious statements were typically preceded by the phrase “believe that” rather than “think that”.
Phrases like “I believe that Jesus turned water into wine” were relatively common, whereas phrases like “I think that Jesus turned water into wine” were nearly nonexistent.
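The kind of corpus query behind this finding can be illustrated with a minimal sketch. The miniature corpus, counts and function name below are hypothetical stand-ins for the actual 520-million-word database of speeches, novels and newspapers; only the idea of comparing "believe that" with "think that" frequencies comes from the study.

```python
import re

# Hypothetical mini-corpus standing in for the 520-million-word database.
corpus = [
    "I believe that Jesus turned water into wine.",
    "I think that water freezes at zero degrees.",
    "She believes that God can hear prayers.",
    "He thinks that the train leaves at noon.",
    "They believe that miracles happen.",
]

def phrase_counts(sentences):
    """Count occurrences of 'believe(s) that' vs. 'think(s) that'."""
    counts = {"believe that": 0, "think that": 0}
    for sentence in sentences:
        text = sentence.lower()
        counts["believe that"] += len(re.findall(r"\bbelieves? that\b", text))
        counts["think that"] += len(re.findall(r"\bthinks? that\b", text))
    return counts

print(phrase_counts(corpus))  # {'believe that': 3, 'think that': 2}
```

In the real analysis, the interesting pattern was which kind of claim each phrase introduced: "believe that" preceded religious statements far more often than "think that" did.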
In four subsequent experiments, we asked adults to complete sentences like “Zane __ that Jesus turned water into wine.” Participants were more likely to use “believes” for religious and political claims and “thinks” for factual claims.
Taken together, these results suggest that people distinguish between factual beliefs, on the one hand, and religious and political claims, on the other.
Rather than equating ideologies and facts, people appear to view ideologies as a combination of fact and opinion. In two earlier studies, 5- to 10-year-old children and adults learned about pairs of characters who disagreed about religious, factual and opinion-based statements.
For example, we told participants that one person thought that God could hear prayers while the other didn’t, or that two other people disagreed about whether or not blue is the prettiest color.
Participants said that only one person could be right nearly every time they heard a factual disagreement, but they gave this answer less often when they heard a religious disagreement and less often still when they heard an opinion-based disagreement.
This result may occur because children and adults think that different types of beliefs provide different information. Participants told us that factual claims reveal information about the world, whereas opinions reveal information about the speaker.
They also reported that religious claims reveal a moderate amount of information about both the world and the speaker. People who say that
God exists are ostensibly making a claim about what kinds of beings exist in the world – but not everyone would agree with that claim, so they are also revealing information about themselves.
Recognizing the difference in everyday life
So how can you use our results when a contentious topic arises outside the lab?
When you find yourself in the midst of an ideological disagreement, it can be tempting to correct the other person’s facts.
Yet facts alone are often insufficient to resolve ideological disagreements. Correcting facts addresses the part of an ideological belief that is like a fact – the part where someone is trying to communicate information about the world.
But it misses the part where ideological beliefs are also like opinions. Without addressing that part, saying, “Actually, evidence shows that X” sounds a lot like saying, “Actually, evidence proves that blue is not the prettiest color.”
To be convincing, you need tools that address both the fact part and the opinion part of an ideology.
People rarely change their opinions because someone out-argued them. Rather, opinion-based change can come from exposure. People like the familiar, even when that familiarity comes from the briefest of prior exposures – a tendency psychologists call the mere exposure effect.
The same could occur for viewpoints that they’ve heard before.
What does exposure look like when talking about ideological disagreements?
“Hmm. I actually think something different.”

“I really appreciated the way my science tutor was patient with me when I didn’t understand evolution. The way she explained things made a lot of sense to me after a while.”

“I’m going to donate money to groups helping asylum seekers. Do you want to join me?”
Maybe you say just one of these sentences, but others pick up where you left off. By walking around in the world, someone might encounter numerous counterpoints to their opinions, perhaps leading to gradual change as other views become more familiar.
It’s not anyone’s responsibility to say these sentences, least of all people who are being harmed by the disagreement.
But for those in a position to change minds via repeated exposure, this strategy can be a helpful addition to the “managing disagreement” toolboxes everyone carries.