Don’t worry: I’m sure they’ll have a pill to treat your “motivated reasoning” disorder soon.
Like many in the USA, perhaps especially those of us not allowed to go to work right now or with reason to feel anxious about the future, I'm particularly preoccupied at the moment by these issues, and by the various impediments to clear, community-spirited thinking.
And then researchers serendipitously dropped another relevant study on cognitive bias into the literature. Lewandowsky, Gignac and Oberauer conducted a study of people in the USA. It’s here in PLOS One.
They discuss a way that science communication can sometimes backfire. When the views of people seen as experts converge on an issue, that convergence strongly influences other people's thinking: a strong scientific consensus is generally convincing to many others, too. Climate science is an example, where growing consensus among scientists reduced the influence of climate change denial.
However, in people who are prone to conspiracist thinking, a strong scientific consensus can have the reverse effect: it can be read as evidence that the experts are all in cahoots, as happens with vaccination for some people. Presenting yet more facts, or another study, can paradoxically confirm their rejection of the science.
The study's authors describe conspiracist thinking as a cognitive style that doesn't have to conform to expectations of coherence or consistency: its "explanatory reach" is therefore greater than that of competing scientific theories. And it can also provide an explanation of why a consensus is wrong.