Pieces of information, disputed and not, can be woven very quickly into competing explanatory narratives. Press the right buttons, and they can be ordered lightning-fast into a line leading to this or that logical conclusion.
Those narratives can cause feuds that won’t quit, and entrenched positions of extreme certitude from which people can’t budge. These days, that seems to be accompanied by a trend not just toward uncivil discourse, but toward reveling in the power of incivility. As though an increase in aggression – even as a first resort – could make our society better.
It’s always been understandable to me when people choose that path. In my early days as an activist, I went there too, ignoring to some extent how hard it was to reconcile that with my deep admiration for the heroes of non-violent action: non-violence in word and deed. Like many wrestling with the same issue in the midst of a great, desperate challenge facing my generation of activists, I found it painful at first to reach for the integrity of other ideals from a vulnerable place.
I still don’t find it easy. Nor am I particularly great at it, although I seem to have less cause for bouts of shame and regret as the decades go by. Experience ultimately came into sync with my aspirations: although far harder than whipping up a frenzy, non-aggression is usually, in the end, more powerful.
Researchers have dropped another relevant study on cognitive bias into the literature: Lewandowsky, Gignac and Oberauer surveyed people in the USA. It’s here in PLOS One.
They discuss a way that science communication can sometimes backfire. When the views of people seen as experts converge on an issue, that convergence strongly influences other people’s thinking. So a strong scientific consensus can generally be convincing to many others, too. They cite climate science as an example: studies cited by Lewandowsky suggest that consensus among scientists may reduce climate change denial.
However, in people who are prone to conspiracist thinking, strong consensus around science can have the reverse effect: it can be seen as evidence that they’re all in cahoots. As happens for some people with vaccination, say. Presenting yet more facts or another study could paradoxically confirm their rejection of science.
The study’s authors describe conspiracist thinking as a cognitive style that doesn’t have to conform to expectations of coherence or consistency: its “explanatory reach” is therefore greater than that of competing scientific theories. Yet it can also provide an explanation of why a consensus is wrong.
Lewandowsky and his colleagues surveyed Americans’ attitudes to two issues where views on science are polarized: climate science and GM foods. Based on their sample, conspiratorial ideation could, they conjecture, be a more consistently explanatory factor in science denialism than people’s educational levels or world views.
Cultural or political world view and conspiracist thinking may be close relatives. They could both be seen as motivated reasoning, according to Lewandowsky: “Motivated reasoning refers to the discounting of information or evidence that challenges one’s prior beliefs accompanied by uncritical acceptance of anything that is attitude-consonant.” (Motivated reasoning was described by Ziva Kunda in 1990 [PDF].)
World view can be associated with some types of science rejection – but not others. Thus, people with a “conservative” political world view could be more likely to reject climate science than “liberals,” but less likely, say, to reject childhood vaccination. And people with a more “conservative” world view who are more highly educated could be more skeptical of climate science than those who have fewer years of education.
How might it be countered? One of the articles they point to on this question is another Lewandowsky paper, about misinformation and countering it. Misinformation, it’s argued there, can be worse than ignorance. When you’re uninformed, you can fall back on heuristics that are less likely to lead you astray than misinformation would be. And it might be easier to acquire information than to wipe away misinformation.
When people have an organized explanatory narrative, they may need a complete, functional narrative to replace it – not just isolated bits of information that break its internal logic. A logical and respectful explanation of how the mistaken belief arose might be useful. And hearing it repeatedly might help. Lots of food for thought and experimental research in science communication. But there is a long way to go before we can be sure about how to stop motivated reasoning.
The photo of participants of the 50th Anniversary of the Global Carbon Dioxide Record Symposium and Celebration in Hawaii 2007 is from the photo library of NOAA (National Oceanic and Atmospheric Administration), via Wikimedia Commons.
Correction: On 15 October, a comment by Robert Starkey alerted me to the fact that I had made an inaccurate assumption that consensus among scientists had reduced climate change denial. I’m grateful for the feedback and the sentence now specifically refers to the findings of three studies cited by Lewandowsky.
Update 21 March 2014: Concerns about potential for defamation actions led to a journal’s removal of an article – by Lewandowsky and colleagues: note, it is not a retraction. It is now available here, with a notice that reads in part: “This article is now posted on a website of the University of Western Australia, which has come to a different assessment of the risk posed by this article and reaffirms its commitment to academic freedom.”
Update 20 February 2016: Title shortened, cartoon updated and mention of a contemporaneous issue that is now obscure removed (a second was removed by rewriting the final sentence on 6 March), added link to Ziva Kunda’s 1990 paper on motivated reasoning.
* The thoughts Hilda Bastian expresses here at Absolutely Maybe are personal, and do not necessarily reflect the views of the National Institutes of Health or the U.S. Department of Health and Human Services.