Misinformation sticks. Erasing ‘fake news’ from your memory is as difficult as getting jam off your fingers after a Devonshire tea.
Once you hammer into people that there are Weapons of Mass Destruction (WMDs) in Iraq, it doesn’t matter that none were found after the country was thoroughly scoured by the invading forces. The constant drumbeat of ‘WMD, WMD, WMD’ in the lead-up to the invasion, followed by innumerable media reports of ‘preliminary tests’ that came back positive for chemical weapons during the early stages of the conflict – but were ultimately never confirmed by more thorough follow-up tests – created a powerful impression that those weapons had been discovered. An impression so powerful that four years after the absence of WMDs became the official US position, 60% of Republicans and 20% of Democrats believed either that the US had found WMDs or that Iraq had them but had hidden the weapons so well that they escaped detection.
Misinformation can stick even when people acknowledge a correction, and know that a piece of information is false.
In a study conducted during the initial stages of the invasion of Iraq, colleagues and I presented participants with specific war-related items from the news media, some of which had subsequently been corrected, and asked for ratings of belief as well as memory for the original information and its correction. We found that US participants who were certain that the information had been retracted continued to believe it to be true.
This ‘I know it’s false but I think it’s true’ behaviour is the signature of the stickiness of misinformation. Misinformation sticks even in situations in which people have no ideological or motivational incentive to stick to their erroneous beliefs.
In the laboratory, the original misinformation shines through in people’s responses to inference questions when they are presented with entirely fictional but plausible scripts about various events. For example, people will act as though a fictitious warehouse fire was due to negligence even if, later in the script, they are told the evidence pointing to negligence turned out to be false.
Is there any way to unstick misinformation?
There is broad agreement in the literature that combating misinformation requires that the correction be accompanied by a causal alternative. Telling people that negligence was not a factor in a warehouse fire is insufficient – but telling them that arson was to blame instead will successfully prevent any future reliance on the negligence idea.
Another way to combat misinformation is to prevent it from sticking in the first place.
An ounce of inoculation turns out to be worth a pound of corrections and causal alternatives. If people are made aware, before the misinformation is presented, that they might be misled, there is evidence that they become resilient to it. This process is variously known as ‘inoculation’ or ‘prebunking’, and it comes in a number of different forms.
At the most general level, an up-front warning may be sufficient to reduce – but not eliminate – subsequent reliance on misinformation.
In one of my studies, led by Ullrich Ecker, we found that telling participants at the outset that ‘the media sometimes does not check facts before publishing information that turns out to be inaccurate’ reduced reliance on misinformation modestly (but significantly) in comparison to a retraction-only condition. A more specific warning, by contrast, which explained that ‘research has shown that people continue to rely on outdated information even when it has been retracted or corrected’, reduced subsequent reliance on misinformation to the same level as was observed with a causal alternative.
A more involved variant of inoculation not only provides an explicit warning of the impending threat of misinformation, but additionally offers a refutation of an anticipated argument that exposes the imminent fallacy. In the same way that a vaccination stimulates the body into generating antibodies by imitating an infection, which can then fight the real disease when an actual infection occurs, psychological inoculation stimulates the generation of counter-arguments that prevent subsequent misinformation from sticking.
The inoculation idea can be illustrated with an example from climate change. Although there is a pervasive scientific consensus – reliant on 150-year-old basic physics and 15,000 modern scientific articles – that the Earth is warming from the burning of fossil fuels, political operatives often seek to undermine that consensus to introduce doubt about those scientific facts in the public’s mind.
John Cook, Ullrich Ecker, and I showed that people can be inoculated against those disinformation efforts by presenting them with:
- a warning that attempts are made to cast doubt on the scientific consensus for political reasons, and
- an explanation that one disinformation technique involves appeals to dissenting ‘fake experts’ to feign a lack of consensus.
We illustrated the ‘fake-expert’ approach by revealing the attempts of the tobacco industry to undermine the medical consensus about the health risks from smoking with advertising claims such as ‘20,679 Physicians say “Luckies are less irritating”’.
Because the fake-expert disinformation strategy had been exposed at the outset, the subsequent misinformation (in this case, the feigned lack of consensus on climate change) was defanged: people’s responses did not differ from those of a control condition that received no misinformation about the consensus. In the absence of inoculation, by contrast, that misinformation had a detrimental effect.
Misinformation sticks and is hard to dislodge.
But we can prevent it from sticking in the first place by alerting people to how they might be misled.
Read more
- John Cook, Stephan Lewandowsky, Ullrich K. H. Ecker. 2017. Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence. PLoS ONE, 12(5): e0175799. Available at: https://goo.gl/VQ5hkC
- Gary C. Jacobson. 2010. Perception, Memory, and Partisan Polarization on the Iraq War. Political Science Quarterly, 125: 31-56. Available at: https://goo.gl/oDmVNW
- Stephan Lewandowsky, Werner G. K. Stritzke, Klaus Oberauer, Michael Morales. 2005. Memory for Fact, Fiction, and Misinformation: The Iraq War 2003. Psychological Science, 16(3): 190-195. Available at: http://journals.sagepub.com/doi/abs/10.1111/j.0956-7976.2005.00802.x
- Stephan Lewandowsky, Ullrich K. H. Ecker, Colleen M. Seifert, Norbert Schwarz, John Cook. 2012. Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3): 106-131. Available at: https://goo.gl/48LSKs
- Sander van der Linden, Edward Maibach, John Cook, Anthony Leiserowitz, Stephan Lewandowsky. 2017. Inoculating against misinformation. Science, 358(6367): 1141-1142. Available at: https://goo.gl/AmHEXJ
Copyright Information
As part of CREST’s commitment to open access research, this text is available under a Creative Commons BY-NC-SA 4.0 licence. Please refer to our Copyright page for full details.