I thoroughly enjoy myth-busting. It’s a good way to learn, it potentially saves you time and money, and it’s just plain fun. It’s also a big part of why I love science – the best myth-busting method ever (I’m a science fan-boy).
But it’s not always fun and games.
Correcting misinformation can be incredibly frustrating. And when it comes to topics of science, health, and politics, the stakes can be very high. It’s not just about the lack of public understanding – it can cost lives.
Misinformation has a pesky tendency to stick inside our brains, even after it’s been corrected. And when the misinformation jibes with the way people see the world, trying to correct them can actually backfire, strengthening their misguided beliefs.
Psychologists have been studying these things for a while. Recently, researchers published an article that discusses how misinformation spreads, why it sticks, and the best ways to correct it.
Here are some of my favorite lessons from the article:
Note: this article is sort of “dedicated” to my fellow writers / bloggers out there, so it may be a little longer and more “wordy” than my usual stuff. Just a heads up! Enjoy.
Another note: Misinformation refers to any piece of information that is initially accepted as true, but is later corrected / retracted (according to the article reviewed here).
And another thing: Association for Psychological Science, I know I quoted some of the article’s text, and posted the diagram, but please don’t sue me (besides, it’s fair use: education and/or criticism/commentary – plus I’m sending you web traffic). :)
The journal Psychological Science in the Public Interest recently published a free, open access article titled “Misinformation and Its Correction: Continued Influence and Successful Debiasing” 1 – which can be found here.
Rarely will I do this, but I recommend you actually read the article (if you have time). It’s quite readable for an academic journal article – written in somewhat plain language, with many real-world examples. Plus there’s a nice diagram at the end.
The Weight of the Problem
At the beginning of the article, the authors show examples of misinformation, how it spreads, how difficult it is to correct, and its consequences. As a healthcare professional, here’s my favorite example:
“In the United Kingdom, a 1998 study suggesting a link between a common childhood vaccine and autism generated considerable fear in the general public concerning the safety of the vaccine. The UK Department of Health and several other health organizations immediately pointed to the lack of evidence for such claims and urged parents not to reject the vaccine. The media subsequently widely reported that none of the original claims had been substantiated. Nonetheless, in 2002, between 20% and 25% of the public continued to believe in the vaccine-autism link, and a further 39% to 53% continued to believe there was equal evidence on both sides of the debate. More worryingly still, a substantial number of health professionals continued to believe the unsubstantiated claims. Ultimately, it emerged that the first author of the study had failed to disclose a significant conflict of interest; thereafter, most of the coauthors distanced themselves from the study, the journal officially retracted the article, and the first author was eventually found guilty of misconduct and lost his license to practice medicine.”
And what’s the harm?
“… following the unsubstantiated claims of a vaccination-autism link, many parents decided not to immunize their children, which has had dire consequences for both individuals and societies, including a marked increase in vaccine-preventable disease and hence preventable hospitalizations, deaths, and the unnecessary expenditure of large amounts of money for follow-up research and public-information campaigns aimed at rectifying the situation.”
Pretty powerful stuff.
How does Misinformation Spread, Stick, and Resist Correction?
The article goes in depth to explain where misinformation can come from (rumors, fiction, governments / politicians, vested interests, and the media), and how people process information to determine whether or not it’s true.
Essentially, people ask themselves four questions:
1. Is this information compatible with what I already believe?
2. Does this information make a coherent story?
3. Does it come from a credible source?
4. Do other people believe it?
Most of the time, the odds are in favor of something being accepted as true. There is even some evidence that in order to comprehend something, you have to temporarily accept it as true! (that blew my mind)
When you try to correct misinformation, it usually doesn’t work. First of all, there is a competition between the new information and what has already been accepted as true. Furthermore, if the new information disrupts the coherent story created in people’s minds, it will be resisted. And finally, people just don’t like to be told what to think (even if it’s about something they don’t really care about).
Ideology and personal worldviews make things even worse. If the misinformation is in line with pre-existing beliefs and attitudes, people usually counter-argue any information that may challenge it. This usually just reinforces their reliance on the misinformation. It can get pretty bad too – when people are exposed to scientific evidence that threatens their beliefs, it can lead them to discount the scientific method itself (which is ironic, since the scientific method is designed to reveal information with the least amount of bias possible – it doesn’t care what you believe in, that’s the whole point).
What can we do about it?
The article ends with some great practical advice for correcting misinformation. The authors even provide a great figure to make the solutions more clear. Click on it to enlarge:
Recommendations for myth-busters:
- Correcting misinformation may create gaps in people’s mental models / stories. Fill in those gaps by providing an alternative story that includes the new information.
- Repeat the correction, but in a way that emphasizes the facts, rather than the myth (repeating the myth can strengthen it, so be careful).
- When repeating the myth, provide a warning first (“the following is a myth”) so people have their minds engaged in discounting the myth as they hear it.
- Make the rebuttal as simple and brief as possible – that makes it easier to accept.
- Foster healthy skepticism – people need to learn to use their “bullshit filters” and think critically about information.
When worldview is an issue (and it usually is), try these strategies:
- Present the new information in a way that affirms their worldview. By focusing on the potential benefits and opportunities the correction can bring to their agenda, it may be considered less threatening.
- Encourage self-affirmation (make people feel good about themselves). When a person’s self-esteem is intact, they may realize that myth debunking is an attack on the myth, not on them.
While this information can be very useful to writers and bloggers who are trying to spread the truth, the public needs to be aware of these strategies as well. Why? Because the same strategies can be used to spread misinformation.
Be aware, and think critically.
1. Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and Its Correction: Continued Influence and Successful Debiasing. Psychological Science in the Public Interest, 13(3), 106–131. DOI: 10.1177/1529100612451018