Short videos explaining the manipulation techniques commonly found in online misinformation may make people less susceptible to it, according to new research.
The study, published in Science Advances, tested whether short videos covering five common logical fallacies and manipulation strategies, such as false dichotomies and emotional language, could improve people’s ability to recognise them and to discern trustworthy from untrustworthy content. When the laboratory trials were replicated in the “real world” through a YouTube ad campaign, viewers’ ability to recognise some of these techniques increased by about 5% on average.
The SMC asked experts to comment on this research.
Dr Jagadish Thaker, Senior Lecturer, University of Auckland, comments:
“We are facing several political and social crises due to the spread of online misinformation. It has impaired our ability to fight the COVID-19 pandemic and has increased social strife. So far, education campaigns about misinformation and social media giants’ efforts to tackle online misinformation have had mixed impact. As a result, it is essential to find new ways of helping people build resilience against misinformation and disinformation that are also easily scalable and cost-effective.
“Past research shows that it is challenging to change people’s minds once they have been exposed to misinformation. Instead, prebunking — inoculating people against the standard techniques used to manipulate information before they encounter them — can be more effective in building public resilience against misinformation.
“Just like a vaccine, inoculation theory argues that we can build resistance to misinformation through prior exposure to weak doses of misinformation. Think of it as building a neural memory, akin to muscle memory, to spot misinformation.
“Using well-designed lab experiments and a real-world experiment on YouTube, researchers at the University of Cambridge and their colleagues found that participants could recognise commonly used manipulation techniques after exposure to prebunking videos. These prebunking videos first warned about an impending misinformation attack, informed the participants about the manipulation technique, and finally provided a funny ‘microdose’ of the misinformation technique.
“They tested five commonly used misinformation techniques: the use of emotionally charged manipulative language to evoke a strong response, incoherence, false dichotomies, scapegoating, and attacking a person instead of discussing ideas.
“Exposure to such inoculation videos also boosted confidence in spotting misinformation techniques. It helped people judge content better and be more careful about sharing it online. The inoculation was effective for everyone, indicating that such a campaign is likely to work regardless of differences in age, gender, political beliefs, social media use, numeracy skills, and other factors. It is an impressive finding.
“Moreover, it cost the researchers as little as $0.05 per view on YouTube, indicating that such campaigns can be a cost-effective policy tool and easily implementable at scale.
“However, we need more research to check how long such misinformation-spotting antibodies last in our minds. Just like new variants of COVID-19 have made vaccines less effective, do new variants of misinformation or disinformation make the previous inoculation less effective?
“All examples used in this research were non-political and fictitious. People may respond differently to misinformation about topics they hold close to their hearts, such as politics, religion, and ‘our way of life’. It is a US-based study, and it is essential to test the findings in other countries, paying attention to the role of state and non-state actors fueling misinformation in society.
“Government and local institutions could learn and apply such communication campaigns to help the public be alert to misinformation. Social media companies should equally share this responsibility in an era of hyper-individualized social media use. As the researchers rightly note, social media giants should share their data with researchers to help develop evidence-based policies on misinformation.”
No conflicts of interest.
Associate Professor Stephen Hill, School of Psychology, Massey University, comments:
“In a series of seven studies, including one that used an ad campaign on YouTube, researchers showed that ‘psychological inoculation’ against common manipulation techniques (such as the use of logical fallacies and emotionally manipulative language) reduced people’s willingness to share untrustworthy social media content and increased their ability to detect it.
“This research continues the excellent work carried out by this team, which has led to the creation of a number of useful resources for understanding and preventing the spread of harmful misinformation. It provides some hope that it may be possible to reverse what many people perceive to be a recent increase in the production and sharing of misinformation. As the 2021 report from Te Mana Whakaatu (the Classification Office) shows, New Zealand is in no way immune to these international trends.
“What the research doesn’t tell us is whether improving people’s ability to detect manipulative techniques will reduce the likelihood that their beliefs will be swayed by (or reinforced by) the untrustworthy content of the message. You’d hope that it would, but other research shows that people are often more critical of the quality of arguments in messages that run contrary to their existing beliefs than in those that align with them. The good news is that if inoculation reduces the sharing of misinformation, there will be less opportunity for other people to be swayed by it. As is often the case with preventive medicine, the challenge will be to persuade people to get inoculated.”
No conflicts of interest.