Google finds 'inoculating' people against misinformation helps blunt its power


SAN FRANCISCO (NYTIMES) - In the fight against online misinformation, falsehoods have key advantages: They crop up fast and spread at the speed of electrons, and there is a lag period before fact-checkers can debunk them.

So researchers at Google, the University of Cambridge and the University of Bristol tested a different approach that tries to undermine misinformation before people see it. They call it "pre-bunking".

The researchers found that psychologically "inoculating" internet users against lies and conspiracy theories - by pre-emptively showing them videos about the tactics behind misinformation - made people more sceptical of falsehoods afterwards, according to an academic paper published in the journal Science Advances on Wednesday (Aug 24).

But effective educational tools still may not be enough to reach people with hardened political beliefs, the researchers found.

Since Russia spread disinformation on Facebook during the 2016 US presidential election, major technology companies have struggled to balance concerns about censorship with fighting online lies and conspiracy theories.

Despite an array of attempts by the companies to address the problem, it is still largely up to users to differentiate between fact and fiction.

The strategies and tools being deployed during the midterm vote in the United States this year by Facebook, TikTok and other companies often resemble tactics developed to deal with misinformation in past elections: partnerships with fact-checking groups, warning labels, portals with vetted explainers, as well as post removals and user bans.

Social media platforms have made attempts to pre-bunk before, though those efforts have done little to slow the spread of false information. Most have also not been as detailed - or as entertaining - as the videos used in the studies by the researchers.

Twitter said this month it would try to "enable healthy civic conversation" during the midterm elections in part by reviving pop-up warnings, which it used during the 2020 election.

Warnings, written in multiple languages, will appear as prompts placed atop users' feeds and in searches for certain topics.

The new paper details seven experiments with almost 30,000 total participants. The researchers bought YouTube ad space to show users in the United States 90-second animated videos aiming to teach them about propaganda tropes and manipulation techniques. One million adults watched one of the ads for 30 seconds or longer.

The users were taught about tactics such as scapegoating and deliberate incoherence, or the use of conflicting explanations to assert that something is true, so that they could spot lies.

Researchers tested some participants within 24 hours of seeing a pre-bunk video, and found a 5 per cent increase in their ability to recognise misinformation techniques.

One video opens with a mournful piano tune and a little girl grasping a teddy bear, as a narrator says "what happens next will make you tear up". Then, the narrator explains that emotional content compels people to pay more attention than they otherwise would, and that fearmongering and appeals to outrage are keys to spreading moral and political ideas on social media.

The video offers examples, such as headlines that describe a "horrific" accident instead of a "serious" one, before reminding viewers that if something they see makes them angry, "someone may be pulling your strings".

Beth Goldberg, one of the paper's authors and the head of research and development at Jigsaw, a technology incubator within Google, said in an interview that pre-bunking leans into people's innate desire to not be duped.

"This is one of the few misinformation interventions that I've seen at least that has worked not just across the conspiratorial spectrum, but across the political spectrum," Goldberg said.

Groups focused on information literacy and fact-checking have employed various pre-bunking strategies, such as a misinformation-identifying curriculum delivered over two weeks of texts, or lists of bullet points with tips such as "identify the author" and "check your biases".

Online games with names like Cranky Uncle, Harmony Square, Troll Factory and Go Viral try to build players' cognitive resistance to bot armies, emotional manipulation, science denial and vaccine falsehoods.

Tech companies, academics and non-governmental organisations fighting misinformation have the disadvantage of never knowing what lie will spread next.

But professor Stephan Lewandowsky of the University of Bristol, a co-author of Wednesday's paper, said propaganda and lies were predictable, nearly always created from the same playbook.

"Fact-checkers can only rebut a fraction of the falsehoods circulating online," Lewandowsky said in a statement. "We need to teach people to recognise the misinformation playbook, so they understand when they are being misled."
