A different 'super spreader': Facebook struggles with election disinformation


SAN FRANCISCO (AFP) - The United States presidential election is finished: votes cast, the transition - though delayed - begun.

But on Facebook, the fight against election misinformation continues, thanks to "super spreaders" - accounts that disseminate rumours and fabrications, falsely spreading the idea that the 2020 election was beset by organised, extensive fraud by the Democratic Party.

The US non-profit Avaaz has identified 25 pages in particular, including those of Mr Donald Trump Jr and Mr Eric Trump - the President's sons - White House press secretary Kayleigh McEnany and combative conservative commentators Dan Bongino, Lou Dobbs and Rush Limbaugh, along with pro-Trump organisations such as Turning Point USA.

These accounts are sowing doubt about President-elect Joe Biden's White House win earlier this month - taking their lead from the building's current resident, who has also taken to social media to tweet that he will not "concede" and to outline his so-far unfounded claims that the election was "stolen".

Unproven allegations of fraud from these accounts have been "liked", commented on and shared more than 77 million times since Nov 3, according to a study from Avaaz.

And that doesn't take into account the Facebook accounts of the "super spreader"-in-chief, Mr Donald Trump himself, nor that of his former adviser Steve Bannon, which was recently removed by the network.

The social media giant has increased efforts to stop the spread of disinformation.

It restricted and in some cases banned the publication of some political ads, highlighted reliable sources of information and tackled foreign manipulation campaigns.

Going viral

Thanks to those measures and others, Facebook was able to avoid a repeat of the 2016 presidential campaign, when organised disinformation campaigns permeated the network ahead of Mr Trump's election.

But these efforts were not enough to stop run-of-the-mill rumour circulation.

"The super spreaders in this list, with the helping hand of Facebook's algorithm, are central to creating this flood of falsehoods that are now defining the political debate for millions across the country," explained Mr Fadi Quran, Avaaz campaign director.

Private Facebook groups have also contributed to the far-reaching spread of misinformation, according to Avaaz.

Such groups - often made up of Trump supporters and others who believe his allegation of a "stolen" vote - have exploded in the aftermath of the election, Avaaz reported, and they can be difficult to monitor and manage.

Facebook on Nov 5 suspended a group called #StopTheSteal, which had attracted some 350,000 members in 48 hours.

"The false rumours about election fraud continue as they being passed through these networks. So it's less big accounts... it is more the millions of people who continue to push this narrative to one another," said Ms Claire Wardle, US director of the First Draft NGO.

Fact-checking

AFP works with Facebook's fact-checking programme in almost 30 countries and nine languages. Around 60 media outfits work worldwide on the programme.

Content rated "false" by fact-checkers is downgraded in news feeds so fewer people will see it.

If someone tries to share a post found to be misleading or false, Facebook presents them with the fact-checked article.

But Facebook has been widely criticised - including by some of its own employees, according to the US publication The Information - for its reluctance to take a tougher stance.

According to an article published on Tuesday (Nov 24), Facebook in 2018 compiled a list of 112,000 government and political candidate accounts to be exempted from fact-checking. The Information said it was unclear whether the list remains in use, and Facebook has not confirmed its existence.

The situation led to an internal outcry in the summer of 2019, The Information reported, with employees calling for an end to the Facebook policy that exempts politicians from the fact-checking programme.

They pointed to an internal study that showed that users were more likely to believe misinformation if it came from a politician.

But Facebook says the study's findings actually support its approach and helped it devise ways to call out politicians who share links or posts that have already been fact-checked.

That method allowed a warning to appear on a video shared by Mr Trump, which showed Los Angeles election workers collecting ballots but which the President claimed showed them stealing the envelopes. The warning explained that the post was "missing context" and that "the same information was checked in another post by independent fact-checkers".

"We don't believe it's appropriate for us to prevent a politician's speech from being subject to public scrutiny," said Facebook spokesman Joe Osborne.
