MENLO PARK (REUTERS, BLOOMBERG) - Facebook said it removed over three dozen pages spreading misinformation about Covid-19 vaccines, after the White House called on social media firms to tighten controls on pandemic-related falsehoods shared on their platforms.
Companies like YouTube, Twitter and Google have come under fire from the Biden administration over the alarming spread of vaccine misinformation, which is slowing the pace of inoculation in the United States, where many people remain hesitant about being vaccinated.
A recent report from the Center for Countering Digital Hate (CCDH) showed that 12 anti-vaccine accounts were responsible for spreading nearly two-thirds of anti-vaccine misinformation online.
Facebook disputed the methodology behind the report, but said on Wednesday (Aug 18) it removed over three dozen pages, groups and Facebook or Instagram accounts linked to these 12 people for violating its policies.
"We have also imposed penalties on nearly two dozen additional Pages, groups or accounts linked to these 12 people," Facebook said in a blog post titled "How we're taking action against vaccine misinformation superspreaders".
The main false claims the Biden administration is fighting include that the Covid-19 vaccines are ineffective, that they carry microchips and that they hurt women's fertility, a White House official said last month.
Facebook also said it has removed more than 20 million posts on its main social network and photo-sharing app Instagram for violating rules on Covid-19 misinformation since the beginning of the pandemic.
The company said it added information labels to more than 190 million Covid-19-related posts on Facebook that third-party fact-checking partners had rated as false or missing context. The data covers actions taken through June.