Facebook to take action against users who repeatedly share misinformation
MENLO PARK (REUTERS, BLOOMBERG) - Facebook said on Wednesday (May 26) it would take "stronger" action against people who repeatedly share misinformation on the platform.

Facebook will reduce the distribution in its news feed of all posts from a user account that frequently shares content flagged as false by one of the company's fact-checking partners, the social media giant said in a blog post.

The company already does this for Pages and Groups that post misinformation, but it had not previously extended the same policy to individual users.

Facebook declined to specify how many times a user's posts have to be flagged before the reduced distribution kicks in.

The Menlo Park, California-based company will also start showing users a pop-up message if they click to "like" a Page that routinely shares misinformation, alerting them that fact-checkers have previously flagged that Page's posts.

"This will help people make an informed decision about whether they want to follow the Page," the company said.

False claims and conspiracies have proliferated on social media platforms, including Facebook and Twitter, during the Covid-19 pandemic.

"Whether it's false or misleading content about Covid-19 and vaccines, climate change, elections or other topics, we're making sure fewer people see misinformation on our apps," the company said in a statement.

Earlier this year, Facebook said it took down 1.3 billion fake accounts between October and December, ahead of a United States House Committee on Energy and Commerce hearing on how technology platforms are tackling misinformation.