Facebook rolls out new moves to limit extremism

Representatives from Facebook, Twitter and Google told the Senate Commerce Committee on Wednesday that they are investing in new technologies to reduce and remove extremist and violent content faster.

NEW YORK • Facebook has unveiled a series of changes to limit hate speech and extremism on its website, as scrutiny rises on how the social network may be radicalising people.

The company began its announcements on Tuesday by saying it would expand its definition of terrorist organisations, adding that it planned to deploy artificial intelligence to better spot and block live videos of shootings.

Hours later, in a letter to the chairman of a House panel in the United States, Facebook said it would prevent links from fringe sites 8chan and 4chan from being posted on its platform.

And later in the day, it detailed how it would develop an oversight board of 11 members to review and oversee content decisions.

Facebook, based in Silicon Valley, revealed the changes a day before the Senate Commerce Committee will question the company, Google and Twitter on Capitol Hill about how they handle violent content.

On Tuesday, a bipartisan group of congressmen also sent a letter to Twitter, Facebook and YouTube about the presence of international terrorist organisations on the sites and how those groups foment hate.

Some experts who study extremism online welcomed Facebook's expanded effort, especially the broader definition of terrorism. But they emphasised that the plan's effectiveness would depend on the details - where Facebook draws the line in practice and how the company reports on its own work.

In the last two years, the company said, it has detected and deleted 99 per cent of extremist posts - about 26 million pieces of content - before they were reported to it.

Facebook said it would now classify people and organisations that engage in attempts at violence against civilians as terrorists, rather than defining terrorism solely as violent acts intended to achieve political or ideological goals.

In a letter on Tuesday to Representative Max Rose of New York, chairman of the subcommittee on intelligence and counter-terrorism of the House Committee on Homeland Security, Facebook also said it was "blocking links to places on 8chan and 4chan that are dedicated to the distribution of vile content".

Facebook has also been developing the oversight board internally for more than a year.

Its members will oversee and interpret how Facebook's content moderators enforce the company's existing community standards, will be able to instruct Facebook to allow or remove content, and will be asked to uphold or reverse content-removal decisions.

NYTIMES


A version of this article appeared in the print edition of The Straits Times on September 19, 2019, with the headline "Facebook rolls out new moves to limit extremism".