NEW YORK (NYTIMES) - Facebook unveiled a series of changes on Tuesday (Sept 17) to limit hate speech and extremism on its site, amid rising scrutiny of how the social network may be radicalising people.
The company began its announcements by saying it would expand its definition of terrorist organisations, adding that it planned to deploy artificial intelligence to better spot and block live videos of shootings.
Hours later, in a letter to the chairman of a House panel, Facebook said it would prevent links from fringe sites 8chan and 4chan from being posted on its platform.
And late in the day, it detailed how it would develop an oversight board of 11 members to review and oversee content decisions.
Facebook, based in Silicon Valley, revealed the changes a day before the Senate Commerce Committee was set to question the company, Google and Twitter on Capitol Hill about how they handle violent content.
On Tuesday, a bipartisan group of congressmen also sent a letter to Twitter, Facebook and YouTube about the presence of international terrorist organisations on the sites and how those groups foment hate.
Some experts who study extremism online welcomed Facebook's expanded effort, especially the broader definition of terrorism. But they emphasised that the plan's effectiveness would depend on the details - where Facebook draws the line in practice and how the company reports on its own work.
In the last two years, the company said, it has detected and deleted 99 per cent of extremist posts - about 26 million pieces of content - before they were reported to it.
Facebook said it would now consider people and organisations that engage in attempts at violence toward civilians to be terrorists, rather than defining terrorism solely as violent acts intended to achieve political or ideological goals, as it had before.
In a letter Tuesday to Rep. Max Rose of New York, chairman of the subcommittee on intelligence and counterterrorism of the House Committee on Homeland Security, Facebook also said it was "blocking links to places on 8chan and 4chan that are dedicated to the distribution of vile content."
Facebook has additionally been developing the oversight board internally for more than a year.
Members will oversee and interpret how Facebook's existing community standards are enforced by its content moderators; they will be able to instruct Facebook to allow or remove content, and will be asked to uphold or reverse decisions on content removals.