LONDON (BLOOMBERG) - British Home Secretary Amber Rudd warned Facebook, Google, and Twitter to improve monitoring of extremist and hate content after a panel of lawmakers urged her to consider making the hosting of such material a crime.
A report from Parliament's cross-party Home Affairs Committee, to be published on Monday (May 1), said the companies are "shamefully far" from having done enough to deal with illegal and dangerous content and was scornful of claims that there is little more they can do. The panel also said that it is "shocking" that Google's YouTube subsidiary allows paid advertising to appear alongside videos created by terrorist groups.
The report comes days after the British government protested a decision by Twitter to stop letting security officials track terrorist-related posts. While Ms Rudd stopped short of agreeing with the committee's recommendation to look at legislation, lawmakers are becoming more confrontational with Internet giants.
"We have made it very clear that we will not tolerate the Internet being used as a place for terrorists to promote their vile views, or use social media platforms to weaponize the most vulnerable people in our communities," Ms Rudd said in an e-mailed statement.
"We will continue to push the Internet companies to make sure they deliver on their commitments."
The committee's inquiry was sparked by the murder of House of Commons lawmaker Jo Cox during last year's referendum campaign on leaving the European Union. It examined online material and considered whether it might be encouraging hate crime.
Among its conclusions were that YouTube is "awash" with racist material. It said the company refused to remove a video by David Duke, a former grand wizard of the Ku Klux Klan, titled "Jews admit organizing White Genocide", saying it "did not cross the line into hate speech".
Twitter hosted "significant numbers of racist and dehumanising tweets that were plainly intended to stir up hatred," the panel said. The company removed many of them after the lawmakers complained, but it refused to delete a cartoon showing "ethnic minority migrants tying up and abusing a semi-naked white woman, while stabbing her baby to death", because it didn't breach its "hateful conduct policy", the report said.
Facebook had community pages "devoted to stirring up hatred, particularly against Jews and Muslims", the panel found. The company removed some posts that the committee highlighted, but said the pages themselves did not breach its rules.
The report criticises the companies for relying on others to report offensive material, "outsourcing the vast bulk of their safeguarding responsibilities at zero expense". It noted that soccer teams are required to pay the costs of policing matches, and urged the government to consult on whether Internet companies should have to pay the cost of policing their sites.
"The major social media companies are big enough, rich enough and clever enough to sort this problem out - as they have proved they can do in relation to advertising or copyright," the report concluded. "It is shameful that they have failed to use the same ingenuity to protect public safety and abide by the law as they have to protect their own income."
Facebook said it has "quick and easy" ways for users to report content and "nothing is more important" to the company than people's safety.
"We agree with the committee that there is more we can do to disrupt people wanting to spread hate and extremism online," Mr Simon Milner, Facebook's director of policy, said in an e-mail.
"We are working closely with partners, including experts at King's College London and at the Institute for Strategic Dialogue, to help us improve the effectiveness of our approach."
Alphabet and Twitter didn't immediately respond to e-mails, sent outside normal business hours, requesting comment on the report.