Facebook intercepts some graphic content; internal guidelines on sex and violence revealed


Facebook is using software to intercept some graphic content before it gets on the site, The Guardian said in a report on Sunday (May 21).

The British paper said it saw more than 100 internal training manuals, spreadsheets and flowcharts that Facebook has used to "moderate issues such as violence, hate speech, terrorism, pornography, racism and self-harm".

Facebook told The Guardian that while it was using software to intercept some graphic content, "we want people to be able to discuss global and current events... so the context in which a violent image is shared sometimes matters".

For example, photos of animal abuse can be shared because they can raise awareness, and allow people to condemn the abuse, but "some extremely disturbing imagery may be marked as disturbing".

The social media giant's guidelines also say that Facebook allows people to live-stream attempts to self-harm because it "doesn't want to censor or punish people in distress".

This month, an Austin musician doused himself in kerosene and set himself on fire on Facebook Live. Reports said this was an attempt to get back at his ex-girlfriend, whom he was convicted of abusing.

The musician died, and while Facebook has taken down the video, it is still circulating on the Internet.

Facebook was criticised for taking two weeks to remove a video that showed a 12-year-old girl live-streaming her suicide in January. In April, it took about 24 hours for it to remove videos of a father killing his child in Thailand.

But Facebook has also sparked an outcry by removing an image: an iconic Vietnam War photo was taken down because the girl in it was naked.

Facebook has since updated its rules to make exceptions for newsworthy content.

Graphic content is a headache for the technology company, which some argue has become one of the largest media companies in the world yet has little control over what is published on its platform.

This month, Facebook announced plans to nearly double the number of workers tasked with monitoring Facebook Live videos. The company said the boost aims to stop violent live streams before they're able to go viral, the Washington Post reported.

But this may not be enough to comb through the network's vast mountain of content.

"Facebook cannot keep control of its content," one source told The Guardian. "It has grown too big, too quickly."

Its user base of nearly 2 billion also means that it is hard to find consensus on content guidelines.

The rules on violence, for instance, allow threats like "I'm going to kill you" or "F*** off and die", which Facebook deems not credible but sees as "a violent expression of dislike and frustration".

Remarks such as "To snap a b***h's neck, make sure to apply all your pressure to the middle of her throat" are also allowed, but not "Someone shoot Trump".

The latter will be deleted because as a head of state, US President Donald Trump is in a protected category.

The guidelines added: "People commonly express disdain or disagreement by threatening or calling for violence in generally facetious and unserious ways."

Violent language is not considered credible until there is "reasonable ground" to believe there is "a transition to a plot or design".

Facebook's guidelines acknowledged that "not all disagreeable or disturbing content violates our community standards".

Ms Monika Bickert, Facebook's head of global policy management, told The Guardian that Facebook was "a new kind of company. It's not a traditional technology company. It's not a traditional media company. We build technology, and we feel responsible for how it's used. We don't write the news that people read on the platform."

Other rules and guidance given to moderators, as highlighted by The Guardian:

Videos of violent deaths, while marked as disturbing, do not always have to be deleted because they can help create awareness of issues such as mental illness.

Some photos of non-sexual physical abuse and bullying of children do not have to be deleted unless there is a sadistic or celebratory element.

All "handmade" art showing nudity and sexual activity is allowed but digitally made art showing sexual activity is not.

Videos of abortions are allowed, as long as there is no nudity.

Anyone with more than 100,000 followers on a social media platform is designated a public figure, which denies them the full protections given to private individuals.
