Facebook emphasises safety commitment in the wake of Trump social media ban

Facebook said it has beefed up its internal safety and security team globally. PHOTO: REUTERS

SINGAPORE - Facebook sought on Friday (Jan 15) to reassure the public of its commitment to keeping its users safe, in the wake of the ban and suspension of outgoing United States President Donald Trump's social media accounts by big tech firms.

This comes after the storming of the Capitol in Washington last week by rioters, which left at least five people dead.

In a media briefing here, Facebook reiterated it has beefed up its internal safety and security team globally and is working with external partners on policies to determine if content that violates its community standards poses a risk of real-world harm.

"We are committed to keep our community of users safe. For our 3 billion users around the world, creating a safe environment is really core to our business model," said Ms Clara Koh, Facebook's head of public policy for Singapore and Asean.

She said the social media firm invested over US$3.7 billion (S$4.9 billion) in safety and security in 2019. The firm made a US$7.8 billion profit in the third quarter of last year.

Since 2016, the company has tripled the number of staff working in this area to over 35,000 globally, of whom 15,000 are content reviewers hired for their language abilities and awareness of cultural nuances. This includes moderators in Singapore who cover the major languages here. Facebook declined to give a local breakdown.

"If people don't feel safe when using our services, then they're going to stop using the services," Ms Koh said, adding that Facebook is continuing to invest in safety. It also has tools that, for instance, alert people about where they can get safety information when they search for topics linked to harmful behaviour.

Her comments came after Mr Trump was cut off from many social media platforms for stirring up the unrest that led to the attack on the Capitol. He has been banned from Twitter, Facebook, Instagram, Twitch and Snapchat. Google's YouTube has suspended his channel temporarily.

Asked where it would draw the line on content that could incite violence, Facebook reiterated on Friday that its community standards for creating a safe environment apply to everyone on the platform, including political leaders. In Mr Trump's case, it decided there was a risk of public harm and banned his account.

What constitutes a risk of imminent physical harm is determined by Facebook's specialist intelligence teams that have a background in security, and external partners Facebook works with, including safety experts and community organisations. They also guide the company on the action it takes against content that poses such a risk.

Facebook said it does this even as it tries to ensure that speech is not censored and that people can hear from their political leaders, while safeguarding the safety and well-being of the broader community.

"The external partners are extremely important," said Ms Koh.

Two partners Facebook works with here are Touch Community Services, which helps the elderly with privacy and security, and Samaritans of Singapore, which provides emotional support.
