Facebook reiterates safety commitment

Facebook said it has beefed up its internal safety and security team globally. PHOTO: REUTERS

Facebook sought to reassure the public of its commitment to keep its users safe yesterday in the wake of the ban and suspension of outgoing United States President Donald Trump's social media accounts by big tech firms.

This comes after the storming of the US Capitol in Washington last week by rioters, which left at least five people dead.

In a press conference here, Facebook reiterated it has beefed up its internal safety and security team globally and is working with external partners on policies to determine if content that violates its community standards poses a risk of real-world harm.

"We are committed to keep our community of users safe. For our three billion users around the world, creating a safe environment is really core to our business model," said Ms Clara Koh, Facebook's head of public policy for Singapore and Asean.

She said the social media firm invested over US$3.7 billion (S$4.9 billion) in safety and security in 2019. The firm made a US$7.8 billion profit in the third quarter of last year.

Since 2016, the company has tripled its hiring of staff in this area to over 35,000 globally, of which 15,000 are content reviewers hired for their language abilities and awareness of cultural nuances. These include moderators in Singapore who cover the major languages here, though Facebook declined to give a local breakdown.

"If people don't feel safe when using our services, then they're going to stop using the services," Ms Koh said, adding that Facebook is continuing to invest in safety.

The firm also has tools that, for instance, alert people to where they can get safety information when they search for topics linked to harmful behaviour.

Her comments came after Mr Trump was cut off from several social media platforms for stirring unrest that led to the attack on the Capitol.

Asked where it would draw the line on content that could incite violence, Facebook reiterated yesterday that its community standards for creating a safe environment for users apply to everyone on the platform, including political leaders. In Mr Trump's case, Facebook decided there was a risk of public harm and banned his account.

What constitutes a risk of imminent physical harm is determined by Facebook's specialist intelligence teams that have a background in security, and external partners, including safety experts and community organisations. They also guide the firm on the action it takes against content that poses such a risk.

Facebook said it does this even as it tries to ensure that speech is not censored and that people can hear from their political leaders, while safeguarding the safety and well-being of the broader community.

Two partners Facebook works with here are Touch Community Services, which helps the elderly with privacy and security, and Samaritans of Singapore, which provides emotional support.

A version of this article appeared in the print edition of The Straits Times on January 16, 2021, with the headline "Facebook reiterates safety commitment".