The Straits Times says

Facing up to duty of care for content

Recent disclosures about how Facebook goes about its content moderation duties make a strong case for better oversight of platforms that host public discourse. Broadly, the claim is that the social media giant has placed its own growth objectives ahead of its responsibility to protect users from misinformation, hate speech and other harmful content, because such posts grab attention and keep viewers engaged, translating into higher profits. Concerns that these practices are undermining the well-being of individuals and polarising societies emerged from tens of thousands of leaked company documents and the testimony of a whistle-blower.

Facebook does not appear to treat all of its nearly three billion active users across the world equally. One leaked document showed that last year, the company allocated 87 per cent of its budget for developing misinformation detection algorithms to its home country, the United States, and just 13 per cent to the rest of the world. While multinationals routinely operate differently in different markets, such decisions have had consequences. In Myanmar, where Facebook is the primary source of news for millions, the lack of active moderation left false posts circulating unchecked. It is feared that this may have helped fan the flames of the Feb 1 coup.
