Social media platforms to remove harmful content, add safeguards under S'pore's proposed rules

SINGAPORE - Social media platforms like Facebook, TikTok and Twitter will soon be legally required to implement safety standards and content moderation processes to minimise users' risk of exposure to harmful online content like terrorist propaganda, under Singapore's new set of proposed Internet rules.

They will also need to ensure additional safeguards for users who are under 18 years old, including tools to help them or their parents minimise their exposure to inappropriate content such as sexual or violent videos, and unwanted interactions like online stalking and harassment.

Minister for Communications and Information Josephine Teo announced some details of the proposed new rules in a Facebook post on Monday (June 20).

"There is a growing global movement pushing to enhance online safety, recognising harms come along with the good when people engage on social media," she said.

"Many countries have enacted or are in the process of enacting laws to protect users against online harms."

Mrs Teo said Singapore's preferred way of strengthening its online regulatory regime is to do so in a consultative and collaborative manner.

"This means learning from other countries' experiences, engaging tech companies on the latest tech developments and innovations, and understanding our people's needs.

"These will allow us to develop requirements that are technologically feasible, can be effectively enforced and that are fit for our purpose."

During a media briefing on Monday, the Ministry of Communications and Information (MCI) said it has been conducting consultations with the tech industry since earlier this month, and public consultations will begin next month.

The new Code of Practice for Online Safety and the Content Code for Social Media Services are aimed at codifying these standards in law and giving the authorities powers to take action against platforms that fail to meet the requirements.

The codes are expected to be added to the Broadcasting Act following the consultations.

If the codes are passed, the Infocomm Media Development Authority (IMDA) will be empowered to direct social media services to disable access to harmful online content for Singapore users.

Examples of content that could be blocked under the new codes include live-stream videos of mass shootings and viral social media “challenges” that encourage young people to perform dangerous stunts like holding their breath until they pass out.

The codes will also take into account Singapore’s unique context and sensitive issues like race and religion.

For instance, they could cover incidents similar to a previous case in which a man was charged with stoking racial tensions after he posted racially offensive tweets under the persona of a Chinese woman using the pseudonym Sharon Liew.

Other examples cited by the MCI included a 2020 post by a person using a profile called “NUS Atheist Society”, which depicted the Bible and the Quran in an offensive manner, and a 2021 poll asking people to rank local female Muslim religious teachers by their sexual attractiveness.

Platforms will also be required to produce annual accountability reports to be published on the IMDA website.

These reports will need to include metrics to show the effectiveness of their systems and processes.

Asked what other consequences errant platforms could face, an MCI spokesman said it is too early to give details as the specifics are still being developed in collaboration with the tech industry.

The codes were first mentioned in March during the debate on MCI's budget.

Mrs Teo told Parliament the codes will focus on three areas: child safety, user reporting and platform accountability.

She also said MCI is working with the Ministry of Home Affairs to provide Singaporeans with more protection from illegal activities carried out online.

This includes strengthening Singapore's laws to deal with illegal online content such as terrorist materials, child pornography, scams and content that incites violence.


A safer internet

User safety

• Have community standards and content moderation mechanisms to mitigate users’ exposure to sexual, violent and self-harm-related content.

• Provide tools for users to reduce their exposure to such content.

• Proactively detect and remove terrorism-related content and child sexual exploitation and abuse material.

Reporting

• Allow users to report harmful content and unwanted interactions.

• Assess the reports and take appropriate action in a timely manner.

• Ensure reporting channels are permanently available, and easy to access and use.

Accountability

• Produce an annual accountability report to be published on the Infocomm Media Development Authority’s website.

• Include metrics in the reports to show the effectiveness of the systems and processes in place for users in Singapore.
