5 types of online harm Singapore aims to tackle


SINGAPORE - Social media platforms will soon be compelled by law to address harms faced by Internet users in Singapore. Here are five types of online harm the Code of Practice for Online Safety and the Content Code for Social Media Services could cover.

1. Violence and terrorism

Social media platforms will need to proactively detect and remove violent videos or other content that promotes violence, such as terrorist propaganda.

One example is an incident in 2019, in which a gunman stormed a mosque in Christchurch, New Zealand, and fired on Muslim worshippers while live-streaming the terrorist attack on Facebook using a helmet-mounted camera. Clips of the footage quickly made their way onto other platforms like Twitter.

In another incident, rioters who stormed the US Capitol last year used social media to organise themselves and amplify their messages.

Similar material that incites violence, as well as content that encourages self-directed violence such as suicide, will also be covered under the new codes of practice.

2. Dangerous viral challenges

Social media "challenges" that encourage dangerous behaviour are another example of harmful online trends that the codes aim to reduce.

Last year, a 10-year-old Italian girl died after taking part in an online "blackout challenge", which encouraged users to choke themselves until they passed out.

Some platforms have taken steps to remove such challenges.

Viral video platform TikTok blocked hashtags and videos related to the "milk crate challenge" last year over concerns that participants could be seriously injured. The trend involved users filming themselves stacking milk crates into a tower and then climbing over them. Many videos showed users falling to the ground while attempting the stunt.

3. Sexual exploitation, abuse and harassment

The codes will also aim to minimise users' risk of exposure to sexual content and abuse, including child pornography. Platforms will be required to detect and remove child sexual exploitation and abuse material, as well as content that promotes sexual violence.

Sexual harassment, online stalking and threats of "revenge porn" - or non-consensual sharing of sexual images - will also be tackled. Social media platforms will be required to ensure users can easily report such unwanted interactions, assess the reports and take appropriate action in a timely manner.

One example of sexual harassment is a poll circulated on social media last year inviting people to rank local female asatizah (Muslim religious teachers) according to their sexual attractiveness.

4. Threats to public health

Content that threatens public health could also run afoul of the new rules.

During the Covid-19 pandemic, conspiracy theories and viral social media posts may have contributed to vaccine hesitancy and encouraged people to ignore measures implemented to control the spread of the virus.

Similar posts could constitute a threat to public health and may be covered under the rules to combat online harm.

5. Threats to racial and religious harmony

The new codes will take Singapore's unique context into account, including sensitive issues like race and religion. Offensive content or incidents that could stoke racial or religious tension will be covered.

One example is a case where a man was charged with stoking racial tensions after he posted racially offensive tweets using the persona of a Chinese woman with the pseudonym Sharon Liew.

Another is a 2020 post by a person using a profile called "NUS Atheist Society" which depicted the Bible and the Quran as alternatives to be used in the event of a toilet paper shortage.
