Most support MCI's proposed online harm rules, seek transparency on handling of flagged content


SINGAPORE - Online users here want more transparency on why potentially harmful content flagged to social media services is sometimes left online, with no update on the decision taken or information on whether there is an appeal process.

This was a key finding from a public consultation exercise involving 600 parents, young people, industry groups and other respondents, held between July 13 and Aug 10.

Most respondents supported the Ministry of Communications and Information's (MCI) proposed measures to enhance online safety for social media users here, especially young people.

Under the proposed Code of Practice for Online Safety, online platforms will be required to put in place additional safeguards to protect young users from harmful online content. 

The second proposed code, Content Code for Social Media Services, will also give the Infocomm Media Development Authority (IMDA) powers to direct any social media platform to disable access to certain content for users in Singapore should such content slip through the cracks.

In findings published on Thursday, respondents mostly agreed with the proposal for social media platforms to reduce exposure to harmful online content for local users.

Many respondents expressed concerns over the same online harms that are the focus of MCI's measures, such as cyber bullying and explicit sexual content.

MCI said: "Parents expressed concern over viral social media content featuring dangerous pranks and challenges that could be copied by their children."

For example, a 10-year-old Italian girl died in 2021 after taking part in an online "blackout challenge" that encouraged users to choke themselves until they pass out.

Other respondents said they were concerned about harmful advertisements, online gaming, scams, misinformation and online impersonation.

Penalties should be imposed on services that do not comply, said some respondents, while others sought assurances the measures would not affect user privacy or freedom of expression.

MCI said: "We will need to find the right balance between prioritising user safety and managing privacy and freedom of expression."

It will continue to study other issues raised, even as the current focus is on tackling harmful online content that appears on social media services, it added.

Many respondents were not aware of existing safety features on social media services, while some parents said they lacked the knowledge to guide their children to use social media services safely.

They urged platforms to raise user awareness and usage of safety features on their services.

In some cases, content reported to social media platforms was not dealt with and left online, said respondents who shared their experiences.

They supported MCI's proposal for social media services to release annual reports on the effectiveness of their content moderation policies and practices in tackling harmful content, which was seen as a way to hold the firms accountable.

MCI added that social media platforms should make their content-flagging mechanisms accessible and effective for users.

Industry groups sought clarification on which social media services will fall under the proposed rules and how these firms are defined.

A firm's size and business model should be considered in implementing the proposed rules, they said, adding that they hoped services designed primarily for enterprise use can be excluded.

MCI agreed and said services will be given flexibility to develop appropriate ways to tackle harmful content, taking into account their unique operating models.

In response to IMDA being granted authority to direct platforms to take down egregious content, industry groups and other respondents said explanations should be provided on why the specific content is harmful.

MCI replied that concerns over egregious content will be made clear to the services when issuing such orders.

Academics suggested that social media platforms share data with the research community to facilitate studies on the extent of harmful content in Singapore, so that community standards can be based on local data and findings.

Other respondents suggested the Government could also set up advisory panels with experts and appointed members of the public, who could raise public feedback and concerns on online issues to social media platforms and the authorities.

MCI said: "We will continue to work closely with stakeholders in the community and industry as well as public sector to equip Singaporeans with the knowledge and skills to keep themselves and their loved ones safe online.

"We also continue to welcome feedback and suggestions to enhance online safety for users in Singapore."
