Tighter control of ads on younger teens’ social media accounts: IMDA


Accounts that belong to users under 18 years old should be kept free of advertisements harmful to their well-being, and platforms must submit online safety reports annually.


SINGAPORE – A new code of practice will require social media platforms to promptly inform users of actions taken on their reports of online harms. This is in response to feedback from users here that they are often left in the dark after submitting their reports.

Advertisements that could have a harmful effect on young users’ mental health should be kept away from them, and platforms must submit online safety reports annually for publication on the Infocomm Media Development Authority’s (IMDA) website.

These requirements, which take effect on Tuesday, are among the dos and don’ts for social media platforms here, under the online safety code of practice that IMDA announced on Monday.

The designated social media services named in the code of practice are Facebook, HardwareZone, Twitter, TikTok, Instagram and YouTube, said IMDA, adding that Singapore is one of the first jurisdictions in the world to introduce laws for platforms to take preventive measures to ensure online safety.

The code of practice sets in stone how popular platforms should operate here, after the Online Safety (Miscellaneous Amendments) Act took effect in February.

The law gives the authorities the power to direct social media platforms to remove online harms such as sexual and violent content, content that promotes cyber bullying, vice, organised crime, suicide or self-harm, and content that may incite racial or religious tensions or endanger public health.

Platforms that fail to comply may face a fine of up to $1 million, or be ordered to have their social media services blocked here.

The regulations come amid a crackdown on online harms across app stores, social media platforms and messaging apps.

Under the code of practice, each platform must establish its own community guidelines that clearly state what content is allowed and not allowed on its services, said IMDA.

These rules should be enforced through effective content moderation, including removing content that violates those guidelines and blocking or banning users who break them.

Using technology and other processes, the social media service must minimise users’ exposure to any content related to child sexual exploitation and abuse, or content promoting terrorism, said IMDA.

Users should also be given tools to manage their own safety, like the option to hide harmful content and unwanted interactions, and limit location sharing and the visibility of their accounts to other users.

Each platform must also create separate community guidelines for younger users, moderate content accordingly, and provide online safety information that they can easily understand.

“Accounts belonging to children must not receive advertisements, promoted content and content recommendations that designated social media services are reasonably aware to be detrimental to children’s physical or mental well-being,” said IMDA.

This could apply to advertisements that involve alcohol or body-modification and weight loss products, The Straits Times understands.

Platforms are also required to include tools that allow children or their parents to manage their safety on these services, and mechanisms for users to report harmful content and unwanted interactions.

Parents and guardians must also be given tools to manage the content that their children can see, the public visibility of their accounts and permissions for who can contact and interact with them.

Users who use high-risk terms related to self-harm or suicide must be actively offered easy-to-understand local safety information, such as safety resources or information on support centres.

Each social media service is expected to have effective and easy-to-use reporting mechanisms to flag harmful content or unwanted interactions.

The platform should take appropriate action on user reports in a timely and diligent manner and inform the users concerned of its decision and any action taken in response to the reports, said IMDA.

The agency did not give a timeframe within which platforms are expected to respond to reports, but told ST in a separate statement that, as a general principle, platforms are required to prioritise user reports based on severity or imminence.

IMDA will also collect annual online safety reports from each platform, which the agency will publish online to help users make an informed choice on which platform is best suited to provide a safe user experience.

The report must include details on what steps the platform has taken to mitigate Singapore users’ exposure to harmful content, how much and what types of harmful content users here encounter on the service, and what actions were taken on user reports.

The first batch of reports from the platforms has to be submitted in the second half of 2024.

“These annual online safety reports will provide information about measures designated social media services have put in place to combat harmful content and how Singapore users’ experience on the services has been,” said IMDA.

Social media services on the list were chosen based on their reach and the impact they have on communities here, and this may be reviewed in the future, said IMDA in response to queries from ST.

IMDA will rely on the platforms’ safety reports and feedback from the public to assess the social media services’ compliance and take action when needed.

“If it is evident that a designated (platform) has a systemic failure to act on user reports about harmful content in a timely and diligent manner, IMDA will engage the service to take steps to rectify the non-compliance or systemic failure,” it said.

Minister for Communications and Information Josephine Teo said in an Instagram post on Monday that social media platforms have to do more to protect young users in particular.

“Any social media user can attest to having seen sexually explicit, violent, or even self-harm content online,” she wrote.

“But laws alone cannot solve the problem. When you come across harmful content, do not ignore or further circulate it,” she added. “Let’s all do our part by reporting it to the social media platform immediately and encouraging others to do the same.”
