Like a firefighter, IMDA must be equipped to combat online fires: Josephine Teo


Social media platforms will need safeguards to prevent users, especially children under 18, from accessing harmful content.



SINGAPORE - Singapore’s proposed law to tackle online harm will equip its enforcement authorities to minimise, if not prevent, serious injury and damage to society, much as firefighters do, Communications and Information Minister Josephine Teo told Parliament.

At the start of Tuesday’s debate on the Online Safety (Miscellaneous Amendments) Bill, Mrs Teo likened the Infocomm Media Development Authority (IMDA) to an “online firefighter”.

The Bill, she said, will unambiguously empower the IMDA to issue orders to social media platforms, including Facebook, Instagram, YouTube and TikTok, to take down egregious content. This includes posts advocating suicide, self-harm, child sexual exploitation and terrorism, as well as materials that may incite racial or religious tensions or pose a risk to public health.

Platforms that fail to comply may be fined up to $1 million, or directed to have their social media services blocked in Singapore. Internet service providers such as Singtel, StarHub and M1 may also face fines of up to $500,000 for failing to block the services in question.

The IMDA has been quashing the negative effects of egregious content for some time now, said Mrs Teo, citing the authority’s experience in assessing content across different media platforms and making decisions to protect the community.

She cited a social media post in the early days of the Covid-19 pandemic that suggested that people use the Bible and the Quran instead when supermarkets ran out of toilet paper.

“This post was religiously insensitive, and denigrated two religions in Singapore,” she said, noting that the post was not moderated or removed as global companies’ safety measures did not cater to Singapore’s racial and religious sensitivities.

The IMDA had to step in, and only then was access to the offending post blocked. “Under the Bill, the IMDA will be better equipped to ensure Singapore users are protected from egregious content online,” said Mrs Teo.

When dealing with content that requires the expertise of other agencies, the IMDA will consult them accordingly.

The Bill seeks to amend the Broadcasting Act to regulate providers of online communications services. An accompanying draft Code of Practice for Online Safety, to be imposed on regulated social media platforms, spells out the safeguards needed to prevent users, especially children under 18 years old, from accessing harmful content.

The safeguards include tools that allow children or their parents to manage their safety on these services. Social media firms will also need to provide guidance on content that presents a risk of harm, and tools for users to report harmful content and unwanted interactions.

The code is expected to be rolled out as early as 2023, after passage of the Bill and a final round of consultation with the relevant social media firms.

On Tuesday, Mrs Teo argued for an outcome-driven set of rules rather than an overly prescriptive one, saying that laws are not a silver bullet.

“By stating in the codes the outcomes which regulated services must meet, the IMDA aims to provide sufficient clarity on what the services must do to protect users, while allowing some flexibility for them to adjust their approaches,” she said.

“Given the voluminous user-generated content in today’s evolving online space, it is not efficient to regulate individual pieces of content.”

The code can be updated from time to time as harms evolve, Mrs Teo added.

During the debate, Mr Gerald Giam (Aljunied GRC) and Mr Zhulkarnain Abdul Rahim (Chua Chu Kang GRC) suggested that Singapore specify a takedown timeline of 24 hours, similar to how online harm is tackled in other jurisdictions.

For instance, Germany’s Network Enforcement Act, which took effect in 2018, requires “obviously illegal” content to be taken down within 24 hours after a user complaint is received, but social media platforms have up to seven days to investigate and delete other offending content.

Australia’s Online Safety Act, which took effect in January, also requires bullying content to be taken down within 24 hours.

Mr Zhulkarnain said: “If there is a standard fixed period by legislation, it will lead to a reasonable expectation or standard within the industry for compliance.”

The Bill does not cover private communications such as those that take place on WhatsApp and Facebook Messenger.

Ms Tin Pei Ling (MacPherson) asked why private messaging is not included in the Bill. “There could be instances where objectionable content is shared over private messaging channels,” she said, noting that private messaging is covered by Australia’s Online Safety Act.

The debate continues on Wednesday, with another 12 MPs having indicated that they plan to speak.
