Amid TikTok scrutiny, how effective are parental curbs on social media?

A 2021 survey of parents in Singapore found 36 per cent of their children use TikTok, which has an estimated two million users here. PHOTO: ST FILE

SINGAPORE - The issue of protecting young users of social media apps against potentially harmful content has been thrown into the spotlight as TikTok faces intense scrutiny in the United States over concerns about data privacy and online harm.

Most social media platforms in Singapore are meant for users aged 13 and above, but none has foolproof measures to stop underage users from lying about their age when they register for accounts – most platforms do not verify a user’s age beyond simply asking for a declaration.

A 2021 survey by Milieu Insight of parents in Singapore with children aged between seven and 16 found that half of the children used Instagram and Facebook.

It also found that 36 per cent of the children used TikTok, which has an estimated two million users in Singapore.

Such surveys show that many children under the age of 13 are using these apps. TikTok said it removed some 60 million suspected underage accounts between the first and third quarters of 2022.

The chief executive of TikTok, Singaporean Chew Shou Zi, was grilled in the United States Congress on March 23 as lawmakers sought to ban the short-form video app, which has amassed at least 150 million users in the US and more than one billion globally.

In Congress, US lawmakers questioned the platform’s content moderation policies and showed a collection of TikTok videos that appeared to glorify suicide and self-harm. Mr Chew said TikTok takes a serious view of such content, and later added that his own children do not use the app.

The Straits Times outlines the checks in place on the app and other social media platforms, and the controls available to parents to protect young users.

TikTok

In the version of TikTok available in Singapore, users under 13 are barred from creating an account. However, the app does not require users to prove their age.

TikTok said in March that it works to remove underage accounts that it identifies or that are flagged to the company via its online form.

The app also has family-friendly settings that allow parents to fine-tune their teen’s experience on the platform.

The teen must first accept the parent’s request to be supervised on the app, which gives the adult the power to limit the time the teen spends watching videos.

A parent can also make certain subjects off-limits to a young user by deciding what content, users and hashtags the teen can search for.

These limits can be imposed only if the user agrees to be supervised. 

Users need to be at least 16 years old to create Duet or Stitch content with friends – videos that are combined with another TikToker’s video. To live-stream, users must be at least 18.

Those under 18 also get a 60-minute daily screen time limit, and a weekly recap of their usage is sent to them.

Instagram

As on TikTok, parents are given tools to control their teens’ experience on Instagram, provided their children accept supervision on the mobile app.

Once supervision is activated, parents can monitor their children’s time on Instagram, schedule breaks and see who is following them, including the latest followers.

Parents – or teens – can disable supervision at any time, and the setting is automatically removed once the child turns 18.

The app bars under-13s from signing up, but cannot stop people from lying about their birthdate.

Instagram has rolled out features to protect under-18 users, such as restricting adults from sending them messages if the teens do not follow them.

Meta will also prevent adults who have been flagged for suspicious behaviour from interacting with young users’ accounts, and will not recommend young users’ accounts to such adults.

Snapchat

Famed for its short-form videos and disappearing posts, Snapchat maintains a cult following, especially among teenagers.

The American platform has rolled out a slew of family-targeted settings, including Family Center, which lets parents and caregivers supervise teens who agree to it.

The setting gives parents an overview of the people their teens have contacted, without disclosing the contents of those messages. Parents can also see who their teens are friends with and restrict their access to sensitive content.

Facebook, Twitter, Reddit and Discord

Open to users 13 years old and above in Singapore, these social media and networking platforms do not have parental controls that allow direct supervision of a child’s account.

Facebook, which has close to three billion users, is investing in artificial intelligence that can detect whether someone is a teen or an adult. It is also looking at ways to verify users’ ages.

Parents are encouraged to work with their children to toggle controls, such as setting their accounts to private mode, limiting personal information displayed to others, and restricting friend requests and messages.

Explicit content is easily found on many of these platforms, though there are settings to cut off interaction with the accounts behind it or to limit such content.

Within the settings of Discord, a community-focused networking platform popular among gamers, users can set the app to scan and delete direct messages that contain explicit content, and to restrict messages to members they know.

