Australia’s online safety regulator responds fast to remove harmful content

eSafety Commissioner Julie Inman Grant delivering her keynote at the Online Harms Symposium at the Singapore Management University on Sept 26. PHOTO: MINISTRY OF LAW

SINGAPORE – A video showing a transgender indigenous Australian youngster being beaten up in school was removed from Instagram just 12 minutes after it was reported to eSafety – Australia’s online safety regulator.

It was one of the quickest removals of harmful online content carried out by the agency, eSafety Commissioner Julie Inman Grant told The Straits Times on Monday, in an interview during the three-day Online Harms Symposium.

At the conference, held at the Singapore Management University from Monday to Wednesday, Ms Inman Grant underscored the advantages of a dedicated and independent safety regulator in tackling online harms.

eSafety acts to protect people in Australia when social media platforms fail to remove harmful content.

The regulator was established in 2015 to protect children from online harms, and its scope subsequently expanded to tackle online harms against people of all ages, said Ms Inman Grant, who took on her role in 2017.

Online safety laws that came into force in 2022 grant the regulator the power to order social media firms to remove flagged harmful content within 24 hours or face penalties.

Singapore adopted a similar approach under the Online Safety (Miscellaneous Amendments) Act.

It gives the authorities the power to order social media platforms here to remove egregious content, including material relating to harassment, child sexual exploitation and terrorism.

In Australia, eSafety has helped speed up the response time for victims to three hours in most cases, said Ms Inman Grant.

Speaking at the symposium on Tuesday, she said victims of online harms can typically seek recourse from the online platforms or the police.

But getting the posts removed through these avenues tends to take time, said Ms Inman Grant, who cited examples of online sextortion – a form of blackmail in which someone threatens to share intimate images or videos of another person.

“Often when people come to us, they just want their images taken down, they don’t want to be tethered to their former partner through litigation, which of course, can take a long time,” Ms Inman Grant told a room of industry players and lawyers, among other guests.

“So what we provide is rapid assistance. I don’t think we would have the same degree of success with the platforms, particularly where there are grey areas, if we didn’t have these (remedial) powers.”

eSafety is now the go-to agency for matters of online harm in Australia.

A red button labelled “Report abuse” is prominently displayed on the agency’s website, allowing people to report cases of cyberbullying and image-based abuse, such as the circulation of intimate photos and videos.

Victims typically get a response within three hours, said Ms Inman Grant on stage.

The regulator can then order social media platforms to take down the harmful material or face a fine.

eSafety has investigated some 8,000 instances of abhorrent or violent online material, and has issued at least 46 notices against content related to murder, terrorism and other crimes.

It is also empowered to order a stop to the broadcast of terrorist content online in the event of a crisis, said Ms Inman Grant, citing the case of the 2019 mosque shootings in Christchurch, New Zealand.

The mass shooting was broadcast on Facebook Live via a headcam the gunman was wearing.

Roughly 90 per cent of harmful content flagged to social media companies is removed, she added.

She said: “While this number of investigations may seem relatively small compared to the large amount of violent content existing online, we do have to prioritise and be realistic.”

She added: “A small agency in Australia cannot go to war with the Internet alone. All this illustrates why it is so important for other governments around the world to strengthen laws to prevent the hosting of illegal material.”

Australian eSafety commissioner Julie Inman Grant said her agency typically responds to reports of abuse within three hours. The agency has investigated 8,000 instances of abhorrent or violent online material. ST PHOTO: JASON QUAH

On Monday, Singapore Minister for Law and Home Affairs K. Shanmugam said in a speech at the conference that more changes to the law needed to be made here to better protect victims.

He said Singapore has legal frameworks to take down harmful content circulating on social media, but there is still a need to consider how victims can better protect themselves, and he encouraged discussions on how to improve the system.

Lawyer and SG Her Empowerment (SHE) founding chairwoman Stefanie Yuen Thio said at the online harms panel discussion that she was in support of an independent agency for online safety.

SHE, a non-profit group which supports victims of online harm, has helped at least 66 clients since it was launched in January. 

“I’d like to see some kind of independent agency that is able to do the stuff that (eSafety) is doing,” she said, adding that the group’s surveys have found that the main concern among victims was the ability to take immediate action.
