‘Ecosystem of accountability’ needed for social media: Facebook whistleblower


Ms Frances Haugen, speaking at the Online Harms Symposium organised by the Ministry of Law and SMU's Yong Pung How School of Law on Sept 25, suggested Singapore could act as a regional leader, building a coalition with other countries to tackle online harms.

ST PHOTO: HESTER TAN


SINGAPORE – An “ecosystem of accountability” is needed around social media platforms, where regulators, activists and experts can put pressure on companies to conform to safety standards that mitigate the risk of online harms, said online safety advocate Frances Haugen.

Delivering her keynote address at the Online Harms Symposium at the Singapore Management University (SMU) on Monday, Ms Haugen referenced a comment made by Instagram head Adam Mosseri in 2021 comparing social media to cars.

Mr Mosseri had said that while many died in car accidents, the benefits of the vehicles outweighed the costs, and that this was also the case for social media platforms. 

Ms Haugen noted on Monday, however, that cars were heavily regulated, with various parties pushing for greater safety measures to be introduced over the years. 

“So, why is it we are missing the same ecosystem of accountability, all these different actors with different expertise? Why are we missing this when it comes to social media?” she asked.

A former Facebook employee, Ms Haugen left the social media giant now known as Meta in 2021 and became a whistleblower, leaking internal documents showing that the company knew of the potential harm caused by its sites.

In 2022, the American launched Beyond The Screen, a non-profit organisation that fights online harms.

She pointed to a 2021 blog post by Facebook’s current global affairs president, Mr Nick Clegg, which stated that the algorithms pushing suggested content are driven by factors such as a user’s interests and the accounts they follow.

However, Mr Clegg’s comments did not paint the whole picture, she said, noting that greater engagement with suggested content tends to lead to more extreme content being suggested.

For example, engaging with pages and groups that turn up when searching for “healthy eating” on Facebook could eventually lead to content promoting eating disorders, she suggested.

While Facebook and other social media platforms may have solutions to such problems, there is no financial incentive for them to introduce these solutions, she asserted.

Greater transparency is needed from social media platforms, Ms Haugen said, noting that the current situation is one of “information asymmetry”, in which users and regulators have no access to the information the companies hold about the impact of their platforms.

She praised efforts by the European Union to demand greater accountability from social media firms under the Digital Services Act. The Act, whose rules for the largest platforms took effect in August, requires online platforms to provide greater transparency on their content moderation and algorithmic recommendations, among other measures.

Safety systems also need to account for linguistic diversity, Ms Haugen said, noting that excluding languages and dialects could leave certain communities unprotected online.

Ms Haugen suggested that Singapore could act as a regional leader, building a coalition with other countries to tackle online harms.

Later, in a panel discussion on solutions for online harms, Institute of Policy Studies principal research fellow Carol Soon noted that various countries have introduced or amended laws to deal with online harms.

They include Australia’s Online Safety Act, New Zealand’s Harmful Digital Communications Act and Britain’s Online Safety Bill.

During the discussion, Ms Haugen suggested that countries could introduce a consumer Bill of Rights for online platforms, which would give social media users more rights in understanding why certain content is being pushed to them, for example.

SMU’s Professor Lim Sun Sun noted that victims of online harms often do not report their cases as they feel doing so would be futile, or their experiences would be trivialised by the people around them.

Normalising online harms is an unhealthy attitude to take, she said.

“We should not allow our society to fall into the trap of assuming that online harms are a necessary part and parcel of the online experience. I think once we get into that notion, then we have given up completely.”
