Facebook 'aware of harm it is doing but won't act'

Whistle-blower tells UK lawmakers social media firm needs urgent external regulation

LONDON • Ms Frances Haugen, the former Facebook product manager and now whistle-blower, has appeared before British lawmakers, painting a portrait of a company vividly aware of its harmful effects on society but unwilling to act because doing so could jeopardise its profits and growth.

Ms Haugen's testimony before the British Parliament was the latest step in her choreographed campaign to build a case for stiffer oversight of the social media giant.

Hours before she began speaking in London on Monday, more than a dozen news organisations published articles based on the Facebook Papers, a cache of documents she took before resigning from the company.

In the coming weeks, she is due to meet officials in France, Germany and the European Union about new laws that she says are necessary to force Facebook to recalibrate its measures of success towards the public good.

"We need regulation," she said. "Until the incentives change, Facebook will not change."

Even for Facebook, a company that has lurched from controversy to controversy since Mr Mark Zuckerberg started it as a Harvard University undergraduate in 2004, Ms Haugen's disclosures have created a backlash that stands apart.

The revelations have generated increased political support for new regulation in the United States and Europe, including some calls for Mr Zuckerberg to step aside as CEO, putting Facebook on the defensive. The growing rancour could lead to new government investigations and force the company to disclose more details about how its software works.

"Facebook is failing to prevent harm to children, it's failing to stop the spread of disinformation, it is failing to stop the spread of hate speech," Mr John Nicolson, a lawmaker from Scotland, said during the hearing. "It does have the power to deal with these issues, it's just choosing not to."

Ms Haugen left Facebook with scores of internal research reports, slide decks, discussion threads, presentations and memos that she has shared with lawmakers, regulators and journalists.

The information provides an unvarnished view of how some within Facebook tried to raise alarms about its harmful effects, but often struggled to get the company's leaders to act. Facebook defended its practices and said it had spent US$13 billion (S$17.5 billion) and hired 40,000 people to work on safety issues.

After leaking internal company documents to The Wall Street Journal that resulted in a series of articles beginning in September, Ms Haugen revealed her identity earlier this month in an episode of 60 Minutes and testified before a Senate committee.

She also shared the documents with the Securities and Exchange Commission. Since then, she has shared the Facebook materials with other news organisations, including The New York Times, resulting in additional stories about the social media platform's harmful effects, including its role in spreading election misinformation in the US and stoking divisions in countries such as India.

Ms Haugen's visit to Europe reflects the region's aggressive approach to tech regulation and a belief that its policymakers will act faster than those in the US to pass new laws aimed at Facebook and other tech giants.

In London, Ms Haugen told policymakers that regulation could counter a corporate culture at Facebook that rewards ideas that get people to spend more time scrolling through their social media feeds, while treating safety as a less important "cost centre".

Facebook's influence is particularly bad in areas of Africa, Asia and the Middle East where its services are popular but the company does not have language or cultural expertise, Ms Haugen said.

Without government intervention, she told lawmakers, events in countries such as Ethiopia and Myanmar, where Facebook has been accused of contributing to ethnic violence, are the "opening chapters of a novel that is going to be horrific to read".

Without government-mandated transparency, Facebook can present a false picture of its efforts to address hate speech and other extreme content, she added. The company says artificial intelligence software catches more than 90 per cent of hate speech, but Ms Haugen said it was less than 5 per cent. "They are very good at dancing with data," she said.


A version of this article appeared in the print edition of The Straits Times on October 27, 2021, with the headline 'Facebook 'aware of harm it is doing but won't act''.