Former Meta employee tells Senate company failed to protect teens’ safety


NEW YORK - A former Meta employee testified before a US Senate sub-committee on Tuesday, alleging that the Facebook and Instagram parent company was aware of harassment and other harms facing teens on its platforms, but failed to address them.

The employee, Mr Arturo Bejar, worked on well-being for Instagram from 2019 to 2021, having earlier served as a director of engineering for Facebook’s Protect and Care team from 2009 to 2015, he said.

Mr Bejar testified before the Senate Judiciary Sub-committee on Privacy, Technology and the Law at a hearing about social media and its impact on teen mental health.

“It’s time that the public and parents understand the true level of harm posed by these ‘products’ and it’s time that young users have the tools to report and suppress online abuse,” he said in written remarks made available before the hearing.

Mr Bejar’s testimony comes amid a bipartisan push in Congress to pass legislation that would require social media platforms to provide parents with tools to protect children online.

The goal of his work at Meta was to influence the design of Facebook and Instagram in ways that would nudge users towards more positive behaviours and provide tools for young people to manage unpleasant experiences, Mr Bejar said at the hearing.

Meta said in a statement that it is committed to protecting young people online, pointing to its backing of the same user surveys Mr Bejar cited in his testimony and its creation of tools like anonymous notifications of potentially hurtful content.

“Every day countless people inside and outside of Meta are working on how to help keep young people safe online,” the Meta statement said. “All of this work continues.”

Mr Bejar told senators that he regularly met senior executives at the company, including chief executive Mark Zuckerberg, and considered them supportive of the work at the time.

He later concluded, however, that the executives had decided “time and time again to not tackle this issue”, he testified.

In one 2021 e-mail, Mr Bejar flagged to Mr Zuckerberg and other top executives internal data revealing that 51 per cent of Instagram users had reported having a bad or harmful experience on the platform in the past seven days, and that 24.4 per cent of children aged 13 to 15 had reported receiving unwanted sexual advances.

A separate document showed that 13 per cent of all 13- to 15-year-old Instagram users surveyed said they had received unwanted advances.

Mr Bejar also told them that his own 16-year-old daughter had been sent misogynistic comments and obscene photos, without adequate tools to report those experiences to the company. The existence of the e-mail was first reported by the Wall Street Journal.

In his testimony, Mr Bejar recounted that, in one meeting, Meta chief product officer Chris Cox was able to cite precise statistics on teen harms off the top of his head.

“I found it heartbreaking because it meant that they knew and that they were not acting on it,” said Mr Bejar. REUTERS
