Facebook, Twitter, YouTube pressed by US lawmakers on algorithms

Monika Bickert, Facebook's vice-president for content policy, makes an opening statement during the hearing. PHOTO: EPA-EFE

WASHINGTON (BLOOMBERG) - Executives from Facebook, Twitter and Alphabet's YouTube are fielding questions from senators about how user content is shared and highlighted on their platforms as lawmakers weigh changes to liability protections for Internet giants.

Tuesday's hearing by the Senate Judiciary Committee's panel on Privacy, Technology and the Law is focusing on algorithms - the lines of software code that determine how user-generated information is displayed and who gets to see it.

The panel is scrutinising how those formulas shape public discourse.

"I plan to use this hearing as an opportunity to learn about how these companies' algorithms work, what steps may have been taken to reduce algorithmic amplification that is harmful and what can be done better," said Delaware Senator Chris Coons, a Democrat and the subcommittee's chair, as he opened the hearing.

Nebraska Senator Ben Sasse, the panel's ranking Republican, said "algorithms, like almost any technologies that are new, have costs and benefits" in his opening remarks.

He said algorithms can be misused, "driving us into poisonous echo chambers."

The hearing comes as Congress takes a broader look at how to overhaul Section 230, a provision of the 1996 communications law that protects Internet companies from liability for user content. One House proposal would make social media platforms responsible for the way content is shared and amplified through algorithms.

Illinois Senator Dick Durbin, the Democratic chair of the full Judiciary Committee, urged social media companies to do more to remove harmful content, citing the Jan 6 attack on the US Capitol. He said domestic extremists organised and shared disinformation on some of the platforms represented at Tuesday's hearing.

Witnesses include Monika Bickert, Facebook's vice-president for content policy; Alexandra Veitch, YouTube's director of government affairs and public policy for the Americas and emerging markets; and Lauren Culbertson, Twitter's head of US public policy.

Bickert emphasised Facebook's tools to make the platform's algorithm more transparent, so users can see why certain posts appear on their news feed. She said Facebook is working to improve the content each person sees, to make it more relevant and meaningful for them.

"It is not in our interest financially or reputationally" to push people towards extremist content, Bickert said.

In her opening statement, Culbertson highlighted the positive uses for algorithms and machine learning, especially the ability to recognise harmful content to review and remove. She said Twitter is committed to studying the unintended consequences of algorithms and to giving users more choice over how algorithms shape their experience.

"As members of Congress and other policy makers debate the future of Internet regulation, they should closely consider the ways technology, algorithms, and machine learning make Twitter a safer place for the public conversation and enhance the global experience with the internet at large," Culbertson said.

Veitch said YouTube also uses an automated process to detect videos that violate the company's policies, and algorithms can be used to promote trusted sources and minimise content that's questionable. She described YouTube as "not just a hobby, but a business" for people who create and share videos on the platform.

Information - and disinformation

The role that algorithms play in sharing information - and disinformation - has taken on renewed importance as people turn to social media to learn and comment on issues such as Covid-19 vaccines, protests over police killings and election security. As Durbin indicated, the platforms have been under increased scrutiny since supporters of former president Donald Trump amplified disinformation ahead of the Jan 6 attack.

Trump was suspended by Facebook, Twitter and YouTube for comments that the companies said could lead to violence. Facebook's Oversight Board is reviewing the decision, while YouTube has left open the possibility of reversing the suspension.

Twitter said its ban of Trump's account is permanent.

Facebook has been advocating for updated Internet regulation, including new privacy rules. It has also called for election protection measures and an overhaul of Section 230 to require more transparency, reporting requirements and best-practice guidelines for larger companies. As part of this campaign, Facebook is buying ads in the nation's capital pointing out how much the Internet has changed in the 25 years since current regulations became law.