NEW YORK • Facebook chief executive officer Mark Zuckerberg has called for new global regulations governing the Internet, recommending overarching rules on hateful and violent content, election integrity, privacy and data portability.
In a statement that was also published as an op-ed in The Washington Post last Saturday, he said the company is seeking regulations that would set baselines for prohibited content and require companies to build systems for keeping harmful content to a minimum.
"We have a responsibility to keep people safe on our services," he said. "That means deciding what counts as terrorist propaganda, hate speech and more. We continually review our policies with experts, but at our scale, we will always make mistakes and decisions that people disagree with."
Facebook has been the target of probes by various governments after news broke about a year ago that it allowed the personal data of tens of millions of users to be shared with political consultancy Cambridge Analytica.
Last month, it came under fire for taking too long to take down a live video of a shooting in New Zealand.
Separately, millions of users had personal information accessed via a recent breach.
Over the past year, lawmakers have focused greater scrutiny on the firm and its immense influence, asking its executives to testify in front of the United States Congress to explain the proliferation of misinformation, hate speech and election manipulation on the platform.
The technology industry has long said that Section 230 of the Communications Decency Act is vital to its ability to operate open platforms. The provision exempts firms from being liable for user-generated content.
Facebook built a content-scanning system to which it has, over the years, added rules in response to shifts in user behaviour or to public uproar after incidents such as the New Zealand mass shooting.
When the website's users or computer systems report posts as problematic, the posts are sent to one of the company's 15,000 content moderators around the world, who are allowed to take down content only if it violates a rule. But that process is not always precise.
"Lawmakers often tell me we have too much power over speech, and frankly I agree," Mr Zuckerberg wrote in his statement. "I have come to believe that we shouldn't make so many important decisions about speech on our own."
He said Facebook would welcome standards for verifying political actors, citing the practice, already deployed in many countries, of verifying advertisers' identities before they are allowed to buy political advertisements. He suggested updating laws to cover "divisive political issues", in addition to candidates and elections.
The billionaire said it would be good for the Internet if more countries adopted rules such as the European Union's General Data Protection Regulation as a common framework. Privacy regulations "should protect your right to choose how your information is used - while enabling companies to use information for safety purposes and to provide services", he said. "It shouldn't require data to be stored locally, which would make it more vulnerable to unwarranted access."
Mr Zuckerberg added that there should also be rules guaranteeing portability of data to protect information when it moves between services. His willingness to embrace regulation could pave the way towards taking the thorniest problems about speech and privacy out of Facebook's hands - or at least give the company more time to solve them.
Singapore Prime Minister Lee Hsien Loong said last Friday that a law will be introduced that will require online news sites to publish corrections or warnings on fake news, or even remove such articles in extreme cases.
Meanwhile, a new law proposed by the Australian government provides for a jail term of up to three years for social media executives and fines of up to 10 per cent of annual turnover for their firms if they fail to quickly remove violent material from their platforms.