NEW YORK (NYTIMES) - Last week, after frustrated activists from Myanmar sent an open letter to Mark Zuckerberg, the chief executive of Facebook, they got something unexpected: a reply.
The activists, representing six civil society organisations, harshly criticised Zuckerberg in the letter, saying he had mischaracterised Facebook's response to violence-inciting messages in Myanmar and had not devoted sufficient resources to enforcing its hate speech rules in the violence-stricken country.
Zuckerberg wrote back to the group the next day from his personal email address, apologising for misspeaking and outlining steps that Facebook was taking to increase its moderation efforts.
Zuckerberg's email, which was provided to The New York Times by the activist groups, was the chief executive's first direct communication with the local groups that have criticised Facebook's role in the country's growing humanitarian crisis. Facebook has been accused by United Nations investigators and human rights groups of facilitating violence against Rohingya Muslims, a minority ethnic group, by allowing anti-Muslim hate speech and false news to spread on its platform.
Facebook is a dominant source of information in Myanmar, and civil society groups have accused it of acting as a kind of absentee landlord, with too few moderators and safeguards in place to keep extremists from using Facebook posts to incite violence.
In his email, Zuckerberg said Facebook had added "dozens" of Burmese language content reviewers to monitor reports of hate speech and had "increased the number of people across the company on Myanmar-related issues," including a product team working on building tools to try to help stem the violence there.
The disagreement centres on a chain letter that spread on Facebook Messenger in Myanmar in September. The messages warned Buddhist communities of an imminent Muslim attack. Meanwhile, Muslim populations received a separate message cautioning them of violence from militant Buddhist groups.
Civil society groups say the messages paralysed major cities in Myanmar and raised fears of a violent clash. Such incitement and scaremongering have become far too typical on Facebook, according to the groups, which say Facebook has repeatedly failed to follow through on promises to devote more resources to the issues.
In an interview last week, Zuckerberg appeared to hold up the September episode as a model of Facebook's effectiveness, and said the company's systems had detected the messages and stopped them. In fact, the activists said, they flagged the messages repeatedly to Facebook, barraging its employees with strongly worded appeals until the company finally stepped in to help.
Zuckerberg's personal email did not quell the activists' frustration. The groups say the biggest obstacle to their attempts to push back against a torrent of dangerous hate speech is not their lack of resources but Facebook itself. They said Facebook had a history of pledging to do more to help quell ethnic violence in Myanmar but had not fulfilled its promises.
"It's great that he's engaging personally with this, but the stuff he's talking about is really not that much different from what they've been saying for the past few years," said Jes Petersen, chief executive of Phandeeyar, an innovation lab in Myanmar that has worked with Facebook to produce localised versions of its community standards.
A Facebook spokeswoman, Debbie Frost, confirmed the authenticity of Zuckerberg's email, and said Facebook was planning to continue engaging with the activists.
Years after civil society groups first began flagging hate speech in Myanmar, the company still has no permanent office or staff in the country and seems to be struggling to give its platform sufficient oversight.
In Germany, where hate speech laws require vigilant attention from content reviewers, Facebook has hired about 1,200 moderators. To achieve the same ratio of users to moderators in Myanmar, Facebook would need around 800 reviewers in the country, Petersen calculated.
"Dozens of content reviewers is not going to cut it," he said.
The civil society groups have already responded to Zuckerberg's reply, asking for hard data about Facebook's efforts in the region, including how many Burmese-speaking reviewers the company has, how many accounts the company has taken down in Myanmar and how long, on average, it takes for Facebook to respond to reports of hate speech.
"A lot of what they've been doing is cosmetic - it's not the tangible improvement we're looking for," said Victoire Rio, a social media analyst in Myanmar who was named in Zuckerberg's reply.
Activists in other developing countries have raised similar complaints about Facebook's behaviour. In Indonesia, politicians have called Facebook executives to account for the spread of disinformation. In the Philippines, critics of President Rodrigo Duterte have faced barrages of threatening posts. Last month, the government of Sri Lanka ordered Facebook blocked in an attempt to stem mob violence against Muslim communities.
Last month, Adam Mosseri, the head of Facebook's News Feed, said in an interview that he and other Facebook executives "lose some sleep" over the possibility that Facebook had led to real-world violence.
Petersen said he hoped Zuckerberg's appeal would spur actual change and not just expressions of worry. "I wonder how he spent those sleepless nights - because we didn't see that much change," he said.
Here is the full text of Zuckerberg's email to the civil society groups, followed by the groups' response:
"Dear Htaike Htaike, Jes, Victoire, Phyu Phyu and Thant, I wanted to personally respond to your open letter. Thank you for writing it and I apologise for not being sufficiently clear about the important role that your organisations play in helping us understand and respond to Myanmar-related issues, including the September incident you referred to.
In making my remarks, my intention was to highlight how we're building artificial intelligence to help us better identify abusive, hateful or false content even before it is flagged by our community.
These improvements in technology and tools are the kinds of solutions that your organisations have called on us to implement and we are committed to doing even more. For example, we are rolling out improvements to our reporting mechanism in Messenger to make it easier to find and simpler for people to report conversations.
In addition to improving our technology and tools, we have added dozens more Burmese language reviewers to handle reports from users across all our services. We have also increased the number of people across the company on Myanmar-related issues and we now have a special product team working to better understand the specific local challenges and build the right tools to help keep people there safe.
There are several other improvements we have made or are making, and I have directed my teams to ensure we are doing all we can to get your feedback and keep you informed.
We are grateful for your support as we map out our ongoing work in Myanmar, and we are committed to working with you to find more ways to be responsive to these important issues."
Here is the civil society groups' response:
The half-dozen signatories of the response include Phandeeyar, a leading technology hub in the country; the Myanmar ICT for Development Organization, which monitors online hate speech; and the Center for Social Integrity.
"Dear Mark, Thank you for responding to our letter from your personal email account. It means a lot.
We also appreciate your reiteration of the steps Facebook has taken and intends to take to improve your performance in Myanmar.
This doesn't change our core belief that your proposed improvements are nowhere near enough to ensure that Myanmar users are provided with the same standards of care as users in the US or Europe.
When things go wrong in Myanmar, the consequences can be really serious - potentially disastrous. You have yourself publicly acknowledged the risk of the platform being abused towards real harm.
Like many discussions we have had with your policy team previously, your email focuses on inputs. We care about performance, progress and positive outcomes.
In the spirit of transparency, we would greatly appreciate if you could provide us with the following indicators, starting with the month of March 2018:
- How many reports of abuse have you received?
- What per cent of reported abuses did your team ultimately remove due to violations of the community standards?
- How many accounts were behind flagging the reports received?
- What was the average time it took for your review team to provide a final response to users of the reports they have raised? What per cent of the reports received took more than 48 hours to receive a review?
- Do you have a target for review times? Data from our own monitoring suggests that you might have an internal standard for review - with most reported posts being reviewed shortly after the 48-hour mark. Is this accurate?
- How many fake accounts did you identify and remove?
- How many accounts did you subject to a temporary ban? How many did you ban from the platform?
Improved performance comes with investments, and we would also like to ask for more clarification around these. Most importantly, we would like to know:
- How many Myanmar-speaking reviewers did you have, in total, as of March 2018? How many do you expect to have by the end of the year? We are specifically interested in reviewers working on the Facebook service and are looking for a full-time-equivalents figure.
- What mechanisms do you have in place for stopping repeat offenders in Myanmar? We know for a fact that fake accounts remain a key issue and that individuals who were found to violate the community standards on a number of occasions continue to have a presence on the platform.
- What steps have you taken to date to address the duplicate posts issue we raised in the briefing we provided your team in December 2017?
We're enclosing our December briefing for your reference, as it further elaborates on the challenges we have been trying to work through with Facebook."