Facebook put on the spot as Rohingya crisis goes on

Ultranationalist Buddhist monk Ashin Wirathu posts daily updates, often containing false information, that spread a narrative of the Rohingya as aggressive outsiders in Myanmar. PHOTO: NYTIMES

Myanmar's nationalist Buddhist monk using platform to fan hatred of minority group

YANGON • Myanmar's government has barred ultranationalist Buddhist monk Ashin Wirathu from public preaching for the past year, saying his speeches helped fuel the violence against the country's Rohingya ethnic group that the United Nations calls ethnic cleansing.

So he has turned to an even more powerful and ubiquitous platform to get his message out - Facebook. Every day he posts updates, often containing false information, that spread a narrative of the Rohingya as aggressive outsiders.

And posts like these have put Facebook at the centre of a fierce information war that is contributing to the crisis involving the minority group.

Human rights groups say Facebook should be doing more to prevent the hateful speech, focusing as much on global human rights as on its business. "Facebook is quick on taking down swastikas, but then they don't get to Mr Wirathu's hate speech where he's saying Muslims are dogs," said Mr Phil Robertson, deputy director of Human Rights Watch's Asia division.

In Myanmar, Facebook is so dominant that to many people it is the Internet itself. And the stakes of what appears on the site are exceptionally high because misinformation, as well as explicitly hostile language, is widening long-standing ethnic divides and stoking the violence against the Rohingya.

For example, since the most recent government crackdown on the Rohingya began in August, Mr Zaw Htay, a spokesman for the country's de facto leader Aung San Suu Kyi, has shared dozens of posts on his Facebook page and Twitter account that include images said to show Rohingya burning their own homes. Many of these images have been debunked, yet they still stand.

First-person accounts from Rakhine state have established a coordinated crackdown against the Rohingya minority by the military and by ultranationalist groups, driving more than 600,000 refugees across the border into Bangladesh.


Facebook does not police the billions of posts and status updates that flow through the site worldwide each day, relying instead on an often confusing set of "community standards" and on user reports of direct threats, which are then manually assessed and, in some cases, removed. It has also rolled out guidelines to help users identify fake news and misinformation, but it does not regularly remove misinformation itself.

Facebook has no office in Myanmar, but the company has worked with local partners to introduce a Burmese-language illustrated copy of its platform standards and will "continue to refine" its practices, said spokeswoman Clare Wareing in an e-mailed statement.

Because of Facebook's design, posts that are shared and liked more frequently get more prominent placement in feeds, favouring highly partisan content.

Mr Wirathu has hundreds of thousands of followers on his Burmese and English accounts. His posts include graphic photos and videos of decaying bodies that he says are Buddhist victims of Rohingya attacks, or posts denouncing the minority ethnic group. Facebook has removed some of his posts and restricted his page for stretches, but it is currently active.

Mr Wirathu said that if Facebook did remove his account, he would simply create a new one. He added that if anyone did not like his Facebook posts, "they can sue me".

Rohingya activists also use Facebook, documenting human rights abuses, often with graphic images and videos as evidence. Sometimes the social media company has taken these down.

Ms Wareing said Facebook removes graphic content "when it is shared to celebrate the violence". She said it would allow graphic content if it was newsworthy, or important to the public interest, even if it might otherwise go against the platform's standards.

Mr Richard Weir, an Asia analyst with Human Rights Watch, said the situation was complicated. "It's a really delicate balance here between things that are violent and posted by people who would seek to inflame tensions and those that are trying to disseminate information," he said.

NYTIMES

A version of this article appeared in the print edition of The Sunday Times on October 29, 2017, with the headline 'Facebook put on the spot as Rohingya crisis goes on'.