Facebook Papers reveal social network's problems in Asia

Facebook is said to be a hotbed of hate speech and misinformation in non-English-speaking Asian countries, according to findings from leaked papers. PHOTO: AFP

Facebook has been embroiled in a wave of leaks since former employee Frances Haugen turned whistle-blower by releasing documents on the inner workings of the social media platform.

On Monday (Oct 25), more about the company was revealed in what has come to be known as the Facebook Papers, as redacted versions of the internal documents have been provided by Ms Haugen's legal counsel to the US Congress and then obtained by 17 American news organisations.

These are some of the findings gathered from those files:

1. Facebook is a hotbed of hate speech, misinformation in non-English-speaking Asian countries

An internal memo dated 2019 showed that the company has long had "compelling evidence" that its platform's basic functions, such as "recommendations and optimising for engagement", were helping to "actively promote" hate speech and misinformation.

According to documents viewed by The New York Times, Facebook tested its algorithm to see what it was like to experience the platform as a user in Kerala, India. After three weeks, the test user's news feed reportedly descended into "a near constant barrage of polarising nationalist content, misinformation, and violence and gore".

Even so, the company remains overwhelmingly focused on its English-speaking user base. Ms Haugen said that 87 per cent of Facebook's spending on combating misinformation goes to English content, even though only 9 per cent of its users are English speakers.

Resource problems reportedly led to Facebook scaling back efforts to limit misinformation posts in Myanmar, according to Mashable, despite initially demoting those false posts during elections in November last year.

The tech news site said Facebook's artificial intelligence is trained only in five languages, not including Burmese, and the lack of resources devoted to combating false posts may have helped "inflame" the February coup.

2. Zuckerberg personally acceded to Vietnam's censorship demands

Late last year, Vietnam's ruling Communist Party demanded that Facebook censor anti-government dissidents, threatening to ban its platforms in the country if it did not comply.

Facebook chief executive Mark Zuckerberg decided that the company would comply, according to The Washington Post, bowing to demands to remain online in a market where the social network earns more than an estimated US$1 billion (S$1.3 billion) in revenue.

Before Vietnam's party congress in January this year, when the country picked its Central Committee for the next five years, Facebook took down posts that were "anti-state", affording the government an almost absolute authority over the social network during voting. According to the company's Transparency Report, the number of times Facebook restricted content in Vietnam has gone up by 983 per cent since 2019.

The Vietnamese authorities have been known to restrict free speech and this year sentenced three freelance journalists to prison terms of between 11 and 15 years after they were found guilty of spreading anti-state propaganda.

Vietnam expert Nguyen Khac Giang of the Victoria University of Wellington in New Zealand told the BBC that most freelance journalists chose to publish on Facebook, used by millions in the country.

"We've faced additional pressure from the government of Vietnam to restrict more content. However, we will do everything we can to ensure that our services remain available so people can continue to express themselves," a Facebook spokesman told the BBC last year, in defending the company's decision to comply with the Vietnamese government's demands.

Facebook chief executive Mark Zuckerberg decided that the company would comply with Vietnam's ruling party to censor anti-government dissidents. PHOTO: AFP

3. Facebook allocates resources to countries in 'tiers' during elections

At an internal Civic Summit in 2019, Facebook announced intentions to protect elections around the world and sorted countries into different "tiers" for monitoring, reported tech news site The Verge.

India, Brazil and the United States were in "Tier Zero", the highest priority, for which "war rooms" were set up to monitor the network round the clock for problems that staff would highlight to alert local election officials.

Despite this allocation of resources, misinformation has been allowed to spread in places like India, where its 340 million users make the country one of Facebook's largest markets.

Documents showed that Facebook conducted an undated study into the Rashtriya Swayamsevak Sangh, a nationalist organisation in India. The group had used the platform to disseminate inflammatory and misleading content.

But much of that content was "never flagged or actioned" because Facebook lacked non-English language resources and machine-learning classifiers capable of detecting hate speech.

According to Mashable, the majority of misinformation and divisive posts in Hindi or Bengali - two of India's most used languages - never gets flagged due to inadequate data.

Countries in the lower tiers are given fewer resources and receive little protection during elections unless content moderators intentionally escalate issues, even though Facebook is effectively the Internet for users in some of these countries.

4. Apple threatened to ban Facebook, Instagram from its App Store

Apple, the maker of iPhone and iPad devices, once threatened to pull Facebook and photo-sharing platform Instagram from its App Store over concerns that the platforms were being used to sell women as maids in the Middle East and South Asia.

Purported ads for maids with pictures of women and their biographic details, as well as prices, had been shared on the platforms, according to AP.

Apple's threat was dropped only after Facebook responded by disabling more than 1,000 accounts. Facebook admitted it had been "under-enforcing on confirmed abusive activity".
