‘Facebook: a giant, uncoordinated toddler that repeatedly soils its diaper and then wonders where the stench is coming from.’ Photograph: Leah Millis/Reuters

Facebook's failure in Myanmar is the work of a blundering toddler

The social network ploughs its way through the world and deals with the consequences later. In Myanmar, that strategy has had deadly consequences

When Facebook invited journalists for a phone briefing on Tuesday evening to talk about its progress in tackling hate speech in Myanmar, it seemed like a proactive, well-intentioned move from a company that is typically fighting PR fires on several fronts.

But the publication of a bombshell Reuters investigation on Wednesday morning suggested otherwise: the press briefing was an ass-covering exercise.

This is the latest in a series of strategic mishaps as the social network blunders its way through the world like a giant, uncoordinated toddler that repeatedly soils its diaper and then wonders where the stench is coming from. It enters markets with wide-eyed innocence and a mission to “build [and monetise] communities”, but ends up tripping over democracies and landing in a pile of ethnic cleansing. Oopsie!

Human rights groups and researchers had been warning Facebook since 2013 that its platform was being used to spread misinformation and promote hatred of Muslims, particularly the Rohingya. As its user base in Myanmar exploded to 18 million, so too did hate speech, but the company was slow to react, and earlier this year a UN investigator accused the platform of fuelling anti-Muslim violence.

The Australian journalist and researcher Aela Callan warned Facebook about the spread of anti-Rohingya posts on the platform in November 2013. She met with the company’s most senior communications and policy executive, Elliott Schrage. He referred her to staff at Internet.org, the company’s effort to connect the developing world, and a couple of Facebook employees who dealt with civil society groups. “He didn’t connect me to anyone inside Facebook who could deal with the actual problem,” she told Reuters.

In mid-2014, after false rumours online about a Muslim man raping a Buddhist woman triggered deadly riots in the city of Mandalay, the Myanmar government requested a crisis meeting with Facebook. Facebook said that government representatives should send an email when they saw examples of dangerous false news and the company would review them.

It took until April this year – four years later – for Mark Zuckerberg to tell Congress that Facebook would step up its efforts to block hate messages in Myanmar, saying “we need to ramp up our effort there dramatically”.

Since then it has deleted some known hate figures from the platform, but this week’s Reuters investigation – which found more than 1,000 posts, images and videos attacking Myanmar’s Muslims – shows there’s a long way to go.

A key issue that civil society groups focus on is Facebook’s lack of Burmese-speaking content moderators. In early 2015, there were just two of them.

Until Wednesday of this week, Facebook had refused to reveal how many Burmese-speaking content reviewers it had hired since then.

“We’re still not clear how many Myanmar-speaking reviewers Facebook has despite our repeated requests for transparency around this,” said Ei Myat Noe Khin, the digital rights manager at Phandeeyar, a tech innovation lab in Myanmar. “Facebook has been more proactive reaching out to civil society, but it’s important they don’t rely on civil society as an alternative to hiring more staff.”

On the Tuesday evening call, Ellen Silver, Facebook’s VP of operations, said the figure would be “misleading” because some content, such as nudity, doesn’t require local-language expertise.

Facebook had a change of heart on Wednesday, after Reuters revealed that the company had 60 reviewers who could speak Burmese: it published a blogpost confirming the number and pledged to hire 40 more by the end of the year.

The company still has no office or staff in Myanmar.

During the call, the company also said that one of the reasons it couldn’t moderate content effectively was because users weren’t using its reporting tools. This might have something to do with the fact that those reporting tools – including the text in drop-down menus attached to objectionable posts – were only translated into Burmese in late April/early May this year.

Advocates warned Facebook that the Rohingya have faced violence as a result of misinformation. Photograph: Soe Zeya Tun/Reuters

Facebook is also upgrading its technological systems to proactively identify offending content. Civil society groups have been underwhelmed by results so far.

“Their AI can’t detect real hate speech and rumours. Mostly it detects just the words like ‘Ma Ba Tha’ [a Buddhist monk-led nationalist group] and ‘Buddha religious’ or something like that,” said Myat Thu from Burma Monitor, a not-for-profit. “It’s not the tracing root cause. Only Burmese content reviewers can know the local context.”

One policy change that sounds promising on the face of it is the deletion of inaccurate or misleading information created or shared “with the purpose of contributing to or exacerbating violence or physical harm”. Until last month, the company would only de-rank misinformation, and would delete only those threats specific enough to meet its credible-violence threshold.

The policy was launched first in Sri Lanka, where disinformation has also triggered a spate of mob violence, and is now being rolled out to Myanmar.

When the Guardian asked how the notoriously metrics-focused company would measure the success of the policy, the answer was characteristically mealy-mouthed: “Our goal is to get better at identifying and removing abuses of our platform that spread hate and can contribute to offline violence or harm, so people in Myanmar can safely enjoy the benefits of connectivity.”

When pushed again to specify how it would measure this, a spokeswoman said “that’s difficult”.

In spite of these problems, both Burma Monitor and Phandeeyar welcomed the additional focus on Myanmar after years of inaction.

“It’s really too early to talk of progress, but Facebook is showing more willingness to engage with the problem, which is positive,” said Phandeeyar’s Ei Myat Noe Khin.

“We hope that Facebook is committed to solving these problems over the long term – not just focusing on short term Band-Aid solutions, while there is a high level of public scrutiny,” she added.
