Facebook CEO Mark Zuckerberg in Menlo Park, Calif., on Sept. 27, 2015.
Stephen Lam—Reuters
By Laignee Barron
Updated: April 9, 2018 7:28 AM ET | Originally published: April 6, 2018

Of all the reckonings recently brought to bear on Facebook – from its role in election interference to exposure of users’ data – the one that staffers in Menlo Park, Calif., reportedly lose sleep over is the accusation that they facilitated ethnic cleansing in Myanmar.

U.N. investigators have accused Facebook of playing a “determining role” in violence that has driven nearly 700,000 Muslim Rohingya out of the country and killed at least 6,700 people in the first month alone. How exactly, and to what extent, the social media giant affected Myanmar’s military-led campaign of rape, arson and murder remains impossible to quantify, given the absence of available data.

A Facebook spokesperson told TIME via email there is “no place for hate speech” on its platform. But the company does not have an office in Myanmar, and local organizations complained of a lack of Burmese-speaking staff to whom they could report inappropriate content. It can take days or even weeks for flagged content to be removed, they say. In a recent interview with Vox, Facebook CEO Mark Zuckerberg acknowledged the platform’s potential to cause “real world harm” in Myanmar, but noted that when two inflammatory chain messages circulated on Facebook’s Messenger app last September, “our systems detected” them and “stopped those messages from going through.”

Myanmar civil society groups balked at the suggestion that this showed Facebook’s effectiveness; in an open letter shared online Thursday, six organizations criticized what they called the company’s routinely “inadequate response” to improper content, including in the instance Zuckerberg highlighted. They were the ones to report the messages, which were nonetheless allowed to spread for days and which they said “caused widespread fear and at least three violent incidents.”

“This case exemplifies the very opposite of effective moderation: it reveals an over-reliance on third parties, a lack of a proper mechanism for emergency escalation, a reticence to engage local stakeholders around systemic solutions and a lack of transparency,” the groups wrote in the letter. “The risk of Facebook content sparking open violence is arguably nowhere higher right now than in Myanmar.”

Some are skeptical about the extent of Facebook’s influence, citing pogroms that predate the platform’s existence, and other channels – including state-backed television and newspapers – used to legitimize abuse of the country’s estimated 1.1 million Rohingya. Mark Farmaner, director of Burma Campaign UK, tells TIME that “violence against the Rohingya would have happened with or without Facebook,” adding that this does not absolve the company of the need to combat hate speech metastasizing on its platform.

Yet people working in Southeast Asia have long warned of the platform’s potential to weaponize information, amplify ethnic tensions and even incite violence. Facebook arrived in the former pariah state around the same time as the Internet and smartphones. Facebook’s ubiquity in Myanmar is not only part of the problem, it’s also emblematic of what can go wrong when the world’s largest social network also serves as the singular forum for political discourse, news and commerce.

“In Myanmar, Facebook serves as more than a space for social activity and liking cat videos; even the president used the platform to announce his resignation,” says one Yangon-based digital analyst who asked to remain anonymous due to the sensitivity of the issue. Facebook’s pervasiveness in Myanmar is matched only by its monopolizing influence; a 2017 poll found that 73% of people there rely on the site for news, and by some accounts, 85% of the country’s Internet traffic flows through the network.


In many ways, Myanmar epitomizes the changing narrative around Facebook: once perceived as a democratizing force, and lauded during the Arab Spring as a decentralized “people’s” platform that could unify a populace to help bring down tyrants, it has since come under fire for being vulnerable to exploitation by corrupt and despotic regimes. Virtually overnight, the social media giant provided a way to accelerate the spread of incendiary conspiracies and anti-Muslim vitriol that Buddhist nationalists previously disseminated through pamphlets or CDs.

Scrolling through Facebook in Myanmar often reveals a toxic brew of jingoistic fervor and ethnic vilification. Racial epithets, dehumanizing language, photos of dead bodies, politically charged cartoons and fabricated news articles are shared not only by hardline factions, but also by government officials – all of which fosters the impression of consensus and eclipses space for more moderate views. Monitoring groups have said that the majority of the hateful, dangerous speech targets Muslims, often portraying the minority as an existential threat to the Buddhist majority, calling for actions like boycotts, harassment and even deadly violence.

“The hate speech tends to spike at politically sensitive times such as during elections or conflicts,” says the Yangon-based analyst.

Digital researcher Raymond Serrato found evidence of such flare-ups coinciding with the military’s latest operations against the Rohingya. A Facebook group associated with a Buddhist nationalist organization known as Ma Ba Tha appears to have started posting in June 2016, and accelerated its activity the following October when an insurgent ambush triggered brutal army reprisals. Leading up to a second wave of attacks in August 2017, the number of posts again exploded with a 200% increase in interactions, according to Serrato. Scraping data from a military Facebook page revealed similarly timed activity spikes.

“It shows there was a concerted effort to influence the narrative of the conflict by the military and by Buddhist nationalists,” Serrato says.
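The kind of flare-up Serrato describes – a period whose posting activity jumps 200% or more above its recent baseline – can be sketched as a simple trailing-average check. This is an illustrative reconstruction only, not Serrato’s actual methodology, and the counts below are hypothetical:

```python
# Illustrative sketch (not Serrato's actual method): flag "spikes" in a
# time series of weekly post counts, where a spike is a week whose count
# exceeds the trailing-window average by at least `threshold` x 100 percent.

def find_spikes(counts, window=4, threshold=2.0):
    """Return indices of weeks whose post count exceeds the mean of the
    preceding `window` weeks by `threshold` (2.0 = a 200% increase)."""
    spikes = []
    for i in range(window, len(counts)):
        baseline = sum(counts[i - window:i]) / window
        if baseline > 0 and (counts[i] - baseline) / baseline >= threshold:
            spikes.append(i)
    return spikes

# Hypothetical weekly post counts around an escalation in activity:
weekly_posts = [10, 12, 11, 9, 10, 45, 60, 12]
print(find_spikes(weekly_posts))  # → [5, 6]
```

Real analyses of scraped page data would also have to control for overall platform growth, since raw interaction counts in a fast-connecting country rise even without coordinated campaigns.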
While Facebook does not manufacture the message, it does curate the content and determine what users see in their News Feeds. Analysts say this system reinforces echo chambers, and has allowed misinformation to go viral in an environment where digital and news literacy are low. Many people in Myanmar, where until recently Internet penetration was among the lowest in the world, buy smartphones that come preloaded with Facebook accounts and pre-liked pages.

“This is not a neutral platform. There are manipulative and falsifying elements to Facebook, ones that are under scrutiny in the U.K. and the U.S., and similarly should be in Myanmar,” says Robert Huish, an associate professor of International Development Studies at Dalhousie University. He adds that “genocides require bombardments of misinformation to breed hatred” and that Facebook offers a mighty megaphone.

“The speed in which ultra-radical posts disseminated across Myanmar through Facebook was alarming,” Huish says, “and combined with a newly connected population it created a very unique scenario with devastating consequences.”

Correction: The original version of this story misstated the number of Facebook’s Burmese-speaking staff. It has some, not zero.
