Misinformation distributed by social platforms like Facebook has almost become old news in the United States, thanks to all the attention focused on Russian troll armies trying to influence the 2016 presidential election. But in some countries such as Myanmar, “fake news” doesn’t just interfere with people’s views about who to vote for—it leads to people being arrested, jailed, and in some cases even killed. And Facebook doesn’t seem to be doing a lot about it.
Southeast Asia is one place where the social network is helping to foment ethnic and political tensions in dangerous ways, according to a number of journalists who cover the region. This effect can be seen in countries like Thailand and Cambodia, but it has become especially severe in Myanmar, where the Rohingya people are being persecuted, driven from their homes, and in some cases raped and killed.
“As complicated as Facebook’s impacts on the politics of the United States are, the impact in Asia may be even more tricky, the consequences more severe, and the ecosystem less examined, both by Facebook and most people in the US,” says Christina Larson, who has written about the region for a number of outlets including Foreign Policy and The Atlantic.
As the situation has escalated over the past six months, journalists working in Myanmar say they have seen waves of Facebook-based misinformation and propaganda aimed at fueling anti-Rohingya fervor, including fabricated reports that families were setting fire to their own homes in an attempt to generate sympathy.
More than 600,000 people have been forced from their homes, and thousands have died in the process, in what the UN has called a “textbook example of ethnic cleansing.”
One of the main sources of anti-Rohingya propaganda is Ma Ba Tha, a group of radical Buddhist monks who have been preaching that the Rohingya are trying to overrun the country and make everyone Muslim. The leader of the group, Ashin Wirathu, has been banned from preaching, but he has been able to spread his message thanks to an orchestrated Facebook campaign.
Larson and others say the problem is compounded by the fact that a majority of Myanmar residents rely on Facebook for their news, yet media literacy remains low, largely because smartphones and social media are still relatively new there.
Until 2014, the SIM cards required to use a mobile phone were prohibitively expensive because they were available only from the country's government-controlled telecom carrier. After the industry opened up, cheap smartphones and $1 SIM cards flooded the market, available from every street vendor, and almost all of those phones came with Facebook installed by default.
“Facebook has basically become the way that people do everything,” says Paul Mozur, a New York Times reporter who covers Myanmar. “It replaces newspapers, it displaces outreach campaigns by NGOs and other agencies trying to reach people especially in remote areas, it replaces just about everything.”
Wirathu, the leader of the anti-Rohingya movement, used to print out paper pamphlets and flyers to spread his incendiary messages, Mozur says, but now he just posts fake images on Facebook and gets 100 times the reach.
This is more or less the same playbook that propagandists of all political stripes use in the US, but in most cases their work doesn't lead to the kind of large-scale violence the Rohingya are being subjected to in Myanmar.
Many of those who have been thrust into this new world of smartphones and social networks in Myanmar “just aren’t used to the level of misinformation or disinformation that’s happening on Facebook,” says Mozur. “Suddenly they’re subject to the full force of an information war coming out of Yangon, orchestrated by much more sophisticated sources, and it’s easy for them to become pawns in that war.”
And what is Facebook doing to help? Not much, some observers say. The social network has relationships with nongovernmental organizations in the region, but only a couple of actual staffers on the ground. “It’s become a bit like an absentee landlord in Southeast Asia,” says Phil Robertson, deputy director of Human Rights Watch’s Asia division.
A Facebook spokesman told CJR the company works hard to keep hate speech and content that incites violence off the platform, that it is working with nonprofit groups to raise awareness of its community standards, and that it has local-language pages that offer tips on safety and security. Facebook says it is also trying to understand the nuances of hate speech in Myanmar.
Those working in the region, however, say the company has too few people working on the problem inside Myanmar, and as a result it doesn’t fully understand what is required and can’t move quickly enough when content spirals out of control.
At one point, Mozur says, messages spread on Facebook Messenger claiming Muslims were planning an attack on September 11, while a separate chain letter, also circulating on Messenger, claimed Buddhists were planning an attack on the same day.
“I don’t know who was behind those messages, it could have been like four people, but it literally brought the country to a standstill,” he says. “A lot of times these rifts are there already, and so in a certain sense I guess Facebook is a mirror, holding itself up to the differences in society. But social media can also become a real catalyst for the violence.”
Larson says there’s a debate to be had about how to define hate speech, “but what I would consider dangerous speech is advocating that the Rohingya need to leave Myanmar, and sharing doctored images of them supposedly burning their own houses to create a media spectacle.”
In a way, she says, these images—which were liked and shared tens of thousands of times—“gave cover for military action and human rights violations, including violence and rape. You can’t say social media kills people ... but certainly social media shaped public opinion in a way that seems to have played a part in the escalation of violence against the Rohingya.”
Facebook’s approach to countries like Myanmar and others in the region often strikes those on the ground as not just out of touch but actively cavalier. In recent experiments, for example, users in countries like Cambodia and Slovakia had news articles moved to a completely separate feed, which local nonprofit groups and media outlets say significantly impacted their ability to reach people with crucial information.
It’s one thing to tread carefully around issues like free speech, Larson says, “but if you’re going to run A/B testing, where you change an algorithm and see what you think consumers like best, for god’s sake, stick to stable democracies. Don’t pick a place where there’s an authoritarian regime that is busy locking up opposition leaders, and Facebook is a primary way that activists communicate about their government.”
In many ways, Myanmar is an example of the future Mark Zuckerberg seems to want: a country in which most people are connected through the social network and get virtually all of their news from it. And yet the outcome of that vision isn’t a utopia but a dystopia—a world where ethnic and cultural tensions are inflamed and weaponized. And Facebook’s response looks inadequate to the dangers it has helped unleash.
Mathew Ingram was CJR’s longtime chief digital writer. Previously, he was a senior writer with Fortune magazine. He has written about the intersection between media and technology since the earliest days of the commercial internet. His writing has been published in the Washington Post and the Financial Times as well as by Reuters and Bloomberg.