With Alex Jones, Facebook’s Worst Demons Abroad Begin to Come Home

Chillingly similar Facebook-linked problems are becoming increasingly visible in wealthy, developed countries.

To Americans, Facebook’s Alex Jones problem might seem novel, even unprecedented.

When does speech become unsafe? When can it be limited? Should those decisions be up to a private company at all? And if a company shies away from acting, as Facebook did with Jones until Apple moved first, where does that leave the rest of us?

But to activists and officials in much of the developing world, both the problem and Facebook’s muddled solutions will be old news.

Before there was Alex Jones, the U.S. conspiracy theorist, there was Amith Weerasinghe, the Sri Lankan extremist who used Facebook as his personal broadcast station.

Weerasinghe leveraged Facebook’s newsfeed to spread paranoia and hatred of the country’s Muslim minority. He enjoyed near-total freedom on the platform, despite repeated pleas from activists and officials for the company to intervene, right up until his arrest on charges of inciting a riot that killed one Muslim and left many more homeless.

Before there was Weerasinghe, there was Ashin Wirathu, the Myanmar extremist, whose Facebook hoaxes incited riots in 2014. Three years later, Wirathu would contribute to a wave of Facebook-based rumors and hate speech that helped inspire widespread violence against Myanmar’s Rohingya minority.

And so on.

“Facebook doesn’t seem to get that they’re the largest news agency in the world,” Harindra Dissanayake, a Sri Lankan official, said a few days after Weerasinghe’s arrest.

The problem, he said, goes beyond a few under-regulated extremists. It also involves the algorithm-driven newsfeed that is core to the company’s business model. “They are blind to seeing the real repercussions,” Dissanayake said of Facebook’s leaders.

Developing countries’ experiences with Facebook suggest that the company, however noble its intent, has set in motion a series of problems that we are only beginning to understand and that it has proved unable or unwilling to fully address:

— Reality-distorting misinformation that can run rampant on the newsfeed, which promotes content that will reliably engage users.

— Extremism and hate speech that tap into users’ darkest impulses and polarize politics.

— Malicious actors granted near-limitless reach on one of the most sophisticated communications platforms in history, relatively unchecked by social norms or traditional gatekeepers.

— And a private company reluctant to wade into contentious debates, much less pick winners and losers.

Facebook — and many Westerners — have long treated those issues as safely “over there,” meaning in countries with weaker institutions, lower literacy rates and more recent histories of racial violence. Last month, a company official, announcing new policies to restrict speech that leads to violence, referred to “a type of misinformation that is shared in certain countries.”

But chillingly similar Facebook-linked problems are becoming increasingly visible in wealthy, developed countries like the United States. So is the difficulty of solving those problems — and the consequences of Facebook’s preference for action that can be incremental, reactive and agonizingly slow.

Sri Lankan soldiers patrol following communal violence between Muslims and Buddhists, in the village of Digana near Kandy, Sri Lanka. Photo: Adam Dean/The New York Times

‘Something Bad Could Happen’

Although Facebook officials often portray the violence associated with it as new or impossible to predict, the incidents date to at least 2012. So does the pressure to more actively regulate speech on the platform.

That year, fake reports of sectarian violence went viral in India, setting off riots that killed several people and displaced thousands. Indian officials put so much pressure on Facebook to remove the posts that U.S. officials publicly intervened in the company’s defense.

Reports of Facebook-linked violence only grew in India, and as Facebook expanded to other developing countries, similar stories followed.

“I think in the back deep-deep recesses of our minds, we kind of knew something bad could happen,” Chamath Palihapitiya, a senior executive who left Facebook in 2011, said at a policy conference last year. “We have created tools that are ripping apart the social fabric of how society works.”

There were other warnings, typically from activists or civil society leaders in the developing countries where Facebook’s expansion was fastest and most obviously disruptive. But they were little heeded.

“Facebook is the platform that we could not meet with for years,” Damar Juniarto, who leads an Indonesian organization that tracks online hate groups, told me in March.

As a Facebook-based group called the Muslim Cyber Army organized increasingly elaborate real-world attacks, Juniarto said, Facebook proved unresponsive. “How are we supposed to do this?” members of his group wondered. “Is it a form? Do we email them? We want them to tell us.”

Facebook representatives eventually met with Juniarto, and the company has shut most pages associated with the Muslim Cyber Army.

Still, the episode seems to fit a pattern of Facebook waiting to respond until after a major disruption: an organized lynching, a sectarian riot, state-sponsored election meddling or, as with the so-called Pizzagate rumor pushed by Jones, a violent close call set off by misinformation.

A Corporate Regulator of Public Life

In the developing countries where such incidents seem most common, or at least most explicitly violent, Facebook simply faces little pressure to act.

In Sri Lanka, government officials spoke of the company as if it were a superpower to be feared and appeased.

Tellingly, Facebook grew more proactive in Myanmar only after the United Nations and Western organizations accused it of having played a role in spreading the hate and misinformation that contributed to acts of ethnic cleansing.

Even officials in India, a major power, struggled to get the company to listen. Indian pressure on Facebook, however, has dropped since the arrival of new government leaders who rose, in part, on a Hindu nationalist wave still prevalent on social media.

U.S. officials have far greater leverage over Facebook, as members of Congress proved when they summoned Mark Zuckerberg, its chief executive officer, to testify in April. But the Americans seem unsure what they want Facebook to do or how to compel it to act. So they, too, are not very effective at changing the company’s behavior.

More broadly, Americans seem unsure precisely how far Facebook should go in regulating speech on the platform, or what it should do about the data suggesting that misinformation is more common on the political right.

All of which comes through in Facebook’s hesitation about shutting down Jones’ page, despite his long record of demonstrable falsehoods that have real-world consequences. U.S. commitment to free speech is unusually tied into the country’s sense of itself. Still, the dilemma here is not so different from those that government officials and Facebook itself face in places like Indonesia or Sri Lanka.

So while few are comfortable — perhaps Facebook least of all — with a private company acting as a vastly powerful regulator of public speech, even fewer seem willing to step in and take on the task themselves.

Ashin Wirathu, an ultranationalist Buddhist monk, prays amongst his followers before he delivers a sermon at the Thein Taung Monastery in Kalaw, Myanmar. Photo: Adam Dean/The New York Times

Move Fast and Break Things

There are growing indications that Facebook’s problems in rich countries may go beyond misinformation to the kind of real-world harm that developing countries have experienced.

Karolin Schwarz, who runs a Berlin-based organization that tracks social media misinformation, said she believed Facebook-based rumors about refugees could be fueling the spate of hate crimes against them.

“I think it does something to their sense of community,” she said. “These things, if they reach thousands of people, you cannot get it back.”

The platform has grown so powerful, so quickly, that we are still struggling to understand its influence. Social scientists regularly discover new ways that Facebook alters the societies where it operates: a link to hate crimes, a rise in extremism, a distortion of social norms.

After all, Jones, for all his demagogic skills, was tapping into misinformation and paranoia already on the platform.

In Germany, Gerhard Pauli, a state prosecutor based in Hagen, told me last month about a local firefighter trainee who had grown so fearful of refugees that he attempted to burn down a local refugee group house. “I’m quite sure that social media made it worse,” he said.

Pauli said that his office spent more and more time tracking rumors and hate speech on Facebook, and that it seemed to rise in advance of violence, as when the mayor of nearby Altena was stabbed last year.

Although Germany is a major economy with some of the world’s strictest social media regulations, Pauli had only somewhat more success with Facebook than his peers in the developing world.

“In the beginning, they did nothing,” he said. “They would say, ‘You have no jurisdiction over us.’ In the last few years, they are more helpful, especially in cases of child abuse.”

But, in other matters, the company remains skittish, Pauli said. “They do have a lot of information, but they don’t want to lose users,” he said.

The prosecutor has grown especially concerned, he said, about social media rumors — say, a stranger near a school — that could spin ordinarily self-contained Germans into violence. Not so unlike in Sri Lanka or India.

“We have lots of situations where somebody saw somebody outside the kindergarten,” he said. “Within five minutes it’s spreading, and from post to post, it gets worse. It takes two hours and then you have some lynch mob on the street.”

Amanda Taub contributed reporting. (The Interpreter)

© New York Times 2018
