WhatsApp Is Trying To Clamp Down On Viral Misinformation With A Messaging Limit

In an attempt to combat the viral spread of false information, WhatsApp is limiting message forwarding to five chats at a time.

Over the past two years, at least two dozen people in India have been killed by violent mobs incited by rumors spread on WhatsApp, the global messaging service with 1.5 billion users. The Facebook-owned app has also played host to disinformation campaigns in several other countries, including Brazil, Pakistan and Mexico, a recent report found. And unlike disruptive campaigns on social media platforms such as Facebook and Twitter, messages on WhatsApp are private and encrypted, traveling from person to person and to larger groups, with fewer ways for the company or outside experts to see where inflammatory messages originate.

Now, in an attempt to combat that viral spread of false information, WhatsApp is limiting message forwarding to five chats at a time. The new global cap comes after the company tested restrictions in July, limiting users in India to five forwards per message and all other users to 20. WhatsApp says users in India forward more messages, photos and videos than users in any other country, and the earlier limit there followed a surge in mob violence fueled by rumors on the app.

In an updated blog post Monday, WhatsApp said it evaluated the test restrictions over the past six months and found that the cap “significantly reduced forwarded messages around the world.”

“Starting today, all users on the latest versions of WhatsApp can now forward to only five chats at once, which will help keep WhatsApp focused on private messaging with close contacts,” the company said. “We’ll continue to listen to user feedback about their experience, and over time, look for new ways of addressing viral content.”

The change in policy is the latest effort by tech giants to curb the spread of misinformation. But it also highlights the challenges particular to WhatsApp, whose messaging system is designed to be obscured from public view.

Last week, Facebook announced it had removed more than 300 pages masquerading as independent news sites or information hubs when, in fact, they were part of an online network allowing Russian state-owned media to reach users in secret. Such “coordinated inauthentic behavior” is prohibited on the site, but Facebook still faces renewed efforts by bad actors to manipulate its platform.

Facebook has grappled with the aftermath of Russian operatives’ targeting of the 2016 U.S. presidential election largely by removing networks of inauthentic accounts. WhatsApp’s misinformation problem, by contrast, is tied more to how rapidly users share false information with one another.

Through group chats, which can include as many as 256 people, WhatsApp users can share misleading stories or memes instantly, and those messages can then be forwarded to still more users without any check on the spread of texts or images. WhatsApp’s security staff can read messages when a user specifically reports problematic content, but the closed nature of the app makes it harder to stem the flow of misinformation or to study how viral messages spread.

WhatsApp said it believes that restricting message forwarding will help keep the app closer to its intended design as a private messaging service. But, as other prominent cases of misinformation have shown, how a company views its own messaging platform and how people wish to use it may not always align.

(c) 2019, The Washington Post
