

WhatsApp will drastically limit forwarding across the globe to stop the spread of fake news, following violence in India and Myanmar

The social messaging service has been blamed for its role in spreading dangerous and inaccurate information.

Facebook CEO Mark Zuckerberg. Win McNamee / Getty

Facebook-owned WhatsApp is making changes it hopes will drastically curtail users' ability to spread fake news on its messaging service by making it harder to forward messages to large numbers of chats at once.

The social networking giant has come under intense criticism for the role its popular tools have played in amplifying violence in Myanmar and India. The move favors safety over virality, though it may not assuage critics who believe Facebook's services have caused real-world harm.

WhatsApp’s plan, outlined in a blog post Thursday night, is to limit the number of people to whom users can forward messages, theoretically making it harder for fake information to go viral. Globally, users will now be able to forward messages to just 20 people, although that will be limited to only five in India. The previous limit was over 250.
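To see why a lower forwarding cap could blunt virality, it helps to treat the cap as the branching factor of a forwarding chain. The short Python sketch below is purely illustrative and not a description of WhatsApp's actual systems: it assumes, unrealistically, that every recipient re-forwards a message at the maximum cap and that no chats overlap, and it uses the 250, 20, and 5 figures cited above.

```python
# Back-of-the-envelope illustration (not WhatsApp's real mechanics):
# treat each forwarding cap as the branching factor of a cascade and
# compare how many chats a message could reach after a few hops.
# Assumptions: every recipient re-forwards at the cap, no chats overlap.

def max_reach(forward_cap: int, hops: int) -> int:
    """Upper bound on chats reached after `hops` rounds of forwarding."""
    return sum(forward_cap ** h for h in range(1, hops + 1))

for cap in (250, 20, 5):
    print(f"cap={cap:>3}: after 2 hops <= {max_reach(cap, 2):,} chats")
```

Even under those crude assumptions, two rounds of forwarding fall from tens of thousands of potential chats at the old cap to a few hundred at 20 and a few dozen at five, which is the kind of slowdown WhatsApp is betting on.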

WhatsApp is also getting rid of the “quick forward” feature in India, a button next to multimedia messages that made photos and videos even faster to pass along. WhatsApp is calling these changes a “test.”

“We’re horrified by the violence in India, and we’ve announced a number of different product changes to help address these issues,” a company spokesperson told Recode. “It’s a challenge which requires an action by civil society, government and tech companies.”

Limiting the rate at which people can forward messages won’t solve the problem, of course, but WhatsApp hopes it will slow down the viral impact that social networks have become known for.

Facebook’s fake news problem isn’t just causing violence in India. The rapid spread of misinformation online, often over WhatsApp, has also resulted in killings in countries like Myanmar and Sri Lanka.

On Recode Decode this week, Facebook CEO Mark Zuckerberg said that the company would be updating its stance on fake news that perpetuates violence. Instead of simply trying to hide those posts on the service, as it does with other posts promoting false news, he said Facebook plans to take that kind of content down entirely.

When asked several times how he personally felt knowing that Facebook, his creation, was contributing to such violence, Zuckerberg sidestepped the question, saying his focus was on finding a solution.

“I mean, my emotion is feeling a deep sense of responsibility to try to fix the problem,” he said. “That’s the most productive stance.”

Zuckerberg still managed to attract controversy over fake news and hoaxes this week on that same podcast, saying he favored allowing sites like those of Holocaust deniers to remain on Facebook even if they were inaccurate. He drew even more criticism when he suggested that such posts might not be made "intentionally." He later clarified his comments.

This article originally appeared on Recode.net.