WhatsApp has a plan to help fight its dangerous fake news problem: It’s making the service less viral.
More specifically, WhatsApp is changing the feature that lets users “forward” a message to others on the app. It used to be that WhatsApp users could forward a message to up to 256 different conversations at once, and each of those conversations could have as many as 256 people in them. That meant that, theoretically, someone could forward a false news article or message to tens of thousands of people at one time.
On Monday, WhatsApp announced that people can now only forward a message to five different chat conversations at a time, cutting down the potential audience for each individual user in the chain.
It’s a simple idea: If the product isn’t built to share things broadly, it’ll be harder for false news to go viral. WhatsApp says that since July, when it started testing a 20-conversation limit, there has been “a 25 percent reduction of forwarded messages” on the service.
By our calculations, the new limit means that one person could now theoretically reach 1,280 accounts with a single forward (five chats of up to 256 people each), down from more than 65,000 under the original 256-conversation limit.
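For readers who want to check the math, here is a minimal sketch of that upper-bound calculation. The 256-member group cap and the forwarding limits come from the article; the function itself is just an illustration and assumes every chat is at the cap.

```python
# Upper bound on accounts one user can reach with a single forward,
# assuming every destination chat is a group at the 256-member cap.
def max_reach(forward_limit: int, max_group_size: int = 256) -> int:
    return forward_limit * max_group_size

print(max_reach(5))    # new limit:   5 * 256 = 1,280
print(max_reach(256))  # old limit: 256 * 256 = 65,536
```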
Fewer forwarded messages doesn’t actually mean Facebook-owned WhatsApp has solved its misinformation problem, which is so serious that people have died as a result. Part of WhatsApp’s problem is that all its messages are encrypted, making it harder for the company to find misinformation and remove it. (Facebook is considering encrypting all of its messaging services, according to the New York Times.)
But WhatsApp’s attack on virality is an intriguing idea that seems as though it could be applied to other networks suffering from misinformation problems.
Could Facebook and Twitter also fix their issues by killing virality?
The obvious answer is yes. Facebook has a share button, Twitter has a retweet button, and both services could dramatically curb the spread of misinformation by getting rid of those features or limiting their power. Imagine if news, true or not, couldn’t spread with a single click of a button.
But it’s not very likely that either will choose that route. The reality is that while WhatsApp is suffering from the same problems as Facebook and Twitter, their products and business models are completely different.
WhatsApp is a private messaging service with no business model. Trying to limit virality doesn’t dramatically impact the user experience — 90 percent of WhatsApp messages sent are between just two people — and won’t hurt WhatsApp’s (nonexistent) business.
But Facebook and Twitter are built for virality. The products can take a single post and reach an audience much larger than the group of people who follow the post’s creator. It’s one of the main reasons media companies and advertisers spend so much time and money sharing on these services. The ability to blast info to a large audience is part of what drives Facebook’s and Twitter’s value to advertisers and brands that want to build a relationship with their customers.
Facebook has already started to use its algorithm to cut down on the virality of misinformation. When accounts share stories that have been disputed by the company’s fact-checkers, Facebook can use its algorithm to limit that account’s reach. It has been using the algorithm to push down clickbait and spam for years. YouTube just announced Friday that it is also tweaking its algorithms to stop recommending videos that promote false news or conspiracy theories.
But reactively cutting the distribution of a post through an algorithm is different from trying to reduce the service’s viral capabilities. Algorithms can help clean up a mess. Reducing these products’ viral potential could stop the mess before it ever happens. It could also hurt Facebook’s and Twitter’s business models and cut into a core reason for why people use those services.
Which is all to say that WhatsApp’s plan to hurt virality is an interesting solution to a serious problem. It just might not be a universal one.
This article originally appeared on Recode.net.