
Facebook’s tremendous size was its greatest asset. Now it may be its biggest problem.

The viral video of a shooting in New Zealand offered a grim reminder of tech companies’ vast reach.

A tribute to those killed in Christchurch, New Zealand.
Carl Court/Getty Images

Facebook COO Sheryl Sandberg has a phrase she likes to use on company earnings calls as a way to pitch Facebook’s business: Advertisers, Sandberg says, can reach a Super Bowl-size audience any day of the year.

For years, Facebook’s size and scale have been considered a real positive. Having more than 2 billion monthly users is great for business. Adding tens of millions of people to your app every quarter is a tremendous story to tell investors, advertisers, and media companies.

But when a gunman opened fire at a New Zealand mosque late last week, broadcasting video of the shooting live on Facebook for anyone to see, the platform’s enormous size became a liability. Suddenly, that Super Bowl-size audience had access to something Facebook didn’t want it to see, and the company couldn’t keep up with the more than 1 million copies of the video that users uploaded over the next 24 hours.

The New Zealand shooting, which left at least 50 people dead, served as a horrendous reminder that the scale of Facebook, and of YouTube and Twitter, is a serious problem. Facebook said its technology was able to detect and block 80 percent of the videos people uploaded of the shooting.

The problem is the other 20 percent: the videos that slipped through the net totaled some 300,000. In just 24 hours, Facebook and Instagram users tried to upload video of the shooting 1.5 million times.

At YouTube, the situation wasn’t any better. In a statement, a Google spokesperson said the video uploads were “unprecedented both in scale and speed, at times as fast as a new upload every second.” YouTube’s head of product told the Washington Post, “Every time a tragedy like this happens we learn something new, and in this case it was the unprecedented volume [of uploads].”

While the original video reached only around 4,000 people, according to Facebook, it was online long enough for copies to start spreading to other corners of the internet, like the imageboard 8chan. From there, people started uploading versions of the shooting video back to Facebook and YouTube, and the tech platforms simply couldn’t keep up with the speed or volume.

Facebook has technology that can match video content against an original for quicker automatic removal. But the technology isn’t foolproof, as we learned this past week. Facebook had trouble removing all the videos because people uploaded altered versions of the original, such as recordings of the video playing on another screen, or copies with watermarks, that didn’t match closely enough.
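To see why near-duplicates slip past automated matching, consider a toy version of perceptual hashing, the broad family of techniques that content-matching systems are generally built on. This is an illustrative sketch under that assumption, not Facebook’s actual system; the 8x8 “average hash” below is a deliberately simple stand-in.

    # Toy perceptual "average hash" for a video frame, illustrating why
    # re-recorded or watermarked copies can defeat matching. This is a
    # sketch of the general technique, not Facebook's actual system.

    def average_hash(frame, size=8):
        """Downscale a grayscale frame (2D list of 0-255 ints) to a
        size x size grid by block averaging, then emit one bit per cell:
        1 if the cell is brighter than the frame's mean, else 0."""
        h, w = len(frame), len(frame[0])
        bh, bw = h // size, w // size
        cells = []
        for r in range(size):
            for c in range(size):
                block = [frame[y][x]
                         for y in range(r * bh, (r + 1) * bh)
                         for x in range(c * bw, (c + 1) * bw)]
                cells.append(sum(block) / len(block))
        mean = sum(cells) / len(cells)
        return [1 if v > mean else 0 for v in cells]

    def hamming(a, b):
        """Count differing bits; a small distance means 'probably the same'."""
        return sum(x != y for x, y in zip(a, b))

    # A 64x64 synthetic "frame" with a simple diagonal gradient.
    original = [[(x + y) % 256 for x in range(64)] for y in range(64)]

    # A re-encoded copy with a slight brightness shift still hashes close...
    recompressed = [[min(255, p + 4) for p in row] for row in original]

    # ...but a shifted copy (a crude stand-in for re-recording off a screen)
    # changes many cells at once and moves the hash much further away.
    rerecorded = [row[8:] + row[:8] for row in original]

    h0 = average_hash(original)
    print(hamming(h0, average_hash(recompressed)))  # 0: caught by matching
    print(hamming(h0, average_hash(rerecorded)))    # 14: may slip through

A uniform change like re-encoding barely moves the hash, so the copy is caught. But re-recording off a screen, cropping, or watermarking alters many cells at once, and the distance can drift past whatever threshold the matcher uses.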

In YouTube’s case, the company was so overwhelmed with uploads that it eliminated human review of any algorithm-flagged videos related to the shooting, knowing it might take down legitimate or unrelated videos by mistake. The rules intended to provide a more thoughtful review process had to be tossed out the window.

For the past year, we’ve been talking about Facebook’s and Google’s tremendous size for other reasons: There are some, like Sen. Elizabeth Warren, who believe these companies are too dominant and should be broken up.

But their scale is not just a business problem; it can be a societal problem, too. Facebook and Google are not responsible for what the New Zealand gunman did last Friday, but it’s also not fair to ignore the service they provide when used by bad actors: a free distribution mechanism for hatred and terror.

Unfortunately, this feels like a problem without a solution. Facebook and YouTube and Twitter clearly don’t yet have the technology to instantly clear their services of bad or troubling content. Even if they did, there is no way to stop content altogether without a system that vets posts before they go up — an idea that has been floated in India but is not likely to catch on here in the United States.

Instead, Facebook and YouTube and Twitter are necessarily reactive. And yes, they’ll learn to react quicker and better as time goes on and technologies improve, but it will always be a reaction.

The problem may soon get even tougher. Facebook CEO Mark Zuckerberg recently unveiled Facebook’s plan to shift toward private, encrypted messaging. If more content is shared privately instead of in a public, algorithm-fueled feed, it’s possible that videos like the one from New Zealand won’t find the kind of oxygen they need to go viral on Facebook in the future.

But encrypting content also means it will be harder for Facebook to find and remove videos like the one from the New Zealand shooter with any level of success. Someday, videos like these may not appear in your Facebook feed — they may appear in your private messaging inbox instead.

That’s not exactly the Super Bowl any of us had in mind.

This article originally appeared on Recode.net.