Facebook suspended Infowars founder Alex Jones on Thursday for uploading a number of videos that violated the platform’s content policies.
Facebook did not, however, suspend the actual Infowars Page, or the official Alex Jones Page, or any other Pages where those inappropriate videos were posted.
So how does that work? Why was Jones — a conspiracy theorist who continues to maintain that the Sandy Hook shooting tragedy was a hoax — suspended for a month, but his Pages, where his conspiracy theories are posted, allowed to remain?
It’s a good question, and likely one that we’ll be chewing over in frustration for some time, given the role social platforms like Facebook and Twitter are now playing in policing the internet’s content. In Facebook’s case, understanding the Jones/Infowars issue means understanding the distinction between a Facebook user profile and a Facebook Page.
Alex Jones’s personal user profile is an admin for a number of Infowars-related Pages, which means he has permission to post or share videos to those Pages. Each time Jones shares a post that violates Facebook’s policies to one of those Pages, both Jones’s user profile and the Page receive some kind of “strike” against their record — essentially, a warning from Facebook to take the post down and cut it out.
The reason Jones was suspended while his Pages remain up is that Jones posted the same bad content to multiple Pages, drawing multiple strikes against his record. So if Jones shared three bad videos to three different Pages, for example, he would receive nine total strikes, whereas each Page would receive just three.
Make sense? It’s Facebook’s attempt to punish individual bad actors sharing the content, but not necessarily the Page that’s hosting it.
That process gets murkier, though, at least from the outside. Facebook doesn’t share how many “strikes” a user or Page has to get before it’s suspended, for example. Facebook is afraid that sharing that number would help bad actors game the system. (It would.)
Facebook also doesn’t share how long strikes exist on a user’s record. Strikes are not permanent, which means they’re eventually wiped clean. But again, Facebook is afraid that sharing that timeframe would help people game the system. (Again, it would.)
There’s also this: Not all strikes are created equal. If you post child exploitative content, for example, just one strike means your account is banned for good. In Jones’s case, some of the videos that were removed violated Facebook’s rules around bullying, and included attacks on people based on their religion or gender identity, a Facebook spokesperson said. In that instance, multiple strikes against Jones led to a temporary suspension.
Facebook’s policies are nuanced because policing the internet is nuanced. But the policies have also been confusing, and for many people, frustrating, especially in a world where misinformation can spread so quickly. It’s one of the reasons Facebook recently published the rules in their entirety.
Infowars in particular has created a lot of drama for Facebook lately. When company execs were asked why Infowars, which has a long history of spreading false information, is allowed on the site given Facebook’s pledge to stop the spread of misinformation, execs described it as a “free speech” issue. When Recode’s Kara Swisher asked CEO Mark Zuckerberg to elaborate on a recent episode of her podcast, Zuckerberg said he didn’t believe fake news should actually be removed from Facebook unless it incites violence. Even deeply offensive stuff like Holocaust denials should remain, he added.
“There are things that different people get wrong — I don’t think that they’re intentionally getting it wrong,” Zuckerberg said. “It’s hard to impugn intent and to understand the intent.” (Swisher pushed back on his use of the word “intentionally,” noting that Holocaust deniers do intend to get it wrong; Zuckerberg’s Holocaust reference got him in even more trouble.)
All of this drama has been as polarizing online as you would have expected. “Some very fine pages on both sides,” quipped the New York Times’ Kevin Roose on Twitter, mocking Facebook’s defense of Infowars a few weeks back. The Guardian’s Julia Carrie Wong summed up Facebook’s quandary with a good tweet: “The fun thing about alex jones and infowars on facebook is that if he just displayed a female nipple fb would shut him down in a snap,” she wrote, “but since he’s doing something much more insidious and hurtful their hands are tied by their ‘principles.’”
Alex Jones, meanwhile, says that he’s the victim of a broader media conspiracy to “de-platform” Infowars and other conservative voices. “It’s not about Alex Jones. It’s about the midterms and beyond,” Jones said in a Facebook livestream on Friday. (Yes, even though Jones is suspended from Facebook, he can still appear live on videos posted by one of his Pages.) “They’re getting ready to launch civil unrest nationwide.” That argument gained steam after YouTube also punished Jones and Infowars last week.
Everything else aside, all of these issues stem from the same challenge: deciding what should be allowed, and what shouldn’t be allowed, on services like Facebook and Twitter. Everyone can agree on some elements of that — child pornography, for example, is an obvious no-no — but what about so-called fake news? What if it’s shared innocently? What if it’s satirical?
Facebook is at the center of this problem. Given Alex Jones and Infowars are also now involved, you can bet it won’t go away anytime soon.
This article originally appeared on Recode.net.