Facebook and Twitter aren’t changing their terms of service following violent protests in Charlottesville

But Facebook is deleting hateful posts related to Charlottesville.

(Photo: Organizers of Saturday's alt-right rally in Charlottesville, Virginia, hold a news conference. Chip Somodevilla / Getty)

This weekend’s violent protests in Charlottesville, Va., have not changed how social media companies like Facebook and Twitter will handle violent or racist commentary on their networks.

Facebook and Twitter are not updating their respective user guidelines and safety policies, though other tech companies, including Airbnb and web hosting company GoDaddy, have taken more public stands against white nationalist groups.

Violent clashes between white nationalists and counter-protesters led to the death of Heather Heyer, a young woman demonstrating against the racist groups. Attorney General Jeff Sessions on Monday called the weekend assaults “domestic terrorism.”

Google and GoDaddy banned the web domain of neo-Nazi site The Daily Stormer, and Airbnb dropped accounts of rally attendees. Facebook and Twitter have not changed their policies, but Facebook said it is “actively removing any posts that glorify the horrendous act committed in Charlottesville.” Twitter pointed us to its existing terms and declined to comment further. When asked if it was planning to change its guidelines, Facebook sent a formal statement condemning “hate speech or praise of terrorist acts or hate crimes.”

Both sites already condemn hate groups. Facebook explicitly prohibits “organized hate groups.” Twitter’s rules ban accounts that promote or incite “violence against or directly attack or threaten other people on the basis of race, ethnicity, national origin, sexual orientation, gender, gender identity, religious affiliation, age, disability, or disease.”

It’s unclear if that would include white nationalists or neo-Nazis.

What’s interesting is that Facebook’s executives are condemning white nationalist activities even though Facebook will let such members and groups continue to operate on its platform. Facebook’s VP of ads, Andrew “Boz” Bosworth, retweeted others preaching tolerance and a video about not falling for fascism.

COO Sheryl Sandberg was more explicit: “Every generation has to be vigilant in fighting against the type of bigotry and hatred that was displayed by the white supremacists in Charlottesville,” she wrote Monday. “Along with millions of others, I was so heartbroken this weekend.”

But despite these beliefs, Facebook and Twitter are routinely weaponized and used to help plan or promote these kinds of rallies or protests, and many believe that the companies do not do enough to help squash them.

Update: A Facebook spokesperson confirmed that the company did remove an event page for this weekend’s “Unite the Right” rally, but it had already been up a month, according to Business Insider.

In Europe earlier this year, Twitter failed to meet the E.U.’s standards for removing hate speech on its platform, and a study found that Twitter took down less than 40 percent of what the European Commission deemed to be “hate speech.” (Though the company claims it is doing a much better job than it has in the past.)

In Germany, regulators are considering imposing fines on social media companies that don’t act quickly enough to remove or block hate speech and terrorist propaganda online.

But many users, especially prominent figures in the alt-right movement, have made an art of walking the line between violating the rules and exercising free speech. Facebook and Twitter allow accounts and conversation that they describe as “controversial” so long as it does not promote or incite violence.

That’s why former KKK leader David Duke is allowed on Twitter, where he shared video from the protests and tweeted about “anti White hatred.” It’s also why groups like “White Nationalists United,” with a group objective of “putting the white race first,” are still allowed on Facebook.

(Update: Facebook reached out to clarify that it does not allow white supremacist groups to operate on its platform and has removed the “White Nationalists United” group that we linked to earlier.)

These types of users and groups are not explicitly promoting violence. But they’re also the kinds of groups that other tech companies like GoDaddy and Airbnb have stood up to. Facebook and Twitter have not. Perhaps after the next tragedy.
