Facebook said it uncovered new efforts by groups linked to Iran and Russia to dupe the company’s users. It’s a reminder that the company is going to be under enormous pressure in the run-up to the 2020 election.
Facebook says it found evidence that three groups connected to Iran and another one tied to Russia’s Internet Research Agency — the group that spearheaded Russia’s attempt to interfere in the 2016 elections — created “networks of accounts to mislead others about who they were and what they were doing.” Facebook says it has booted the groups from Facebook and Instagram for violating its rules against “coordinated inauthentic behavior.”
Facebook is disclosing the attacks at the same time it is announcing a series of moves it says will help “protect the 2020 US elections.” Many of these changes have to do with giving users more information about the posts and ads they see on Facebook and Instagram.
None of the changes, however, address the company’s decision to let politicians make false claims in ads — a much-derided policy that CEO Mark Zuckerberg tried to defend in a speech last week.
Facebook has made disclosures about attacks in the past, including efforts by other Iranian groups, as well as Chinese groups. The four groups Facebook banned today seem to follow the same general playbook: They attempted to pass off fake accounts as real users and to push stories and ideas meant either to advance certain ideologies or simply to sow division and distrust of the political process.
In this case, Facebook is presenting the crackdown on these actors as sort-of good news, because it says the attackers didn’t get very far before the company detected them and kicked them off. In the case of the Russian group, Facebook’s security team says that new systems they’ve put in place made the Russians work very hard for little effect.
All of this seems a little reminiscent of press conferences where police show off guns and drugs they’ve netted in a crackdown, to show that their efforts are working. It’s also reasonable to ask whether Facebook would rather have people discussing its efforts to fight inauthentic behavior than other Facebook stories, like the continuing debate about its just-about-anything-goes-if-you’re-a-politician stance.
If you’re a regular Facebook or Instagram user, none of this is likely to show up in your day-to-day use. One possible exception: Facebook is going to make it much clearer when one of your friends or someone you follow is sharing a fake story. Rather than labeling the item as “disputed” or some other tentative euphemism, as it has in the past, Facebook is now going to call fake news fake — or “False Information,” to be specific.
And Facebook is going to give Instagram users the same pop-up warning it already gives Facebook users who try to post bogus items.
If you’re someone who does want to spend time figuring out what you’re consuming on Facebook, the company says it will help you do that, too, by adding more information to its political ad tracker and providing more granular information about who owns individual pages.
It would have been very nice to have this stuff back in 2016, but it’s good to have it now. What we don’t know is whether any of this will help Facebook users who are susceptible to the influence of bogus news to begin with.