Over the past few months, a highly disturbing video showing a man committing a sex act on a young girl has been widely distributed via Facebook Messenger — ostensibly in the name of preventing the spread of child pornography.
That’s right: Thousands of people have voluntarily shared child pornography on Facebook — most of them presumably in an effort to catch the man shown in the video, as it is accompanied by text urging people to share it so he can be identified. They did this apparently heedless of the fact that a) distributing child porn is illegal, and b) sharing child porn in an effort to stop child porn is still spreading child porn.
Local authorities have described the spread of the video as “a nationwide epidemic,” with one Michigan sheriff’s office issuing alerts to remind the public that “even with good intentions,” spreading child porn is bad. BuzzFeed reported Tuesday that Facebook had become aware of the video on February 2 and added it to a photo-recognition service in an attempt to block it from further distribution.
Of course, such efforts won’t prevent this kind of thing from happening again with a different image or video — one spread by Facebook users because it arrives with a message, possibly concern trolling, telling them that sharing it serves the public good.
So how did presumably logical humans get roped into this mess? The video’s spread can likely be attributed to a combination of three different scenarios:
1) Facebook users genuinely wanted to help, so even though the video was illegal, they chose to overlook its content and pass it on anyway. In this scenario, the Facebook users believed the greater good of catching a criminal outweighed the illicit and harmful content.
2) Facebook users who got duped into watching the video decided to dupe other people in kind, so they passed the video along. In this scenario, users behaved in true troll fashion, getting played and then playing the game in turn.
3) Facebook users didn’t look closely enough at the video to realize it contained child porn, but passed it on anyway in case someone else watched it and recognized the perpetrator. In this scenario, the pressure to “signal boost” might have overridden critical thinking.
Taken together, these scenarios reflect the range of obstacles to establishing context and knowing which sources to rely on when dealing with viral content on the internet. Regardless of which scenario carried the day, everyone who spread the video failed to do their due diligence in determining what they were sharing and what the consequences might be.
Oh, and just in case you forgot: These are the same users that Facebook is relying on to responsibly crowdsource the news. But at least the problem of viral child porn on Facebook Messenger seems to be on the wane — this time.