In the days after the shocking yet not entirely surprising election of Donald Trump to the office of the American presidency, we’re performing our standard triage and diagnostic routine. One of the running themes is that Facebook and the #FakeNewsSites problem are to blame for false knowledge and therefore Trump’s election win. This is a specious argument.
Much of the coverage and outrage has been directed at social media and its echo chambers, and specifically at the Facebook platform. While, to be sure, much of the fake or inaccurate news is found and circulated on Facebook, Facebook is not a news outlet; it is a communication medium to be used as its users choose. It is not the job of Facebook’s employees, or its algorithms, to edit or censor the content that is shared; in fact, doing so would be detrimental, for two very good reasons:
One, any editor, human or artificial intelligence, will appear to introduce bias into the system by removing one item or another. The group whose content is removed or edited will feel targeted by the platform and claim, rightly or wrongly, that it is biased against their cause, even if the content has been vetted and judged true or false.
Two, censorship in any form is bad for the national discourse.
So rather than blaming Facebook or other platforms for the trouble in which we find ourselves, let’s give credit where credit is due: The American people.
This comes down to two very important concepts that our society has been turning its back on, in the age of social media: Confirmation bias and epistemology.
As explained by David McRaney, the You Are Not So Smart blogger and author of “You Are Now Less Dumb: How to Conquer Mob Mentality, How to Buy Happiness, and All the Other Ways to Outsmart Yourself,” confirmation bias is the misconception that “your opinions are the result of years of rational, objective analysis,” when the truth is that “your opinions are the result of years of paying attention to information which confirmed what you believed while ignoring information which challenged your preconceived notions.” Or, more precisely: the tendency to process information by looking for, or interpreting, information that is consistent with one’s existing beliefs.
If we find a piece of content that says that Donald Trump is clueless, or that Hillary Clinton belongs in prison, we accept the one that reinforces our preference for one candidate over the other, and discard the negative item as some falsehood generated by the opposing party to discredit our candidate. We don’t care about the information or what it says, as long as it reinforces how we feel.
That brings us to epistemology, “the study or a theory of the nature and grounds of knowledge especially with reference to its limits and validity,” a branch of philosophy aptly named from the Greek, meaning “knowledge discourse.” The term dates to the mid-19th century, and it has very likely been conveniently ignored in political campaigns ever since, perhaps because it’s just easier to believe and propagate than it is to read and validate.
In fact, a recent Pew Research Center survey called the American Trends Panel asked whether the public prefers that the news media present facts without interpretation. The answer was overwhelming: 59 percent of those asked preferred facts without interpretation, and among registered voters, 50 percent of Clinton supporters and 71 percent of Trump supporters preferred no interpretation. While those numbers may seem incredible, the telling result is that 81 percent of registered voters disagree on what the facts actually are. Aren’t facts just facts? Yes, they are, but our biases and distrust of intellectual sources say otherwise.
Does Facebook create echo chambers on both sides of the political spectrum? No. Facebook and other social media serve only as a high-speed amplifier of what already exists in our society, especially for those who enjoy the communal effect of sharing information with others in their personal circles. Facebook does give them a wide and instant audience.
In a 2012 study published in the journal Computers in Human Behavior, computer scientists Chei Sian Lee and Long Ma wrote, “… we also establish that status seeking has a significant influence on prior content sharing experience indicating that the experiential factor may be a possible mediator between gratifications and news sharing intention.”
Or, in other words, it’s fun to share something and get congratulatory high-fives from your like-minded friends, and Facebook makes that activity almost instantaneous. Sharing news, or fake news, and being liked for doing so feels good. Never mind the ramifications for the accuracy of cultural or political discourse.
During his final press conference in Berlin with Angela Merkel, President Obama put this as succinctly as it could possibly be said: “If we are not serious about facts, and what’s true and what’s not ... if we can’t discriminate between serious arguments and propaganda, then we have problems.”
This article originally appeared on Recode.net.