Are you biased about politics? Do you ignore evidence that's inconvenient for your viewpoint, and get overly excited at evidence that supports it?
You may not think so — most of us like to think we hold reasonable, well-grounded beliefs. But there’s mounting evidence from political psychology that most people struggle to think clearly and objectively when it comes to political issues. You may not think that applies to you, but it’s surprisingly difficult to spot one's own biases. In fact, there’s even a name for the bias of seeing everyone else’s biases while failing to acknowledge one’s own — the "bias blind spot."
To help combat this blind spot, Stefan Schubert, a researcher at the London School of Economics, has developed a test in collaboration with ClearerThinking.org that lets you see how biased you really are. You can take it below, or at ClearerThinking.org's website.
The test asks you a number of questions about your opinions and knowledge on various political issues, and uses these answers to give you a score of how politically biased you are in different areas. In the introduction to the test, Schubert explains: "The first step in reducing a bias is noticing that it’s there. We produced this test to help people spot, and ultimately overcome, any blind spots they might have on political issues, by giving them a more objective measure of the ways in which they might be biased."
How does this bias test work?
When thinking about political questions, two different types of issues are relevant. We first need to consider empirical questions about the real-world consequences of a given policy — does the death penalty tend to reduce homicide rates, for example?
But merely knowing the answer to these empirical questions isn’t enough. We also have to make value judgments, judgments about the way we want the world to be. I might believe, for example, that the death penalty does reduce homicide rates but still be against the death penalty overall, because I believe it’s wrong for the state to kill citizens regardless of the consequences. Or I might believe the death penalty doesn’t reduce homicide rates but that it should still be used, because the most horrific crimes always deserve to be punished by death.
In theory, the value judgments that each of us holds shouldn't affect what we believe about empirical issues. If you believe it’s always wrong to restrict an individual’s freedom, that belief is entirely independent of whether gun control does in fact reduce crime rates. But in practice, it’s difficult for most of us to keep these two things separate. Our ideologies and values tend to influence what we believe about purely empirical issues. If I believe it’s wrong to kill criminals, then I’m more likely to think the death penalty is ineffective as a deterrent, for example.
So Schubert’s test estimates your degree of bias by testing your empirical knowledge of political issues, and seeing how often your views on purely factual matters align with or contradict your values. People who are biased are more likely to answer correctly on questions where the correct answer supports their ideology, and incorrectly when the correct answer conflicts with that ideology. By contrast, someone who is not politically biased is likely to be correct about as often on facts that support their viewpoint as on facts that conflict with it.
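To make that logic concrete, here is a minimal sketch of how a score along these lines could be computed. This is purely illustrative and is not ClearerThinking.org's actual scoring method: the question format, the "favors" labels, and the bias_score function are all hypothetical.

```python
# Illustrative sketch only -- not the real test's scoring.
# Assumes each question is labeled with which ideology its correct answer happens to favor.

def bias_score(answers, questions, ideology):
    """Compare accuracy on ideology-congruent vs. ideology-incongruent questions.

    answers:   dict mapping question id -> respondent's answer
    questions: list of dicts with 'id', 'correct', and 'favors' keys (hypothetical format)
    ideology:  the respondent's self-reported ideology, e.g. 'left' or 'right'
    """
    congruent_hits = congruent_total = 0
    incongruent_hits = incongruent_total = 0

    for q in questions:
        correct = answers.get(q["id"]) == q["correct"]
        if q["favors"] == ideology:      # the correct answer supports their side
            congruent_total += 1
            congruent_hits += correct
        else:                            # the correct answer cuts against their side
            incongruent_total += 1
            incongruent_hits += correct

    congruent_acc = congruent_hits / max(congruent_total, 1)
    incongruent_acc = incongruent_hits / max(incongruent_total, 1)

    # An unbiased respondent should score about the same on both sets;
    # a large positive gap suggests ideology is shaping factual beliefs.
    return congruent_acc - incongruent_acc


# Toy example: one question favoring each side, answered in a left-congruent way.
questions = [
    {"id": 1, "correct": "yes", "favors": "left"},
    {"id": 2, "correct": "no", "favors": "right"},
]
answers = {1: "yes", 2: "yes"}
print(bias_score(answers, questions, ideology="left"))  # 1.0: right on the congruent fact, wrong on the other
```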
How good is this as an approach to measuring bias?
This test is based on the assumption that the best explanation for a correlation between someone’s factual answers and their political views is that their political views have tainted their evaluation of those factual issues. But in principle, it could be that they acquired their factual beliefs independently of any political ideology — and later chose the ideology that best fit their views.
One reason to be skeptical of this alternative explanation comes from a recent study that found test takers' factual answers and political views were less aligned when people were paid for getting correct answers. If your factual beliefs were acquired independently of your political views, then it’s not clear why you would change those beliefs simply because you were getting paid. But this is precisely the behavior we would expect from someone whose original answers were influenced by their values or ideology — the payment makes people think about the questions more thoroughly, rather than just going with their political gut feeling.
In any case, Dr. Schubert argues that political bias, in one form or another, is part of both of these explanations.
Why are so many people biased about politics?
At the end of the test, Schubert also provides an explanation of why political bias occurs. A number of well-known psychological mechanisms help account for it, including confirmation bias — the tendency to favor evidence and arguments that support our existing viewpoints — and wishful thinking, the bias toward believing what we would like to be true.
But the question remains of why we are particularly prone to confirmation bias and wishful thinking around political issues. I think there are three key factors.
1) Political issues are incredibly complex
Political issues are incredibly complicated — even when we only focus on factual questions about how policies affect the world, before we consider the fact that people have different values. Without a great deal of research and evidence, it’s very difficult for most people to begin to evaluate the question, "What effect does raising the minimum wage have on unemployment rates?" Even with a great deal of research and evidence, it’s still hard to answer this question, because the evidence is messy and conflicting. This meta-analysis of 64 studies finds no effect of the minimum wage on unemployment after correcting for publication bias, while this systematic review of roughly 100 studies finds strong and consistent evidence that raising the minimum wage increases unemployment.
Given this, it seems impossible for the average person to have any kind of informed opinion on the minimum wage without spending a huge amount of time trawling through the relevant evidence. And yet many people do have opinions on how the minimum wage affects unemployment rates. How is this possible? When faced with an impossibly difficult question, we tend to substitute an easier one in its place — in this case, perhaps, "What do people with similar values to me believe?" or, "What does the party I support believe about this issue?" These kinds of shortcuts for answering political questions can be useful and save time, but they can also lead to systematic biases, since we’re not actually answering quite the question we think we are.
2) Politics is tied up in our social groups and identities
Why do we feel the need to have an opinion on the minimum wage at all — why not simply say, "I don’t know"? Some people do say this, of course. But because politics is inherently social, it tends to become part of people’s identities. People often organize themselves into social groups based on political ideologies, and define themselves partly in terms of those ideologies.
This provides people with a strong incentive to maintain and defend those ideologies. Doing so reinforces their sense of identity or group membership. There seems to be a deep-rooted human drive to form and maintain a clear self-identity, as well as to feel like we belong to a certain group. When factual issues come up that might potentially threaten these ideologies, then, we have a lot of motivation to take the side that fits with our identities and social groups.
3) We have very little incentive to form accurate political beliefs
While we have strong social incentives to defend our groups' political beliefs, we have very little incentive to form accurate political beliefs. Most people acknowledge that political issues are incredibly important for society as a whole, but as individuals we’re not necessarily rewarded for how rational and truth-seeking we are about politics.
Consider the contrast between our beliefs about politics and our beliefs about the physical environment immediately around us. If I believe that the pavement ahead of me is clear and I’m wrong — there’s actually a lamppost in my path, say — I’m going to quickly suffer as a result of my false belief. This gives me a clear incentive to form accurate beliefs about the physical environment immediately around me.
The same logic doesn’t apply in politics. If I falsely believe that immigration to my country is harmful, that belief doesn’t harm me directly. Even if I vote in an election based on that belief, the chances of my vote actually affecting the outcome of the election are so slim that a poorly chosen vote is unlikely to hurt me personally.
Why is political bias such an important issue?
Political bias is, as philosopher Michael Huemer puts it, "a problem that prevents us from solving other problems." In order to solve some of the biggest world problems — global poverty, crime, climate change — we need to be able to figure out the truth on a number of complex empirical issues, like how gun control affects crime rates, or which developing-world interventions actually make people’s lives better. We also need to be able to communicate effectively with one another, especially people with whom we disagree, to share knowledge and cooperate on projects. The blinkered way we think about politics seriously threatens our ability to do both of these things, leading to closed-mindedness, polarized views, and conflict.
There are two ways we might try to tackle the problem of political irrationality. We might take a "top down" approach, designing better institutions that provide people with incentives to be truth-seeking about politics and make it easier to disseminate important political information. Or we might take a "bottom up" approach: Each of us, as individuals, could try to find ways to combat our own biases and be more open-minded. Ultimately, progress on this issue will almost certainly require a bit of both. In the long run, better institutions might provide a more lasting solution, but in the meantime it’s on us as individuals to take responsibility and try harder to ground our political views in evidence.
The first step is simply acknowledging the possibility that you might not be as objective as you’d like to think.