
Cognitive science suggests Trump makes us more accepting of the morally outrageous

In December, marchers in Washington, DC, protested Donald Trump's idea for a registry of Muslim citizens — one of numerous proposals by the president-elect that once seemed unthinkable.
Larry French / Getty

In October 2015, a college student named Lauren Batchelder stood up at a campaign forum in Manchester, New Hampshire, and challenged Donald Trump's record on women's issues. “Maybe I’m wrong,” she said, “maybe you can prove me wrong, but I don’t think you’re a friend to women.”

So far, this might sound like a perfectly ordinary case of campaign confrontation, but things soon took a more surprising turn. Trump attacked Batchelder personally on Twitter, describing her as "nasty" and an "arrogant young woman," and even suggesting she was secretly working for Jeb Bush's campaign. (She wasn’t.) Soon, Batchelder was receiving online rape threats from Trump supporters. Five days before the election, she got a Facebook message from someone threatening to "stomp your head on the curb and urinate in your bloodied mouth.”

Incidents like this one have provoked a distinctive form of condemnation. Commentators have not simply described Trump's behavior as bad or wrong; they have insisted it is not normal. More strikingly, they have suggested it is deeply important that such behavior not be “normalized.” The suggestion seems to be that there is something of value, something worth fighting to preserve, in our shared understanding that certain conduct is not a normal part of the way American politics works.

But what does that mean, exactly? Why is it so very important to hold on to our understanding that these sorts of things are abnormal? What would be lost if we stopped seeing them as abnormal and started seeing them, instead, as simply “bad” or “wrong”?

To get a better understanding of these issues, we can turn to research in cognitive science. Recent studies have taught us a lot about what happens when people classify events as normal or abnormal. The findings have real potential to help us understand what would actually occur if behavior like Trump's were to be gradually normalized.

Our minds use the normal-abnormal distinction to rule out many options in advance

At the core of this research is a very simple idea: When people are reasoning, they tend to think only about a relatively narrow range of possibilities. You are sitting there in a restaurant, trying to decide what to order. Almost immediately, you determine that you are going to get either the chocolate cake or the cheese plate. You then start to consider the merits and drawbacks of each option. "Should I get the chocolate cake? Nah, too many carbs. Better get the cheese plate." One important question about human cognition is how people end up choosing one option over the other in a case like this.

But there is another question here that is even more fundamental — so fundamental that it’s easy to overlook. How did you pick out those two options in the first place? After all, there’s an enormous range of other options that would, at least in principle, have been possible. You could have stormed into the kitchen and started eating directly out of the chef's saucepan. You could have reached under the table and started trying to eat your own shoe. Yet somehow you manage to reject all of these possibilities before the reasoning process even begins. It’s not as though you think, "Should I try to eat my shoe? No, it’s not very tasty, or even edible." Rather, possibilities like this one never even enter your reasoning at all.

This is where the notion of normality plays its most essential role. Of all the zillions of things that might be possible in principle, your mind is able to zero in on just a few specific possibilities, completely ignoring all the others. One aim of recent research has been to figure out how people do this. Though the research itself has been quite complex, the key conclusion is surprisingly straightforward: People show an impressive systematic tendency to completely ignore the possibilities they see as abnormal.

We make use of the normal-abnormal distinction when thinking about causality

Although researchers have developed numerous methods for studying the way people think about possibilities, one of the most popular is just to look in detail at the way they use the word “cause.” This sort of research might at first seem a bit far removed from any question of real importance, but it actually serves as a valuable indirect method for figuring out which possibilities people consider and which they ignore.

People consider the range of causes for an occurrence much as they weigh the dining options at a restaurant: with a built-in selectivity. For example, suppose a person lights a match and carelessly drops it on the ground, and a forest fire begins. Now take the sentence, “The lit match caused the forest fire.” If you are like most people, you will think this sentence sounds about right (even if it doesn’t tell the whole story).

Of course, the sentence does not itself mention any alternative possibilities, but most researchers think it gives us an important clue about which possibilities people are considering. In particular, it suggests that people are thinking something along the lines of: If the person hadn't carelessly dropped the lit match, the forest fire would not have occurred.

Okay, now consider a different case. Start with the same story. Person drops match, forest fire starts. But this time, take a different sentence: “The presence of oxygen in the atmosphere caused the fire.” If you are like most people, you will think this sentence sounds very wrong. But why? On some level, the two cases are completely parallel. After all, you would surely agree, on reflection, that if there hadn't been any oxygen in the atmosphere, the forest fire would not have occurred.

Yet there is a deeper respect in which this second case is completely different. The difference is that most people would never consider possibilities in which there is literally no oxygen in the atmosphere. These possibilities seem outlandish, preposterous, not even worth entertaining.

In short, just by looking at the fact that people agree with certain sentences, we can get valuable clues about which possibilities they are considering. There have now been a whole slew of different studies using this method, and although some controversy remains about how to interpret the results, the overall pattern seems to be pointing toward an important and very general conclusion: People appear to have a systematic tendency to focus on the possibilities they see as normal and to ignore the ones they see as abnormal.

So, for example, a situation in which there is no oxygen in the atmosphere is seen as “abnormal,” and for the sake of cognitive efficiency, the mind gives it zero consideration. (My co-authors, Thomas Icard and Jonathan Kominsky, and I review some of the experimental evidence for this claim in a recent technical paper.)
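
To make this concrete, here is a toy simulation in Python (my own rough sketch with made-up probabilities, not the model from the paper): it samples counterfactual scenarios, imagining away abnormal factors far more readily than normal ones, and then checks which factor is absent in the scenarios where the fire fails to occur.

```python
import random

# Toy normality-weighted counterfactual sampler (illustrative numbers only).
# Actual world: someone drops a lit match, oxygen is present, a fire starts.
# Abnormal factors are easy to imagine otherwise; normal background
# conditions are almost never imagined away.
FLIP_PROB = {
    "match_dropped": 0.5,     # abnormal event
    "oxygen_present": 0.001,  # normal background condition
}

def fire(world):
    """The fire occurs only when both factors hold."""
    return world["match_dropped"] and world["oxygen_present"]

def causal_votes(n_samples=100_000):
    """Among sampled fire-free counterfactuals, count which factor was flipped."""
    votes = {name: 0 for name in FLIP_PROB}
    for _ in range(n_samples):
        # Each factor keeps its actual value (True) unless it gets flipped.
        world = {name: random.random() > p for name, p in FLIP_PROB.items()}
        if not fire(world):
            for name, present in world.items():
                if not present:  # this factor was imagined absent
                    votes[name] += 1
    return votes

random.seed(0)
print(causal_votes())
# Typical result: the match is absent in nearly all fire-free samples,
# the oxygen in almost none.
```

On this picture, that asymmetry in which scenarios ever get considered is what makes “the match caused the fire” sound right and “the oxygen caused the fire” sound wrong.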

Once an option is recategorized as “normal,” people are more likely to choose it

There is a corollary to this finding: If people’s ideas about what is normal and abnormal change, that can cause changes in the possibilities they consider — and even the actions they take.

For an especially striking example, consider a real-world problem that arose in Arizona’s Petrified Forest National Park. Visitors were stealing pieces of petrified wood, and the park as a whole was gradually being destroyed. What could be done to stop this theft? The staff of the park decided in the end to put up a sign: “Your heritage is being vandalized every day by theft losses of petrified wood of 14 tons a year, mostly a small piece at a time.” The goal was to raise awareness of the problem, making people see more clearly what was so bad about stealing from the park.

Perhaps the sign did succeed in raising awareness, but it also had another, more surprising effect. By drawing attention to the fact that people often steal, it made people see theft as normal. Many of the park visitors might have seen theft as something that wasn’t even worth considering (like trying to eat your shoe), but the sign helped to switch them over to seeing it as something that might be bad but was still among the normal options (like eating chocolate cake). A systematic study examined the impact of this sign. The key result: Putting up this sign actually led to an increase in the total amount of theft.

Trump’s rhetoric may be shifting the boundaries of what the American polity will consider

This framework now makes it possible to understand the difference between seeing Trump's behavior as bad and seeing it as abnormal. When we see something as bad, we feel there are specific reasons not to move forward with it. This is the attitude that liberals typically take toward tax cuts. They want people to think critically about fiscal policy, see what is bad or wrong about tax cuts, and then fight to resist them.

But this does not seem to be an appropriate response to the sorts of things Trump has been doing. When a candidate faces a challenge from a college student, we do not want the candidate to be thinking: "Should I start tweeting out insults about her? No, that would be bad because..." On the contrary, if we get to the point where candidates are thinking about whether behavior like this would be good or bad, things have already gone too far. This is the sort of possibility that should be ruled out before the process of considering different options has even begun.

And it’s not just a matter of a few inappropriate tweets. Once-unthinkable policies, such as new laws to constrain the press, or a federal registry of Muslims, are now being placed in the category of the “thinkable.” Of course, many people still believe these policies are deeply wrong, but all the same, it can hardly be denied that people are considering them. These are policies that would at one time have been regarded as completely outside the sphere of possibility.

It has become something of a cliché to blame the media for these developments. The usual suggestion is that if only the press had been more strident in its condemnation, Trump’s behavior could never have been fully “normalized.” This cliché gets everything wrong. The sign in the park included a vigorous denunciation of theft, but it nonetheless served to normalize the very behavior it was denouncing. Likewise, no matter how frequently and loudly we insist that what Trump is doing is wrong, we normalize his behavior just by letting people know about it.

The park found a simple solution to its problem. It removed the sign and thereby stopped informing people about the prevalence of theft. The trouble is that there is no hope at all of adopting an analogous solution in the case of Trump’s behavior. Trump is our president-elect, and there is no real way we can refrain from informing people about the things he does. Whatever else we might decide to do, we can’t just agree to stop talking about him.

So then, what is to be done? I wish I could say that cognitive scientists have settled on a different but equally effective solution and that all we need to do now is go out and implement it. Unfortunately, however, that is not the case. Research in cognitive science has done a lot to give us a deeper understanding of the problem we now face, but it has not yet furnished us with a workable way of addressing it.

Joshua Knobe is a professor of cognitive science and philosophy at Yale University. He is a co-editor of the book Experimental Philosophy.


The Big Idea is Vox’s home for smart, often scholarly excursions into the most important issues and ideas in politics, science, and culture — typically written by outside contributors. If you have an idea for a piece, pitch us at thebigidea@vox.com.