Donald Trump and the slippery slope to becoming a prolific liar

How do politicians get so comfortable with lying? One theory: practice.

Brian Resnick is Vox’s science and health editor, and is the co-creator of Unexplainable, Vox's podcast about unanswered questions in science. Previously, Brian was a reporter at Vox and at National Journal.

How does a person get so comfortable with telling lies?

It’s a question we have reason to ask this week.

There are big, serious, scary questions over whether President Donald Trump is lying about the rationale behind firing FBI Director James Comey. As Vox’s Matt Yglesias explains, Trump has a history of lying, often. He’s lied about his own questioning of President Barack Obama’s birthplace. He’s lied about tweeting that climate change is a hoax planted by China. He’s lied about his early support for the Iraq War. And if he is lying, it means a great many more people — including Attorney General Jeff Sessions — are lying on his behalf.

By age 5, almost all of us have learned how to lie. But some of us grow up to be prolific liars, while others are honest to a fault. Research psychologists have come around to a simple hypothesis that helps explain the difference: To become a prolific liar, it takes practice.

We learn to lie on a “slippery slope”

Before we dive in, let’s be clear: Psychological research makes predictions about the behavior of groups; it cannot necessarily explain the behavior of an individual. So we can’t say for sure where Trump’s — or any other politician’s — lies come from.

But researchers do have some new insight into how a person might grow more comfortable lying over time.

The hypothesis is that we gradually become more comfortable with lying (and other forms of immoral behavior) with minor acts that build up over time. Researchers call this the “slippery slope” model. And there’s some good evidence for it.

A 2015 paper in the Journal of Applied Psychology had participants play many rounds of a Sudoku-like game. Correct answers in each round yielded a higher and higher cash reward, and the design of the study allowed for participants to lie about their scores.

In one arm of this experiment, the cash rewards increased very gradually, by around 75 cents per round. In another arm, the cash reward jumped abruptly and dramatically to $2.50. Participants were more likely to lie about their scores in the gradual-change arm, which suggests we habituate ourselves to lying, becoming comfortable with it in little steps rather than huge ones. “Exposure to slippery-slope conditions more than doubled the rates of unethical behavior in our studies,” the authors concluded.

These gradual changes, they hypothesized, allow us to slowly become disengaged from our sense of morality. It’s a type of self-deception, they explain, in which we slowly convince ourselves the immoral behavior isn’t all that bad.

As we learn to lie, we undergo an “emotional adaptation” that makes us feel less bad about the lying.

Another paper, published last year in Nature Neuroscience, described this experience as an “emotional adaptation.” It’s similar to what happens when you’re exposed to a strong smell. At first the smell is extremely noticeable, but eventually you stop noticing it as much. With time, any stimulus — a loud noise, a strong perfume, etc. — is likely to provoke a smaller response. The same goes with lying.

We get desensitized to our own lying as the areas of our brain that correlate with negativity become less active. This makes it easier for us to lie in the future, the study concludes.

“The first time you cheat — let’s say you’re cheating on your taxes — you feel quite bad about it,” Tali Sharot, a University College London neuroscientist and one of the Nature study’s authors, said. But then the next time you cheat, you’re less likely to get that negative feeling. That makes it easier to lie again. And the cycle escalates from there.

This study was similar to the one in the Journal of Applied Psychology: Participants played a game in which it would be very tempting to cheat.

The participants played the role of an adviser in a two-person game. They looked at 60 photos of glass jars with differing numbers of pennies, and were told to advise a partner (who was really a researcher in disguise) on how much money the jars contained. The participants were told they’d receive compensation based on the accuracy of their partner’s guesses.

In some of the trials, the participants were incentivized to be honest: If the partner guessed correctly, they’d both get the prize money. In other trials, the participants were incentivized to lie: If the partner overestimated, the participant would get more. (The study gave the participant the impression the partner had no idea about this arrangement.)

When the participants were incentivized to lie, they lied more as more trials were conducted.

“They started with small lies — let’s say lies of around £1 — but this grew, and they ended up with large lies, of around £8,” Neil Garrett, also a University College London neuroscientist and a co-author of the study, said.

The authors then took the study a step further to understand what this looked like in the brain. A small subset of the participants played this game while undergoing fMRI, a brain scanning technique. It appeared that the more the participants grew accustomed to lying, the less activation there was in the amygdala, a region of the brain associated with negative emotion.

“Arousal is one of the telltales of lying,” Sharot said. It can take the form of sweating and faster heart rate — what polygraph machines look for to detect lies. So if the brain is less aroused by lying, that might mean a person is getting used to it. “If arousal goes down, people may be less likely to catch you in a lie,” Sharot said.

Caveat: The subject pool for the fMRI portion of the study was very small, only 25 participants. So these neuroimaging results would have to be replicated for a firmer conclusion. Also, the study design was not preregistered, which is increasingly seen as a safeguard against false-positive results. “We will need to wait for a replication of the fMRI results,” Sharot said. And fMRI results are notoriously hard to interpret: Read more about that in my earlier piece.

And there may be another way to interpret the results of the study: The participants are simply learning how to be liars. Oriel FeldmanHall, a neuroscientist who studies morality at Brown University and did not contribute to the Nature Neuroscience study, says the structure of the game may be what’s causing the lies to escalate, since there are no consequences for gaining more money through lying. “Rather than demonstrating a dishonest snowball effect, [the authors] may just be illustrating successful learning,” she wrote in an email. But then, people learn to lie in the absence of consequences in the real world too.

The social norms around lying can change — fast

In the 1960s, Stanford psychologist Albert Bandura showed how easy it is to teach kids to act violently — by showing them an adult acting violently.

In this famous experiment, Bandura showed young children — between 3 and 6 years old — a video of an adult wailing on an inflatable “bobo doll.” Other children in the study did not see an adult behaving aggressively to the doll.

And sure enough: The kids who saw the aggressive behavior were more aggressive themselves when playing with the doll later on. It’s a simple experiment with a simple conclusion: As humans, even at an early age, we learn what’s socially acceptable by watching other people.

So when prominent people change their behavior for the worse, and don’t suffer the consequences — the theory goes — acceptance of the bad behavior spreads, and more people start to mimic it.

Here’s one example. In 2004, sociologists Thomas Ford and Mark Ferguson found that exposure to a racist or sexist joke increased tolerance of further discrimination in people who held prejudicial views. Hearing the off-color joke, they write, “expands the bounds of appropriate conduct, creating a norm of tolerance of discrimination.”

When prominent people — politicians perhaps — lie, and get away with it, acceptance of the practice spreads.

There’s some small evidence that society-wide corruption trickles down into everyday lying. In 2015, researchers in the UK found, across 23 countries, that people are more likely to lie when they live in societies where corruption is rampant. “If politicians set bad examples by using fraudulent tactics like rigging elections, nepotism and embezzlement, then the honesty of citizens might suffer, because corruption is fostered in wider parts of society,” the study authors wrote in Nature.

Is there a way to stop people from lying?

It’s clear that some people are more prone to dishonesty than others — and are unlikely to change. Here’s one reason: Research suggests some people have a stronger physiological response to moral dilemmas than others. And extreme forms of lying, like compulsive lying, may be indicative of an underlying personality disorder.

But let’s assume politicians aren’t abnormal in this way, and that they are just normal people who are in an environment that rewards lying. Is there any way to keep them honest?

Sharot, the author of the Nature Neuroscience study, has a simple suggestion: “Perhaps we can nudge people away from dishonesty by calling them on their lies even if they are small, and try to reproduce an emotional reaction,” she says. In other words, reminding people they’re lying could help revive the negative feeling that may have been lost. Though this could backfire: People can become defensive when being called a liar.

Social norms play a big role, too, for ordinary people at least. David Rand, a Yale University psychologist, has found that when cooperation and truth telling are established upfront as the norm, people are more likely to play fair in the future.

Even politicians may listen to nudges to keep the fibbing to a minimum. In a small study, political scientists Brendan Nyhan and Jason Reifler found some evidence that down-ballot candidates who were sent letters reminding them "politicians who lie put their reputations and careers at risk, but only when those lies are exposed" were somewhat more truthful in their campaigns, as measured by newspaper fact-checks.

I asked Sharot if she thinks her work has any bearing on politicians. Can a long public life of small lies make you completely comfortable with lying?

“If someone has been repeatedly engaging in dishonest behavior, it is likely that that person has emotionally adapted to their own lying,” she says.
