Alex Jones and the illusory truth effect, explained

One simple trick for making bullshit seem real: repetition.

Brian Resnick is Vox’s science and health editor, and is the co-creator of Unexplainable, Vox's podcast about unanswered questions in science. Previously, Brian was a reporter at Vox and at National Journal.

When NBC previewed what was billed as a Megyn Kelly primetime interview with conspiracy theorist Alex Jones on June 11, we were worried. The preview suggested a confrontational sit-down interview that would give Jones a huge platform to spew his cruel lies. It seemed likely that any attempt Kelly would make to fact-check him during the interview would not make an impression.

The segment ultimately didn’t live up to our fears. After a massive public outcry, the piece underwent a hasty re-edit, and ultimately aired as a reported profile of Jones, rather than a one-on-one interview. Kelly introduced each of Jones’s pet conspiracy theories — that the Sandy Hook massacre was a hoax, that Barack Obama wasn’t born in the US, that the yogurt company Chobani supports “migrant rapists,” and so on — by first telling viewers they were lies.

Kelly said Jones has a pattern of making “reckless accusations, followed by equivocations and excuses.” Kelly didn’t equivocate: Jones’s ideas are “dangerous,” she said, flat out.

Kelly was right to clarify this. Psychological science consistently finds that when a lie gets repeated, it’s slightly more likely to be misremembered as truth. It’s called the “illusory truth effect.” It’s a tendency the whole news media — as well as consumers of news — should be wary of. And it’s a reason not to give notorious bullshitters such a substantial spotlight. Especially bullshitters whose lies hurt others and whose lies have a track record of going viral.

If you watch cable news, you’ll see the pattern: An anchor will ask a politician or public figure a question. That politician will dodge the question with a dubious reply. And the anchor may not have the knowledge off the top of their head to fact-check it on the spot.

We’re exposed to lies all the time. And those lies are seeping into our minds, mixing up our memories of what’s true and what’s false.

Repeating a lie makes it seem more true, explained

The illusory truth effect has been studied for decades — the first citations date back to the 1970s. Typically, experimenters in these studies ask participants to rate a series of trivia statements as true or false. Hours, weeks, or even months later, the experimenters bring the participants back again for a quiz.

On that second visit, some of the statements are new and some are repeats. And it’s here that the effect shows itself: Participants are reliably more likely to rate statements they’ve seen before as true — regardless of whether they actually are.

When you’re hearing something for the second or third time, your brain becomes faster to respond to it. “And your brain misattributes that fluency as a signal for it being true,” says Lisa Fazio, a psychologist who studies learning and memory at Vanderbilt University. The more you hear something, the more “you’ll have this gut-level feeling that maybe it’s true.”

Most of the time this mental heuristic — a thinking shortcut — helps us. We don’t need to rack our brains every time we hear “the Earth is round” to decide whether it’s true. Most of the things we hear repeated over and over again are, indeed, true.

But falsehoods can hijack this mental tic as well.

And it can happen whether or not we have some prior knowledge about a subject. In 2015, Fazio and co-authors published a paper finding that prior knowledge about a topic doesn’t inoculate you against the effect. In her study, participants who knew facts like “kilts are the skirts that Scottish men wear” became more doubtful of that knowledge after reading “saris are the skirts Scottish men wear.” And they became even more doubtful after reading the false statement a second time (participants rated the truthfulness of the statements on a 1-to-6 scale).

She stresses that it’s not that people completely change their understanding of Scottish fashion customs by reading one sentence. But doubt begins to creep in. “They moved from ‘definitely false’ to ‘probably false,’” she says. Every time a lie is repeated, it appears slightly more plausible to some people.

The more we encounter fake news, the more likely we are to believe it

A lot of this research has dealt with statements of trivial importance.

And for a long time, psychologists assumed that flat-out outlandish headlines — like “Pope Francis Shocks World, Endorses Donald Trump for President” — wouldn’t produce this effect. But recent research shows the illusory truth effect is indeed at play with fake news.

In 2012, a small-scale paper in Europe’s Journal of Psychology found that “exposure to false news stories increased the perceived plausibility and truthfulness of those stories.” The study had participants read made-up (but not totally outlandish) news stories — like one on a California bill to limit the number of credit cards an in-debt person could own. Five weeks later, they were more likely to rate these false stories as truthful compared with a group of participants who had never seen those stories before.

More recently, Gord Pennycook, a Yale psychologist who specializes in the study of decision making and how we interpret bullshit claims, tested the illusory truth effect with fake-news headlines ripped from the 2016 presidential campaign.

Pennycook and colleagues ran several versions of a classic “illusory truth” study design with around 2,000 participants. In one arm of the study, participants were shown six real and six fake news headlines and were asked how accurate they were. Then, after a while, the participants were given a list of 24 headlines to evaluate, which included all of the fake news stories they had seen earlier.

Pennycook was able to replicate the classic finding: When participants had been exposed to a fake news headline previously, they were more likely to accept it as truth later on.

“We found essentially the same effect, which was surprising because the stories that we’re using are really quite implausible, like ‘Mike Pence’s marriage was saved by gay conversion therapy,’” Pennycook says. The effect was not limited to Republicans or Democrats in the study’s large sample. And a follow-up test revealed the effect persisted a week later.

Participants were routinely more likely to rate a headline as being true if they had been exposed to it in the past.

Again, it’s not that everyone is being completely duped. One of the fake-news headlines used in the study was “Trump to Ban All TV Shows that Promote Gay Activity Starting with Empire as President.” If a group of participants hadn’t seen it before, about 5 percent said it was accurate. If the group of participants had seen it before in an earlier stage of the experiment, around 10 percent said it was accurate. That’s twice as many people agreeing an outlandish headline is truthful.

(I should mention: Pennycook’s work has only been published in preprint form, which means it has not yet been through peer review. So treat these findings as preliminary. His team did pre-register the study design, which is one safeguard for ensuring objective results.)

Our memories are very prone to mixing up real and false information

The research here suggests that even when there are fact checks around bullshit claims, the illusory truth effect still leads our memories to confuse fact and fiction.

It’s because our memories aren’t so great. Recently I had a conversation with Roddy Roediger, one of the nation’s foremost experts on learning and memory. In his experiments, he shows how even small suggestions from others can push us to remember whole scenes and experiences differently. And we tend to sloppily remember events like news reports.

“When you see a news report that repeats the misinformation and then tries to correct it, you might have people remembering the misinformation because it's really surprising and interesting, and not remembering the correction,” Roediger, a psychologist at Washington University in St. Louis, said. (And for Jones fans who may have been watching the interview, Kelly’s corrections were very unlikely to have an impact on their views. A lot of psychological research finds that corrections often backfire, leaving people more dead-set in their previously held beliefs.)

In one arm of his experiment, Pennycook even put a warning on the fake news headlines when participants first read them. “Disputed by 3rd Party Fact-Checkers,” the note read (which is Facebook’s exact wording for labeling dubious stories). The warning made no difference.

“We basically said, ‘This is something you shouldn’t believe,’” he says. But participants later on still rated those headlines as being more accurate than ones they had never seen before.

Though have some faith: Pennycook found truly, truly outrageous statements like “the Earth is a square” didn’t gain acceptance with repetition.

This is not just a Megyn Kelly problem: cable news is constantly repeating lies

So back to the Alex Jones interview.

It’s not that journalists should never interview liars or try to put them in their place. That’s some of the greatest value we can provide. But it’s a problem when the lies are particularly damaging. People who heard from Jones that Sandy Hook was a hoax have bullied and tormented parents of dead children. Parents opting out of vaccinating their children are partly to blame for the recent outbreaks of measles and whooping cough in the US.

Kelly’s production team did a decent job minimizing Jones’s ideas, and clearly labeled them as false. But should we risk the chance of making any of Jones’s lies just a smidgen more acceptable? Pennycook leans no.

“It’s fine to bring people on TV who might be wrong about some things,” Pennycook says. “But there’s no point in debating somebody who doesn’t care whether or not they are correct. I think we should bring people on TV that actually have some sort of regard for the truth.”

Television and print news are never going to be free of all falsehoods. But who we decide to put on TV matters. Their ideas — whether we fact-check them or not — make an impression.
