Last year, UCLA grad student Michael LaCour and Columbia political scientist Donald Green published a startling finding, based on an experiment they ran: Going door to door to try to persuade voters to support same-sex marriage works, they found, and it works especially well when the canvasser delivering the message is gay. They even found spillover effects: People who lived with voters who talked to a gay canvasser grew more supportive of same-sex marriage, too.
This was a really exciting conclusion, for political scientists and laypeople alike. Past research has suggested that people's political views are tribal and largely impervious to rational persuasion. Dartmouth political scientist Brendan Nyhan and the University of Exeter's Jason Reifler have conducted multiple studies showing that correcting people's incorrect views about, say, the presence of WMDs in Iraq can actually backfire and make them hold their wrong beliefs even more firmly. LaCour and Green's study stood in stark contrast to this literature, suggesting that rational persuasion is actually possible. "It seemed like they'd invented something new," Ira Glass said in a This American Life segment highlighting the study, "a new tool to use to change people's opinions." In my write-up on April 8 of last year, I concluded that "what LaCour and Green found here is kind of miraculous."
Discovering the fraud
The fraud was uncovered when two UC Berkeley grad students, David Broockman and Josh Kalla, attempted to mount an extension of the study. As they inspected its details, two red flags popped up. Response rates to the surveys cited in the study were higher than expected, and people's answers across surveys were much more consistent than is normally the case. When they launched a pilot version of their study, Broockman and Kalla's fears were confirmed: Their response rates were notably lower than LaCour's and Green's. Thinking that maybe LaCour and Green's survey firm was just unusually good, Broockman and Kalla contacted it and asked to speak to the staffer said to be responsible. The firm said it had never heard of the project, never had an employee by that name, and didn't even have the capabilities to carry out a study along the lines of the one LaCour and Green described.
Eventually, after contacting Green and enlisting the help of Yale professor Peter Aronow, Broockman and Kalla tallied up eight irregularities in the LaCour and Green study. The simplest explanation for their findings was that LaCour took a preexisting survey, sprinkled some statistical noise on it, and passed it off as the findings of a canvassing experiment. According to the retraction letter Green sent to Science, Green then contacted LaCour's adviser at UCLA, Lynn Vavreck, who discovered that the study's raw data could not be traced to Qualtrics, the survey platform LaCour claimed to have used. LaCour told Vavreck he'd deleted the source files by accident, but a Qualtrics representative found no evidence that happened. Vavreck asked LaCour for contact information for the survey respondents, to verify their participation, but he refused. LaCour also admitted that, despite what he wrote in the study text, he'd offered and paid no cash incentives to respondents, and hadn't accepted or used any grant money to conduct the surveys.
How I fell for it
LaCour was set to become an assistant professor at Princeton this July. Any reference to that job has been removed from his homepage. But the page still features a long list of media outlets that have covered his research. Just about every place you can think of covered the same-sex marriage study: This American Life, the New York Times, the Wall Street Journal, the Washington Post, the Los Angeles Times, Science Friday, Bloomberg Politics, Huffington Post, and, of course, me at Vox. We all got it wrong.
Personally, I believed the study was sound because it came from sources I trust. Ironically, Broockman — who'll start as an assistant professor at the Stanford Graduate School of Business this fall — was the one who first alerted me to the study he'd wind up exposing as a fraud. David's an old friend and often passes along papers he thinks I ought to cover. Here's what he said on LaCour and Green: "Deep. Compelling. Awesome … The most important paper of the year. No doubt."
I also trusted Green, who's helped pioneer experimental political science and has, in particular, done considerable research on canvassing as a tool in political campaigns, finding in multiple experiments that it's the most consistently effective method of getting out the vote. Green's past work showing that canvassing works has held up over the years, so I figured his latest study on canvassing would hold up too. At the very least, I thought the data was legit. Apparently so did Green; he was duped too.
Social science, and science in general, is, to quote Vox's Julia Belluz and Steve Hoffman, "a long and grinding process carried out by fallible humans, involving false starts, dead ends, and, along the way, incorrect and unimportant studies that only grope at the truth, slowly and incrementally." Outright fraud like LaCour's is rarer and more shocking than findings that don't hold up to replication or simple data entry errors. But it should serve as a reminder, both to people like me who report on studies and to the readers we're trying to serve, that pinning your hopes on one study is a foolish proposition.