
Can you guess which government programs work? Most people can’t.

Scared Straight does not work. Ask Kenan.
Dana Edeleson/NBC/NBCU Photo Bank via Getty Images

If you’ve taken a health class at an American high school, odds are pretty good that you’ve encountered the Scared Straight program. Scared Straight takes kids who have committed misdemeanors into prison to meet convicted criminals and confront the future that likely awaits them if they don’t change their ways. The concept proved popular not just as a social program but as entertainment: it was adapted into both an acclaimed documentary and a TV show on A&E, which broke ratings records for the network upon its premiere and is concluding this summer after nine seasons.

There’s just one problem with Scared Straight: It has been proven to cause young people to commit more crimes. The effect is so significant that the Washington State Institute for Public Policy estimated that each $1 spent on Scared Straight programs causes more than $200 worth of social harm. It turns out that spending time with convicted criminals, even if they warn you to stay on the straight and narrow, makes you more inclined to commit crimes yourself.

Do you think you would have guessed that ahead of time?

We bet you couldn’t.

How do we know? At our nonprofit 80,000 Hours we ran an experiment, and only 15 percent of participants correctly guessed the program would increase criminality. You can try the experiment yourself here.

We collected 10 social programs that had been thoroughly tested by researchers conducting multiple trials to determine their impact. The interventions were taken from those reviewed by the Campbell Collaboration, which brings together all the highest-quality research that’s available on major social interventions to decide whether they’re effective. We chose the top 10 interventions that were easiest to explain and had the clearest conclusions. We also checked with an early group of participants to make sure they had no trouble understanding the content.

We then asked more than 100 subjects to guess which of these 10 interventions we described would help, which would have no effect, and which would do harm. If they couldn’t guess any better than a chimpanzee choosing at random, we would expect them to get three or four out of 10 correct on average (because there are three options). In reality, people got four out of 10 — just a tiny bit better than chance. One person managed eight out of 10, and nobody did better than that.
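The chance baseline above is simple to verify: with three options per question, a random guesser is right one time in three, for an expected score of about 3.3 out of 10. A quick simulation makes this concrete (illustrative only; the function name and parameters here are ours, not part of the study):

```python
import random

def simulate_random_guesser(n_questions=10, n_options=3,
                            n_trials=100_000, seed=0):
    """Estimate the average score of someone guessing uniformly at random.

    Each question has one correct answer among n_options, so a random
    guess is right with probability 1/n_options and the expected score
    is n_questions / n_options -- about 3.33 for 10 three-option questions.
    """
    rng = random.Random(seed)
    total = 0
    for _ in range(n_trials):
        # A guess is correct whenever it happens to land on the true answer;
        # by symmetry we can treat option 0 as "correct" for every question.
        total += sum(rng.randrange(n_options) == 0 for _ in range(n_questions))
    return total / n_trials

print(simulate_random_guesser())  # ≈ 3.33 correct out of 10 on average
```

Against that baseline, an average of four out of 10 is only a marginal improvement, and even the best single performance (eight out of 10) is the kind of outlier a large enough pool of random guessers would occasionally produce.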

We also included a check in the experiment to filter out people who chose answers without reading the program descriptions — and those non-readers actually did slightly better than the participants who read them.

Many of these trials are famous and their results were widely publicized, so the small edge participants had over chance may simply reflect that they already knew the result for one of the programs, or looked up the answer.

What can we learn from this? Sadly, it isn’t possible for the public to know ahead of time whether a nice-sounding idea will actually help people or hurt them. Whether it’s a politician proposing a new social program for young people or a charity fundraiser describing how they are going to help the homeless, neither your head nor your gut can consistently tell you if their approach is going to work. A lot of things that sound good don’t do good, and vice versa.

Instead, you have to get experimental evidence. What trials have been run? How did the people who didn’t get the program compare with those who did? Were they comparable groups? What do experts who conduct reviews of the field’s research conclude?

Think you can do better than chance? You can take the test yourself here.

Benjamin Todd and William MacAskill are co-founders of 80,000 Hours; MacAskill is also the co-founder of Giving What We Can, an associate professor of philosophy at Oxford, and author of Doing Good Better. Robert Wiblin is the executive director of the Centre for Effective Altruism, which houses both 80,000 Hours and Giving What We Can.
