If you've taken a health class at an American high school, odds are pretty good that you've encountered the Scared Straight program. Scared Straight takes kids who have committed misdemeanors to visit prisons, where they meet convicted criminals and confront their likely future if they don't change their ways. The concept proved popular not just as a social program but as entertainment; it was adapted for both an acclaimed documentary and a TV show on A&E, which broke ratings records for the network upon its premiere and is concluding this summer after nine seasons.
There’s just one problem with Scared Straight: It has been proven to cause young people to commit more crimes. The effect is so significant that the Washington State Institute for Public Policy estimated that each $1 spent on Scared Straight programs causes more than $200 worth of social harm. It turns out that spending time with convicted criminals, even if they warn you to stay on the straight and narrow, makes you more inclined to commit crimes yourself.
Think you would have guessed ahead of time that that was the case?
We bet you wouldn't have.
How do we know? At our nonprofit, 80,000 Hours, we ran an experiment, and only 15 percent of participants correctly guessed that the program would increase criminality. You can try the experiment yourself here.
We collected 10 social programs that had been thoroughly tested by researchers conducting multiple trials to determine their impact. The interventions were taken from those reviewed by the Campbell Collaboration, which brings together all the highest-quality research that's available on major social interventions to decide whether they're effective. We chose the top 10 interventions that were easiest to explain and had the clearest conclusions. We also checked with an early group of participants to make sure they had no trouble understanding the content.
We then asked more than 100 subjects to guess which of these 10 interventions we described would help, which would have no effect, and which would do harm. If they couldn't guess any better than a chimpanzee choosing at random, we would expect them to average about 3.3 out of 10 correct (since each question has three options, a random guess is right one time in three). In reality, people got four out of 10 on average, just a tiny bit better than chance. One person managed eight out of 10, and nobody did better than that.
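To see where that chance baseline comes from, here's a quick simulation. The function name and the number of simulated guessers are illustrative, not part of the original study; the setup just assumes each of the 10 questions has three options, exactly one of which is correct.

```python
import random

def simulate_chance_scores(n_guessers=100_000, n_questions=10, n_options=3, seed=0):
    """Average score of guessers picking uniformly at random among the options."""
    rng = random.Random(seed)
    total_correct = 0
    for _ in range(n_guessers):
        # A uniform random guess is correct with probability 1 / n_options,
        # so we can treat option 0 as the "correct" answer for every question.
        total_correct += sum(
            1 for _ in range(n_questions) if rng.randrange(n_options) == 0
        )
    return total_correct / n_guessers

# The average converges on 10 * (1/3), i.e. about 3.3 out of 10.
print(simulate_chance_scores())
```

With 100,000 simulated guessers the average lands very close to 10/3, which is why a real score of four out of 10 is only a sliver above what blind guessing delivers.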
We actually included a check in the experiment to filter out people who chose answers without even reading the program descriptions. Remarkably, those non-readers did slightly better than the people who did read the descriptions.
Because many of these were famous trials whose results were widely publicized, the small edge participants had over chance may simply reflect that they already knew the result for one of them, or looked up the answer.
What can we learn from this? Sadly, it isn’t possible for the public to know ahead of time whether a nice-sounding idea will actually help people or hurt them. Whether it’s a politician proposing a new social program for young people or a charity fundraiser describing how they are going to help the homeless, neither your head nor your gut can consistently tell you if their approach is going to work. A lot of things that sound good don’t do good, and vice versa.
Instead, you have to get experimental evidence. What trials have been run? How did the people who didn’t get the program compare with those who did? Were they comparable groups? What do experts who conduct reviews of the field’s research conclude?
Think you can do better than chance? You can take the test yourself here.
Benjamin Todd and William MacAskill are co-founders of 80,000 Hours; MacAskill is also the co-founder of Giving What We Can, an associate professor of philosophy at Oxford, and author of Doing Good Better. Robert Wiblin is the executive director of the Centre for Effective Altruism, which houses both 80,000 Hours and Giving What We Can.