The Economist ranked 1,275 colleges, and Yale is 1,270th

Libby Nelson is Vox's policy editor, leading coverage of how government action and inaction shape American life. Libby has more than a decade of policy journalism experience, including at Inside Higher Ed and Politico. She joined Vox in 2014.

The Economist is out with a new ranking of colleges that ranks Harvard fourth and Yale 1,270th.

The rankings are based on whether a college's alumni fare better than expected in the labor market, as measured by earnings 10 years after starting school. Using a range of variables — a college's location, its religious affiliation, its students' SAT scores, the socioeconomic and racial mix of its student body, whether it has a business school, and more — the Economist predicted what graduates would earn.

Then it looked at what students actually ended up earning, according to the Education Department's new College Scorecard, and ranked colleges according to how much they beat expectations.
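The underlying arithmetic is simple: fit a model that predicts earnings from a college's observable traits, then rank colleges by how far their actual earnings exceed the prediction. Here's a minimal sketch of that logic in Python, using made-up colleges and a plain linear regression; the Economist's actual model and variable set are more elaborate:

```python
# Sketch of the value-added idea behind the ranking. All names and
# numbers below are hypothetical; the real model used many more
# variables (location, religious affiliation, the student body's
# racial and socioeconomic mix, and so on).
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical inputs: one row per college.
colleges = pd.DataFrame({
    "name":                ["A", "B", "C", "D", "E", "F"],
    "median_sat":          [1450, 1180, 1310, 1500, 1260, 1390],
    "pct_low_income":      [0.15, 0.42, 0.28, 0.11, 0.35, 0.20],
    "has_business_school": [1, 0, 1, 1, 0, 1],
    "median_earnings":     [87200, 48000, 73700, 66000, 55000, 77600],
})

X = colleges[["median_sat", "pct_low_income", "has_business_school"]]
y = colleges["median_earnings"]

# Step 1: predict earnings from a college's observable characteristics.
model = LinearRegression().fit(X, y)
colleges["predicted"] = model.predict(X)

# Step 2: rank by the residual, i.e. how much each college's actual
# earnings beat its prediction.
colleges["value_added"] = y - colleges["predicted"]
print(colleges.sort_values("value_added", ascending=False)
              [["name", "predicted", "value_added"]])
```

A college whose graduates earn exactly what its inputs predict gets a value-added of zero; the ranking rewards beating the prediction, not earning the most in absolute terms. That's how Washington and Lee can outrank Harvard.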

The top five looked like this:

  1. Washington and Lee University (predicted income $55,223; median income $77,600)
  2. Babson College (predicted income $65,170; median income $85,500)
  3. Villanova University (predicted income $60,455; median income $73,700)
  4. Harvard University (predicted income $74,466; median income $87,200)
  5. Bentley University (predicted income $62,237; median income $74,900)

Yale, meanwhile, is all the way down at No. 1,270; its graduates make $9,000 less than the formula would have predicted.

It's certainly debatable whether these are actually the "best" colleges in the US. But the Economist's rankings are a great example both of how college ranking works and of all the ways in which rankings, and the data they rest on, can be flawed.

The Economist's rankings are perfect examples of the form

Great college rankings, from the rankers' point of view, aren't about getting readers to agree. They're about getting readers' attention. Ideally, the ratings reflect the publication's values, or the values it imputes to its readers. And the Economist's rankings are a near-perfect example of the form.

The method gives off an aura of scientific precision. It's based on something attention-getting (money) that both the public at large and the magazine value. And its results are comprehensible, confirm some biases, and spark curiosity. Look, there's Harvard! Wait, what are Babson and Bentley, and what are they doing so high on the list?

But as with most college rankings, once you look at it for more than a second, it stops making sense. Should students care whether their college produces above- or below-average earnings? Probably not. Should the government? Possibly — but if the government is going to start judging colleges on these distinctions, it's going to need variables a lot more rigorous than a college's position on the Princeton Review's "reefer madness" list (which the Economist used as a proxy for lack of ambition).

Then there's the big methodological flaw: The College Scorecard income data applies to both graduates and dropouts, and college graduates make more than dropouts. Without factoring in a college's graduation rate, it's hard to see how the predicted income could be anywhere close to accurate. (It also doesn't include college graduates who went on to law or medical school, a blow for a college that churns out a lot of future doctors and lawyers.)

When the College Scorecard data was released, the Education Department estimated in a technical note that colleges were responsible for 5 percent of income variation. So the difference a college can make might be huge to its students, but it's small overall.

But the Economist is also doing some things right

The Economist is just the tip of the iceberg. There are probably going to be a lot more college rankings.

For a long time, the US News rankings were the only game in town. Then Washington Monthly started ranking colleges based on their service to society. The New York Times started ranking based on the opportunities colleges provide to low-income students. Money magazine and Forbes have gotten into the game as well.

The Education Department's recent data release is going to lead to many, many more. (We may soon need a ranking of the college rankings.)

So it's worth calling out a couple of things the Economist is at least trying to do right. It's attempting to separate correlation from causation. Factoring in the mix of majors a college offers was smart, too. The Brookings Institution's value-added rankings essentially give a boost to every college that offers STEM majors, which is fine, but not every student is going to want to study STEM.

This proliferation of data was probably what the Education Department intended. Initially, the plan was for the federal government to rate (not rank) colleges itself. When the department chose to release the data without attaching its own judgments, it was betting that think tanks and the media would step in to do the work instead. It looks like that bet was right.
