
Why a new ranking says the University of Colorado Denver is the best college in the US

One of the buildings at the University of Colorado Anschutz Medical Campus, part of UC Denver.
(Joshua Lott/Getty Images)

Many groups are now trying to rank colleges based on how much their students earn later in life — and how much they earn relative to what you'd expect. And every time, the answer is different.

The Economist says Washington and Lee University does the best job at getting students higher-than-expected salaries later in life. The Brookings Institution says it's the Albany College of Pharmacy and Health Sciences.

Now the Georgetown Center on Education and the Workforce has produced rankings of its own, saying that it's the University of Colorado Denver that actually does the best at getting average students higher-than-average salaries.

These proliferating rankings are the result of new federal data showing how much colleges' students earn 10 years after they first enroll. And they're meant to adjust for the same fact that makes the entire undertaking somewhat bogus in the first place: Colleges themselves are responsible for only the tiniest fraction of the variation in students' earnings. Ranking them on those outcomes puts all the emphasis on something they can barely control.

Why MIT, Harvard, and the University of Colorado Denver came out looking best

All these rankings are based on the Education Department's College Scorecard, a trove of federal data on how much college students earn 10 years after they first enroll.

The Education Department released this data as part of a broader push to better define the economic value of higher education. Most students go to college to get a better job and earn more money; politicians push the value of higher education not just to expand students' minds but also to guarantee them a better life after graduation. The salary data, which includes both college graduates and dropouts, is supposed to help policymakers and the public pick out the colleges that deliver, or don't, on those expectations.

Georgetown used the information to rank colleges a few ways:

  1. Based on the raw salaries students earn 10 years after they first enrolled. MIT came out best, with the US Merchant Marine Academy, Harvard, Georgetown, and the Stevens Institute of Technology rounding out the top five.
  2. Then they adjusted for what students major in, since students in certain majors, such as engineering, end up earning considerably more. After adjusting for majors, Harvard students had the highest salary, followed by Georgetown, the University of Colorado Denver, MIT, and Stanford.
  3. Finally, they adjusted for both the major mix and students' educational qualifications, including their rough likelihood of holding a graduate degree. That's where the University of Colorado Denver came out looking best, followed by Georgetown, the University of the Pacific, Harvard, and Washington and Lee.
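The adjustment steps above amount to a familiar statistical move: predict what a school's salaries "should" be from its covariates, then rank schools on how far they beat that prediction. Here is a minimal illustrative sketch of that idea, using a single made-up covariate (share of engineering majors) and invented numbers — it is not Georgetown's actual model, which uses more variables and real Scorecard data.

```python
# Sketch of "adjusted" ranking: regress salaries on one covariate
# (share of engineering majors), then rank schools by the residual —
# actual salary minus what the covariate alone predicts.
# All school names and numbers below are hypothetical.

schools = {
    # name: (median_salary, engineering_share)
    "Tech A":    (95_000, 0.80),
    "State B":   (55_000, 0.10),
    "Liberal C": (60_000, 0.05),
    "City D":    (70_000, 0.30),
}

xs = [share for _, share in schools.values()]
ys = [salary for salary, _ in schools.values()]
n = len(schools)

# Ordinary least squares for: salary ≈ a + b * engineering_share
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
    sum((x - mx) ** 2 for x in xs)
a = my - b * mx

# Residual ranking: who earns the most beyond what their major mix predicts?
residuals = {name: salary - (a + b * share)
             for name, (salary, share) in schools.items()}
ranking = sorted(residuals, key=residuals.get, reverse=True)
print(ranking)
```

Note how the adjustment reshuffles the order: the engineering-heavy school with the highest raw salaries no longer leads once its major mix is accounted for — the same kind of reshuffling that moves MIT down and the University of Colorado Denver up in Georgetown's adjusted lists.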

But there are reasons to be skeptical of this conclusion — and, more broadly, of the entire business of ranking colleges based on the federal earnings data.

Colleges themselves just can't affect earnings that much

Colleges themselves aren't the only factor in student earnings — or even a particularly important one. The College Scorecard data came with a huge caveat: colleges themselves account for only about 5 percent of the variation in students' earnings later in life.

That's why organizations like the Center on Education and the Workforce end up coming up with complicated formulas to adjust for different variables. Students who went to MIT might end up earning the most later in life, but that's a reflection of many things that have little to do with the colleges' academic programs: the race and gender mix of their students, the majors they choose, the careers they pursue, the geographic areas where they tend to settle, and so on.

So ranking colleges based on earnings alone, even if groups try to adjust for the other factors that could play a role, makes a big deal out of a statistic that colleges have very little control over.

And it's easy to imagine that relying too much on these findings could lead to bad policy decisions, because it's so hard to isolate, and reward or punish, colleges' strategies that lead them to do well on these lists.

Take the University of Colorado Denver, which tops the latest rankings. This is particularly surprising because more students at the university drop out than graduate, and the data used in the rankings includes both graduates and dropouts. This suggests either that there's some kind of error in the university's reporting or that its dropouts end up faring particularly well financially — which is good news for the dropouts, but probably doesn't have anything to do with the university itself.

Or look at Washington and Lee University, which performed well on rankings from both Georgetown and the Economist. But it also has another distinction: Historically, it's admitted very few students who get Pell Grants, the government grant program for low-income college students. In 2007, just 3 percent of students received that form of aid. (This proportion has increased in recent years.)

The students at Washington and Lee earned high salaries later in life, according to those rankings, but they also spent their college years in an environment in which nearly everyone was from a middle-class or wealthier family — and that probably had an effect on their later earnings that even the most sophisticated formula won't necessarily capture.

College rankings make great entertainment. They don't necessarily lead to good policy.
