Last week, Michael Itzkowitz and Third Way issued a new set of college rankings. They’re helpful, in their way, but I still have misgivings.

On the positive side, the new rankings look at economic mobility of graduates. In other words, they are about results. And while the hyperselective places often do quite well by the few low-income students they admit, the fact that they admit so few counts against them in terms of overall social mobility. (Having attended one, I can attest that the social climate there was set entirely by the very wealthy.) By and large, the schools that scored the highest in the new rankings were Hispanic-serving institutions in California, New York and Texas. They had the right mix of student body and economic opportunity. From a policy perspective, I see a great deal of good to be done by diverting some resources from the überwealthy to schools that educate far more people.

It’s not an unalloyed good, though. Start with the obvious: the survey omits community colleges altogether. There are valid reasons for that, but it contributes to the invisibility of community colleges in discussions of social mobility. That’s not great. And simply by virtue of being a ranking, it plays into the popular and destructive vision of public agencies as competing with each other. By itself, that does harm. Finally, and as an artifact of design, it tends to conflate institutional performance with local demographics; in other words, it recapitulates the error of the more traditional surveys, only upside down. (For example, I don’t see many rural institutions scoring high, likely due to attenuated local job markets. That’s not their fault, but it hurts their rankings.) In neither case are the results something that other schools can use to improve.

But there’s a sense in which the different purpose of this ranking (as opposed to U.S. News and the like) reflects a different intended audience.

Although prestige rankings have all sorts of pernicious effects on policy, they’re intended more as consumer guides. The Third Way rankings are really more about policy; I don’t see them being terribly effective as a consumer guide. The different purpose implies a different audience and therefore a different impact.

From a policy perspective, I’d much rather see an emphasis on helping schools across the country, and at all levels, do a better job. Ranking tends to lead to a sort of lifeboating, in which we assume that only a few schools deserve resources and the rest can be neglected. That’s an unintended consequence, perhaps, but at this point it’s a predictable one.

I’d be more enthusiastic about rankings that are built around the awareness that they’re likely to guide institutional and policy direction and that are consciously designed that way. For example, ranking universities on their transfer friendliness (measured in percentages of credits applied to majors, transfer graduation rates and transfer job outcomes) would be useful both from a consumer-guide perspective and as a policy intervention. It would encourage behaviors that are within the ability of almost any college to adopt, and it would improve the mobility prospects of students all over the country.

All that said, though, these rankings make a point that needs to be made. The Harvards of the world may do quite well with the vanishingly small number of low-income students they admit, but the fact that the number is vanishingly small year after year suggests that making a difference at scale will require going well beyond the elite places. Which it will. If we want to maximize opportunity—a goal I’m not sure is universally shared—we can’t do it on the cheap, or by picking a couple of winners and shaming everyone else.

Third Way’s rankings are welcome, as far as they go. But I’d much prefer to see success be so broadly shared that rankings become unnecessary.

More from Confessions of a Community College Dean