Rankings abound that order colleges in part (or largely) by the average salaries of graduates. Consider this one and this one and this one. And of course there is the College Scorecard, created by the Obama administration and quite possibly continuing, with data on (among other things) graduates' average earnings. Some of those producing these analyses stress that the figures should be only one part of a student's decision making, but many others rush to publicize the reports.

Many educators debate the value of the data -- which calculation is most accurate, and whether salary statistics in any form are a sound way to judge colleges.

Institutions that educate first-generation and minority students note the assistance many wealthier (generally white) graduates receive when starting out. Advocates for the liberal arts note that their graduates do quite well in life but typically are outearned in the early years after graduation. The complaints are many. Yet the theory behind rankings of various sorts is that students need to know which majors or institutions will make them likely (or not) to earn high salaries.

But even as educators and those doing rankings argue over the issues, is it possible that an underlying assumption -- that prospective students want and will use such salary data -- is incorrect?

Those who wonder may want to consider a report issued by the Urban Institute last week. The report described a three-year project in which the institute developed a "personalized tool" with information on salaries by program, as well as the ability to look up net price (or what students actually pay) for various programs. A control data set was also created, without any of the salary information -- just basic information similar to what is generally available. The experiment was conducted at 25 Virginia high schools.

The results may disappoint those who maintain students are clamoring for this information.

Usage rates of the tool were low and no greater than at high schools with only the control set of data, the report said.

"We do not find any evidence that receiving access to the treatment site had a detectable impact on students’ behavior, based on the colleges and majors they chose immediately after graduating from high school," the report said. "Students from schools randomly assigned to receive the treatment version of the intervention did not choose institutions and majors with higher average wages, higher graduation rates or lower net prices than students from schools in the control group."

The report doesn't rule out the possibility that high-quality information might influence students in the future. It suggests that much more attention needs to be paid to how such tools are designed, and that they may work best if integrated into tools students and families are already using. Many students and families, the report suggested, may already be overloaded with such tools.

The Urban Institute doesn't rule out that data could be useful to students, but faults the idea that just putting information out there will make a difference.

"The findings of this study indicate that simply publishing earnings data on an easy-to-use website is unlikely to change the higher education decision making of prospective college students," the report said. "There did not appear to be significant pent-up demand for this information in our Virginia pilot, despite its intuitive appeal to policy makers and researchers."

And the report added that whatever the limits of sharing this information with students, that doesn't mean the information cannot be of value to policy makers.

A blog post published by the institute -- by Kristin Blagg, one of the report authors -- said that the College Scorecard shouldn't be abandoned, but should be improved. And the post reiterated the point about value to policy makers.

"These data are valuable for monitoring the outcomes of colleges and universities," Blagg wrote. "Efforts to protect consumers from low-quality colleges and ensure that taxpayer dollars are wisely spent can benefit from better data on labor market outcomes. And earnings data can also become part of how institutions benchmark themselves against peers, which most already do."
