Blame it on PayScale.
Ten years ago this summer, the compensation data firm began publishing data on the colleges whose graduates earned the highest salaries. “For what it costs, a B.A. degree might as well be made of gold,” the company’s first report said. (It also noted that in the 2007-08 academic year it examined, the price of a four-year public degree averaged $6,185, and that “costs at private colleges and universities can skyrocket beyond $33,000 for tuition, room and board.”)
Today the former figure stands at nearly $21,000, and the sticker prices for a year at the most expensive private universities now start with a six. So it isn’t surprising that PayScale has expanded the data it publishes about higher education, or that it has been widely mimicked. Rankings providers like Money and Forbes have incorporated the data into their formulas; the American Institutes for Research's College Measures is working with numerous states to produce their own measures of college economic payoff; and even the federal government, in its College Scorecard, has included a measure of postcollege earnings in the outcomes data it provides.
The newest entry into the mix, The Equality of Opportunity Project, uses graduate earnings data to show how well (or poorly) colleges help their graduates climb rungs on the country's economic ladder.
Many college leaders dislike the metric, but the public eats it up -- and PayScale feeds its appetite.
PayScale today releases its 2017 College ROI Report, which provides information on the return on investment -- the 20-year compensation advantage gained by attending a given institution -- for the typical graduate of 1,400 public and private nonprofit colleges. As is our practice, Inside Higher Ed does not report on the results of this or any of the burgeoning number of other rankings of colleges, given the skepticism with which most informed observers view their methodologies. This year's report from PayScale, like many such studies, shows engineering- and science-oriented colleges having the best ROI, and shows a significant edge for public institutions, where the costs of attendance are much lower than at private nonprofit ones.
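PayScale's full methodology is more involved than the one-line description above, but the arithmetic behind a 20-year ROI figure can be sketched simply: cumulative graduate earnings, minus what a high school graduate would have earned over the same stretch (including the years spent in school), minus the cost of attendance. The Python sketch below illustrates that logic with entirely hypothetical numbers; it is not PayScale's actual formula.

```python
# Back-of-the-envelope 20-year ROI, in the spirit of (but not identical to)
# PayScale's published metric. All figures below are hypothetical.

def twenty_year_roi(median_grad_pay: float,
                    median_hs_pay: float,
                    annual_cost: float,
                    years_in_school: int = 4) -> float:
    """Earnings advantage of a degree over a 20-year career window.

    median_grad_pay: typical annual pay for the school's bachelor's grads
    median_hs_pay:   typical annual pay for a high school graduate
    annual_cost:     sticker (or net) cost of attendance per year
    """
    grad_earnings = 20 * median_grad_pay
    # Opportunity cost: the high school grad earns during the school years
    # plus the same 20 working years.
    hs_earnings = (20 + years_in_school) * median_hs_pay
    total_cost = years_in_school * annual_cost
    return grad_earnings - hs_earnings - total_cost

# Hypothetical inputs: $70k grad pay, $35k high school pay, $21k/year cost.
print(f"${twenty_year_roi(70_000, 35_000, 21_000):,.0f}")  # -> $476,000
```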
But given the PayScale data’s widespread use and the interest in ROI that the company’s approach has both helped spawn and capitalized on, the 10-year mark represents an appropriate time to look at its evolution and influence.
Most experts agree that the PayScale report has improved since its inception, and that the company has made changes over the years to address some of the criticisms directed its way, for instance by significantly refining how it calculates how much students at a given institution spend on their education. "They are definitely trying to do the right thing," said Robert Kelchen, assistant professor of higher education at Seton Hall University.
But some of the PayScale metric's fundamental flaws remain: because the company bases its data on voluntary survey responses, its samples for certain colleges and majors may not be representative. And its institutional rankings are heavily influenced by program mix, favoring colleges whose offerings lean toward high-paying fields.
Perspectives on PayScale
Critiques of data like PayScale's range from the broadly philosophical to the narrowly practical.
Many people in higher education just plain don't like the idea of measuring a college education primarily (or even significantly) through graduates' income. "PayScale has shoved aside the philosopher king as the arbiter of the worth of college," Patricia McGuire, president of Trinity Washington University, wrote in an Inside Higher Ed essay late last year arguing for a new way of defining higher education's value proposition.
As an expert on higher education finance and now provost at the University of San Francisco, Donald E. Heller sympathizes with McGuire's argument, but he concedes that "the horse is out of the barn" in terms of measuring the value of higher education at least partially by graduates' economic outcomes.
"The first thing I say is to acknowledge that college is a lot more than what you're going to earn, and it's important that we keep that in context," Heller said. "But people are very interested in this idea of ROI. When I go to admissions events, parents often ask questions like, 'Is my kid going to get a job that will allow them to pay off debt and not live in my basement?'"
If many college administrators are (at least grudgingly) accepting the idea that their campuses are going to be judged in part by such measures, they still very much insist that the data should be meaningful. And on that front, experts continue to cite problems with the PayScale data (as they do with virtually all sources of such data, including the College Scorecard). The company derives its data from individuals (about 150,000 a month) who voluntarily submit their compensation information to use one of the company's services. The college ROI data are not part of the company's core business, but they give PayScale visibility.
One key issue is the data's representativeness. Tod Massa, policy research and data warehousing director at the State Council of Higher Education for Virginia, whose agency's longitudinal data system also produces earnings data about graduates, notes that the new PayScale college-level data are based on 1.3 million respondents over a decade, from 2007 to 2017. "I just don't know how representative that is," he said.
That's especially true when you drill down to the institution and major level, said Heller of San Francisco. "Some of the institutions in the top 15 have fewer than 100 data points, and all self-reported," he said. "It's hard to know if it is at all representative of the college." He said PayScale should be "much more honest" and transparent about the "severe limitations of the data."
Katie Bardaro, vice president of data analytics at PayScale, acknowledges that the company cannot change its underlying approach to data collection but insists that it is constantly subjecting the information to "rigorous validation." The company compares college-level data to those from the federal government's Integrated Postsecondary Education Data System, for instance, to "make sure we don't have over- or underrepresentation of [students in] a given major," and has begun comparing its data to those from the federal government's College Scorecard to ensure comparability.
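To make Bardaro's description concrete: one simple form of such a validation is to compare the share of each major in a survey sample against the share of degrees a college actually confers, as reported to IPEDS. The sketch below runs that comparison with made-up counts; PayScale's actual checks are not public, so this illustrates the general technique, not the company's pipeline.

```python
# Compare a survey sample's mix of majors against the population mix
# (e.g., degrees conferred per IPEDS). All field names and counts are
# hypothetical.

sample_counts = {"engineering": 420, "business": 310, "english": 45}
ipeds_counts  = {"engineering": 900, "business": 1_400, "english": 400}

sample_total = sum(sample_counts.values())
ipeds_total = sum(ipeds_counts.values())

for major in ipeds_counts:
    sample_share = sample_counts.get(major, 0) / sample_total
    pop_share = ipeds_counts[major] / ipeds_total
    ratio = sample_share / pop_share  # 1.0 = perfectly representative
    flag = "over" if ratio > 1.2 else "under" if ratio < 0.8 else "ok"
    print(f"{major:12s} sample {sample_share:.1%} vs IPEDS {pop_share:.1%} ({flag})")
```

Run on these invented counts, the check would flag engineering graduates as overrepresented in the sample and English graduates as underrepresented, which is exactly the kind of skew that would bias a college-level earnings figure upward.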
Such comparisons are difficult, though, because of differences in who is counted. The College Scorecard isn't a voluntary sample like PayScale's information, but it includes only people who received federal financial aid. PayScale includes only people whose highest credential is a bachelor's degree -- excluding the many graduates of liberal arts programs who go on to earn graduate and professional degrees that contribute to their career and earnings success -- while the College Scorecard includes people without a degree at all. And many state-level databases, like Virginia's, usually don't track graduates who ultimately leave the state for work.
Kelchen, of Seton Hall, agrees that the PayScale data are too limited to be leaned on too heavily. "I would hesitate to view the PayScale data in isolation," he said.
But "if we triangulate them with the College Scorecard data and the state data systems, we might really begin to get a sense of how things look," he said. "If those data sources match up, I'd feel pretty comfortable."
The ultimate vision behind the College Scorecard (assuming it survives the Trump administration, which may be unlikely to support anything with Obama administration fingerprints on it) is for it to have data at the program level as well as the institution level, which would make its information more comparable to PayScale's and to some of the state-level databases. "Getting the Scorecard to the point where you could disaggregate by majors would give us a lot more confidence," Heller said.
"Until one of us gets perfect information, yeah, I think it's fair that we should be trying to triangulate," said Massa of Virginia.