
It's hard to quarrel with anything in the American Council on Education's new report about graduation rates -- but some experts on higher education data and college completion are disappointed by what's not in there.

The report, "College Graduation Rates: Behind the Numbers," is billed in ACE's news release as "a layperson's guide to the most common databases used to calculate these rates, as well as the advantages and disadvantages of each," and it does that in a way that is likely to strike most observers as factually accurate and balanced.

Many campus administrators and others who deal with outcome measures and use the various databases are likely to recognize the analyses and the by-now familiar critiques of various tools for measuring graduation rates:

  • The main federal database of institution-level information, the Integrated Postsecondary Education Data System, which is useful because it is annual and mandatory, but flawed because it includes only first-time, full-time students, who now make up barely half of all students, and does not collect information on students' socioeconomic status.
  • The National Student Clearinghouse, useful because it collects real-time data and can follow students across institutional and geographic boundaries, but limited because participation in it is voluntary and data from the private organization are not made publicly available.
  • The increasing number of state data systems that are cropping up (driven in part by an outpouring of financial support from the federal government in recent years), which give accurate state-level graduation rates, often include part-time and other students, and cross institutional boundaries, but typically exclude independent nonprofit and for-profit colleges and screech to a halt at state borders.
  • The federal government's non-institutional databases of student progress (the Beginning Postsecondary Students Study and the National Longitudinal Studies), which provide helpful national data but are outdated and don't provide information at the institutional level.

Despite the generally straight-ahead analytical approach, the council's report is not without an agenda, which it does not hide: warning policy makers that there is risk in overemphasizing the extent to which they use graduation rates to judge institutions or higher education generally, as some seem inclined to do amid the intensified political focus on "college completion."

"[G]raduation rates are increasingly becoming a significant part of the accountability conversation on postsecondary education institutions," the paper states. "As this report reveals, there are numerous databases from which to calculate national graduation rates; however, as this report also highlights, no single database can calculate annual, comprehensive graduation rates for all institutions and/or students enrolled in postsecondary education."

"This report highlights the complexities of measuring what many policy makers view as a simple compliance metric with the existing national databases," the report says. Elsewhere it adds: "[A]s the disadvantages of these databases indicate, these data should be used carefully as a measure of the overall productivity of postsecondary education institutions."

Several policy makers and experts on college graduation rates and completion said they could not quarrel with the information contained in the ACE report, which Kristin Conklin, a higher education consultant and former state and federal policy analyst, called a "very long, very accurate collection of all the pros and cons out there" about graduation rate data.

But Conklin said the report represented a "missed opportunity" for the main umbrella group for college presidents to contribute more directly to solving the problems it identifies. "It's like what people on Capitol Hill tend to say about the colleges, 'Stop bringing me problems -- bring me solutions,' " Conklin said. "We know IPEDS is broken. How about saying, 'Here are the things you could do to fix it.' You've got other groups out there, like the National Governors Association, proposing new ways of measuring student progress. I'd like to see ACE say, 'We don't like these changes; we like these.' Maybe that will come next."

Mark S. Schneider, vice president for new education initiatives at the American Institutes for Research and a former commissioner of the National Center for Education Statistics, which operates the federal government's education databases, agreed with Conklin's critique, and said he was concerned by ACE's emphasis on graduation rates, at a time when the governors' association and other groups are increasingly embracing other, shorter-term measures of student success, such as course completion and year-to-year retention.

"To just focus on grad rates is a very limited view of the databases that we need," Schneider said. "Student success is not only about crossing the finish line, it’s about finishing the first lap."

Bryan J. Cook, director of ACE's Center for Policy Analysis and co-author of the graduation rate report, said those critiques were legitimate, but that they misread the report's intent. "It wasn’t our objective to try to pave the way moving forward, but to give people a resource in which to better understand the current environment," Cook said, reiterating a point the report itself makes.

ACE aims to show that none of the current databases does everything that policy makers want, even though the data sources are fulfilling their original purposes. Politicians can continue to try to transform the existing data sources -- essentially putting a square peg in a round hole, he said -- or they can consider creating a new "one-stop resource" that might give them all the tools and data they want. (The Bush administration's Education Department tried to do that when it floated the idea of a national student-level database, a push ACE and some other college groups opposed.)

Both approaches are problematic, Cook said, as "starting from scratch would take a Herculean effort and result in a delay in which you don’t have data," and every time IPEDS is adjusted, "you always end up having these gaps or these holes" in the data that are produced.

"There's a lot to think about as we move forward," he said, "and that's what we were trying to reflect."
