A national outcry regarding the cost of education and the poor performance of institutions in graduating their students has raised questions about the extent to which accreditors are fulfilling their mission of quality assurance. Politicians have expressed outrage, for instance, at the fact that accreditors are not shutting down institutions with graduation rates in the single digits.

At the same time, accreditors and others have noted that the graduation data available from the National Center for Education Statistics’ Integrated Postsecondary Education Data System, familiarly known as IPEDS, include only first-time, full-time student cohorts and, as such, are too limited to serve as the yardstick by which institutional success -- or accreditation -- is judged. But simply noting this problem does nothing to solve it. The imperative and challenge of getting reliable data on student success must be more broadly acknowledged and acted upon. The WASC Senior College and University Commission (WSCUC) has taken important steps to do just that.

As is well known, IPEDS graduation rates include only those students who enrolled as first-time, full-time students at an institution. Of the approximately 900,000 undergraduate students enrolled at institutions accredited by WSCUC, about 40 percent, or 360,000, fit this category. That means approximately 540,000 students in this region, including all transfer and part-time students, are unaccounted for by IPEDS graduation rate data.

The National Student Clearinghouse provides more helpful data regarding student success: in addition to full-time student cohorts, it tracks part-time students and students who combine the two modes, and its data include information on students who are still enrolled, who have transferred and are continuing their studies elsewhere, or who have graduated elsewhere. Six-year student outcomes, however, are still the norm.

Since 2013, WSCUC has worked with a tool developed by one of us -- John Etchemendy, provost at Stanford University and a WSCUC commissioner -- that allows an institution and our commission to get a fuller and more inclusive picture of student completion. That tool, the graduation rate dashboard, takes into account all students who receive an undergraduate degree from an institution, regardless of how they matriculate (first time or transfer) or enroll (full time or part time). It is a rich source of information, enabling institutions to identify enrollment, retention and graduation patterns of all undergraduate students and to see how those patterns are interrelated -- potentially leading to identifying and resolving issues that may be impeding student success.

Here’s how it works.

WSCUC collects six data points from institutions via our annual report -- the baseline data tracked for all accredited, candidate and eligible institutions and referenced by WSCUC staff, peer evaluators and the commission during every accreditation review. On the basis of those data points, we calculate two completion measures: the unit redemption rate (URR) and the absolute graduation rate. The unit redemption rate is the proportion of units granted by an institution that are eventually “redeemed” for a degree from that institution. The absolute graduation rate is the proportion of students entering an institution who eventually -- a key word -- graduate from that institution.

The idea of the unit redemption rate is easy to understand. Ideally, every unit granted by an institution ultimately results in a degree (or certificate). Of course, no institution actually achieves this ideal, since students who drop out never “redeem” the units they take while enrolled, resulting in a URR below 100 percent. So the URR is an alternative way to measure completion, somewhat different from the graduation rate, since it counts units rather than students. But most important, it counts units that all students -- full time and part time, first time and transfer -- take and redeem.
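To make the arithmetic concrete, here is a minimal sketch of a URR calculation in Python. The totals and variable names are hypothetical placeholders for illustration, not WSCUC’s actual reporting categories.

```python
# Minimal sketch of a unit redemption rate (URR) calculation.
# The numbers below are invented for illustration, not actual WSCUC data.

units_redeemed = 1_800_000   # units that ultimately counted toward degrees awarded
units_granted = 2_400_000    # all units the institution granted over the same period

urr = units_redeemed / units_granted
print(f"Unit redemption rate: {urr:.0%}")  # prints "Unit redemption rate: 75%"
```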

Interestingly, using one additional data point (the average number of units taken by students who drop out), we can convert the URR into a graduation measure, the absolute graduation rate, which estimates the proportion of students entering a college or university (whether first time or transfer) who eventually graduate. Given the relationship between annual enrollment, numbers of units taken in a given year and the length of time it takes students to complete their degrees -- all of which vary -- the absolute graduation rate is presented as an average over eight years. While not an exact measure, it can be a useful one, especially when used alongside IPEDS data to get a more nuanced and complete picture of student success at an institution.
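As a back-of-the-envelope illustration of how such a conversion might work, the sketch below assumes that graduates redeem essentially all of their units and that the only extra inputs are the average units per completed degree and the average units taken by students who drop out. It is a simplified model for illustration, not WSCUC’s exact calculation.

```python
# Illustrative conversion from a unit redemption rate (URR) to an
# absolute graduation rate (AGR). Simplifying assumptions: graduates redeem
# all of their units, dropouts redeem none. Figures are hypothetical.

def absolute_graduation_rate(urr, units_per_degree, avg_units_per_dropout):
    # With G graduates and D dropouts:
    #   units redeemed = G * units_per_degree
    #   units granted  = G * units_per_degree + D * avg_units_per_dropout
    # Solving URR = redeemed / granted for D/G and substituting into G / (G + D):
    return (urr * avg_units_per_dropout) / (
        urr * avg_units_per_dropout + (1 - urr) * units_per_degree
    )

# Example: an 80 percent URR, 120 semester units per degree,
# and dropouts who average 45 units before leaving.
print(f"{absolute_graduation_rate(0.80, 120, 45):.0%}")  # prints "60%"
```

In this simplified model, a given URR translates into a lower absolute graduation rate when departing students leave early, since each unredeemed unit then represents a larger number of dropouts.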

What is the advantage of using this tool? For an institution like Stanford -- where enrollments are relatively steady and the overwhelming majority of students enter as first-time, full-time students and then graduate in four years -- there is little advantage. In fact, IPEDS data and dashboard data look very similar for that type of institution: students enter, take roughly 180 quarter credits for an undergraduate degree and redeem all or nearly all of them for a degree in four years. For an institution serving a large transfer and/or part-time population, however, the dashboard can provide a fuller picture of student success than ever before. One of our region’s large public universities, for example, had a 2015 IPEDS six-year graduation rate of 30 percent, while its absolute graduation rate for that year was 61 percent.

What accounts for such large discrepancies? For many WSCUC institutions, the IPEDS graduation rate takes into account fewer than 20 percent of the students who actually graduate. The California State University system, for example, enrolls large numbers of students who transfer from community colleges and other institutions. Those students are counted in the absolute graduation rate, but not in the IPEDS six-year rate.

Because the dashboard includes IPEDS graduation rate data as well as the percentage of students who fall into the first-time, full-time cohort, it makes it possible to get a better picture of an institution’s student population, as well as of the extent to which IPEDS data are reliable indicators of student success at that institution.

Here’s an example: between 2006 and 2013, the IPEDS six-year graduation rate at California State University Dominguez Hills ranged from 24 percent to 35 percent. Those numbers, however, reflect only a small percentage of the university’s student population. The low of 24 percent in 2011 reflected only 7 percent of its students; the high of 35 percent in 2009 reflected just 14 percent. The eight-year IPEDS total over those years, reflecting 10 percent of the student population, was 30 percent.

In contrast, looking at undergraduate student completion using the dashboard, we see an absolute graduation rate of 61 percent -- double the IPEDS calculation. Clearly, the dashboard gives us a significantly different picture of student completion at that institution.

And there’s more. To complement our work with the dashboard, WSCUC staff members have begun triangulating dashboard data with data from the National Student Clearinghouse and IPEDS to look at student success from various angles. We recognize that all three of these tools have limitations and drawbacks as well as advantages: we’ve already noted the limitations of the IPEDS and National Student Clearinghouse data, as well as the benefit of the latter’s inclusion of transfer students and students still enrolled after the six-year period. In addition, the data from both IPEDS and the clearinghouse can be disaggregated by student subpopulations of gender and ethnicity, as well as by institution type, which can be very beneficial in evaluating institutional effectiveness in supporting student success.

Pilot work has been done to plot an institution’s IPEDS and dashboard data in relation to the clearinghouse data, displayed as a box-and-whisker graph that shows the regional distribution of graduation rates by quartile and indicates how successfully an institution graduates its students relative to peer institutions within the region. While care must be taken to understand and interpret these data, we do believe that bringing them together in this way can be a powerful source of self-analysis, which can lead to institutional initiatives to improve student completion.
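For readers curious about what such a display could look like, here is a rough sketch using matplotlib. The regional peer rates and the institution’s two values are invented placeholders, not actual clearinghouse, IPEDS or dashboard figures.

```python
# Hypothetical sketch: one institution's IPEDS rate and dashboard absolute
# graduation rate plotted against a regional distribution of graduation rates.
# All values are invented placeholders.
import matplotlib.pyplot as plt

regional_rates = [0.28, 0.35, 0.41, 0.47, 0.52, 0.58, 0.63, 0.69, 0.74, 0.81]
institution_ipeds_rate = 0.30  # placeholder IPEDS six-year rate
institution_agr = 0.61         # placeholder dashboard absolute graduation rate

fig, ax = plt.subplots()
ax.boxplot(regional_rates, positions=[1], widths=0.5)   # quartiles of peer rates
ax.scatter([1], [institution_ipeds_rate], marker="o", label="IPEDS six-year rate")
ax.scatter([1], [institution_agr], marker="s", label="Dashboard absolute graduation rate")
ax.set_ylabel("Graduation rate")
ax.set_xticks([])
ax.legend()
plt.show()
```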

As noted, WSCUC has been working with the dashboard since 2013. While we are excited and encouraged by the benefits of the tool in providing a more complete and nuanced picture of student success, we also recognize that we have a great deal of work ahead of us to make the tool as useful as we believe it can be. After two pilot projects involving a limited number of WSCUC-accredited institutions, the required data collection from all WSCUC colleges and universities in 2015 revealed that many institutions had difficulty submitting the correct data. The dashboard can be somewhat difficult to understand, especially for institutions with large shifts in enrollment patterns. And unlike National Student Clearinghouse data, dashboard data, at least at this point, cannot be disaggregated to reveal patterns of completion for various student subpopulations.

Such issues notwithstanding, we are encouraged by the value of the dashboard that we have seen to date and are committed to continuing to refine this tool. WSCUC staff members have given presentations on the dashboard both regionally and nationally, including one to IPEDS trainers to show how the tool could extend the data available nationally on student completion.

We are hopeful that other accreditors, and possibly the NCES, will find the dashboard a useful tool and, if so, adopt it as an additional completion measure for institutions across the country. In any case, we will continue this work regionally, so as not just to complain about the available data but also to contribute to their improvement and usefulness.
