The federal government’s long-awaited data on the students enrolled in distance education courses nationwide provide a dubious baseline, a new study suggests, as confusing instructions, inflexible design and a lack of coordination have led colleges and universities to under- or overreport thousands of students.

The study, conducted by the WICHE Cooperative for Educational Technologies and higher education consultant Phil Hill, raises serious questions about the integrity of the Integrated Postsecondary Education Data System, or IPEDS, the higher education data collection program operated by the Education Department’s National Center for Education Statistics.

“After billions of dollars spent on administrative computer systems and billions of dollars invested in ed-tech companies, the U.S. higher education system is woefully out of date and unable to cope with major education trends such as online and hybrid education, flexible terms and the expansion of continuing and extended education,” Hill and Russ Poulin, deputy director of research and analysis for WCET, write in a summary of their findings.

The fall 2012 collection marked the first time IPEDS asked colleges and universities eligible for Title IV financial aid about the students enrolled in distance education courses. While it is fair to assume the new data collection initiative wouldn’t be bug-free in its first year, the findings suggest the problems run deeper than rookie mistakes.

“It’s shocking when you think about two things,” Hill said in an interview with Inside Higher Ed. “We’re moving more and more in this country to talking about data, scorecards, holding colleges accountable. It’s this whole culture of data-driven accountability, but we’re not ready.... The second thing that really jumps out is ... that higher education has invested at least $5 billion in [enterprise resource planning] systems, yet when we talk to these schools, it’s amazing how many of them do manual reporting.”

For years, reports from organizations such as the Babson Survey Research Group and Eduventures provided the closest approximations of how many students were enrolled in distance education courses. When the NCES earlier this year released its preliminary numbers, its count came in more than a million students below those estimates. The most recent Babson Group survey, for example, found that total online enrollment in fall 2012 grew to 7.1 million students. IPEDS, meanwhile, counted 5.5 million.

Hill and Poulin spent the first half of 2014 combing through the IPEDS distance education data, documenting their findings in a series of blog posts. Along the way, they stumbled across some anomalies: a college reporting no distance education students while simultaneously promoting online programs on its website; a university reporting enrollments that suggested each online course enrolled only one or two students on average.

The anomalies led to anecdotes. After a representative from a state university system emailed to say the institution had not reported students taking credit-bearing continuing education courses, Hill and Poulin contacted other institutions and found more cases.

The California State University System, for example, didn’t report more than 50,000 students in self-support courses. In a survey of another 20 large four-year institutions, four campuses said their counts were incorrect, and another five were unsure. That means “a few hundred thousand students ... were not counted in just those two systems,” Hill and Poulin write.

The problem appears to stem from confusing wording in the IPEDS survey instructions. Institutions are asked to “Exclude students who are not enrolled for credit. For example, exclude: ... Students enrolled exclusively in Continuing Education Units.” However, institutions should also “Include all students enrolled for credit ... regardless of whether or not they are seeking a degree or certificate.”

Eight of the 20 institutions said they had at least some issues with applying that definition to their data. Hill and Poulin suggested the NCES should streamline the IPEDS instructions to ensure the correct data are being reported.

“A lot of these [continuing education] students were never reported on any survey -- ever,” Poulin said. “Think about enrollment surveys, completion surveys, surveys about ethnicities. If we’re making policy decisions [based on those surveys], it’s a little bit disconcerting that certain people have never been counted.”

Where IPEDS supplies a straightforward definition, institutions sometimes substitute their own. IPEDS defines a distance education course as one in which content is delivered “exclusively” online, a much stricter definition than many colleges and universities use. At institutions accredited by the Southern Association of Colleges and Schools Commission on Colleges, for example, a course needs only a “majority of the instruction” to be delivered at a distance to earn that designation, according to a policy statement.

That means that in addition to the many institutions excluding tens of thousands of students from their distance education totals, others are counting too many. Because Hill and Poulin surveyed only a couple dozen institutions, it is not clear how far the IPEDS numbers stray from actual distance education enrollments at U.S. colleges and universities.

“We need to get some common definitions to the states and the feds and the accrediting agencies, or at least get it a lot closer,” Hill said. “This is silly what we have now.”

The 2012 numbers, they write, “are not a credible baseline.” Because some institutions have since fixed their reporting issues, the fall 2013 numbers may swing wildly. Those data will be released in January. Cal State, however, will continue not to report students in self-support classes.

"We have not been asked by IPEDS to do otherwise, so when we report distance learning data next spring, we plan on once again sharing only state-supported students," a spokesman for the system said in an email.

But even if all colleges and universities followed the survey instructions religiously, some students would still go uncounted, since IPEDS captures enrollment as of fixed census dates and students who start between those snapshots can slip through. One surveyed institution, where students can begin courses on any of 28 dates throughout the year, estimated that its reported numbers were about 40 percent lower than its actual enrollment.

“With the increased use of competency-based programs, adaptive learning, and innovations still on the drawing board, it is conceivable that the census dates used by an institution (IPEDS gives some options) might not serve every type of educational offering,” Hill and Poulin write.

The NCES did not respond to a request for comment. Based on the time it took for the federal government to include online students in IPEDS in the first place, Jeff Seaman, co-director of the Babson Group, said he was not hopeful the agency would move quickly to sort out the data collection program’s issues.

“IPEDS made a reasonable first effort in dealing with these definitions, but they clearly need further refinement and clarity,” Seaman said. “The concern, however, is not the definitions themselves, but rather the glacial speed by which NCES moves. How is it that there were millions and millions of distance education students before IPEDS got around to adding these questions? This does not bode well for timely improvements.”
