
We hear lots of chatter from the cheerleaders of disruption about the “alternative” or microcredential provider universe. They avow that this universe is Trumpian “huge” -- and growing -- and that the alternative sector has the potential to greatly improve outcomes across the board in postsecondary education.

But most higher education commentators and analysts would agree that we know very little about participation rates, completion rates and demographics of students in the microcredential provider universe. My critique of the paucity of data in this alternative universe goes beyond the emptiness of the promises about the sector to reflect the genuine concern of public policy for greater equity in participation and outcomes in postsecondary education.

I have spent some time on the websites of 72 providers and microcredential facilitators, reading the “about” and “programs” sections of those sites, following the links, contemplating the accompanying blogs, and digging out the teaspoons of data one finds under rocks and behind fences.

Greater equity, as we normally understand it, means closing gaps in participation and completion by race/ethnicity, but you will not read those words on the websites of the cheerleaders or in reports on this territory by august bodies. It’s an issue studiously avoided. Gender sometimes; age sometimes. Race/ethnicity, no. The data are not there, and the minor attempts to provide them are pathetic.

The silence is deafening. If one found low participation rates for minority students, that would undercut any claims the “alternative” sector might advance for its contribution to equity in postsecondary education. The federal Integrated Postsecondary Education Data System (IPEDS) does not avoid these data for any students in its 7,000-plus Title IV institutions of postsecondary education, nor does the recently released NCES Household Education Survey (2017). By law, they cannot duck, and don’t.

One must acknowledge the counter to this critique up front: it is not in the mission of the alternative universe of microcredential providers to address any demographic disparities among the populations who pursue education of various kinds after high school. Title IV institutions may have an equity mission; those outside that universe do not, and should not be held to our standards, even in the simple matter of the data they keep.

My stance is otherwise: we are in this together, and if for-profit Title IV institutions keep data, so should the mostly for-profit institutions outside Title IV. All institutions, organizations and enterprises offering education and training to U.S. residents are serving a common population. None can claim exemption from telling the public whom they serve and for what, even if an equity mantra is not in their public mission or “about” statements.

What Stands Out From This Balloon? And What Should We Say About It?

Standing out, principally due to their visibility in the blogs of disruption and the cheerleading columns, are accounts of “skills-based short courses,” MOOCs and other nanocredentials. This ephemeral universe brags more on its websites about job placements, corporate destinations and earnings of “graduates” than about the ratio of completers to beginners -- i.e., more about what than who.

Do they provide credentials? You sure can get badges from some, and MOOCs will give you some kind of electronic confirmation of completion that you can call a credential (if you pay for it), or, if you complete more than three or four, a nanodegree. But again, these are data deserts.

How do we assess student volume and characteristics in these providers? Where we have reports of enrollments on their websites, they are spun in fantastic round-number estimates: 160,000 (www.ucertify.com), 215,000 (www.canvas.net), four million (www.edx.org), “15+” million (www.Lynda.com). As for credential awards, we find much lower, but still mostly rounded, estimates: 500 to date (www.fullstack.com); 1,300 to date (www.startupinstitute.com); 1,700 (www.appacademy.com).

In that universe of 72 providers of such credentials or completions that I followed on their websites, only 10 offered any information on enrollments (including, for our amusement, “hundreds”), and only 12 on completions (principally cumulative “alumni” or a sample of head shots of alumni). Demographics? That’s something just about no one knows because very few of these organizations provide data on anyone to anyone else.

Doubt it? Go online to the websites of App Academy, BadgeOS, Badgr, Bloc, Coding Dojo, Coding House, Degreed, Epicodus, Flatiron School, Fullstack Academy and on and on and see if you can find any data on enrollments or completions and the demography of those who enroll or complete. You can’t.

How Course Report arrives at the statement that boot camps will graduate 22,949 students in 2017, when only one of the boot camp badge providers, Startup Institute, reports any numbers on enrollments or graduates, must remain an eternal mystery. General Assembly claims 35,000 “alumni,” but there is a genuine question of what “alumni” means, as it can include folks who register on Monday and are gone by the following Wednesday. If we want to see what the authors of the recent American Academy of Arts and Sciences report (“The Complex Universe of Alternative Postsecondary Credentials and Pathways”) call “rigorous research” on this playing field, we have to start with such digging. Did I get it all? No, but it’s not beyond reach.

There is another, and more generic, problem with the data: when you find them in footnoted references, they are estimates based on tiny samples from the putatively authoritative Class Central and Course Report. How tiny? Six hundred and sixty-five in 2015; 1,143 in 2016 -- all self-selected. These are not what statisticians call “true populations.”

By contrast, the recently released 2016 National Household Education Survey from NCES started with a weighted sample of 47,000, and the national longitudinal studies run by NCES also start with roughly 20,000 to 25,000 true-population students, drawn precisely in order to be weighted into convincing national portraits. Beginning Postsecondary Students, Baccalaureate and Beyond, the Education Longitudinal Study of 2002 -- these are not fake news. The footnoted worlds are, and we’ve got to fix that if we are fully to understand what people engaged in learning do after high school.

Then There Are the MOOCs

The MOOC universe is colossal, or so we are told, and nearly always by estimates. Even when the providers themselves are the direct source, we are never sure whether numbers such as “15+ million” are cumulative, annual or hallucinatory. Yet from estimates in the Harvard Business Review, we can reasonably assume that only a third of MOOC enrollees are domestic.

If so, should one not ask, pointedly, whether a full data accounting by MOOCs, and by IT certifications in particular, should include populations outside the United States? Coursera offerings come in eight languages, and its partner universities are in at least a dozen countries, so whatever one sees of enrollments and completions (something of a zero) is hardly all domestic. Again, on Udemy’s website we find an estimate that two-thirds of its (self-estimated) 17 million students to date are outside the U.S.

Failure to deal with country-of-origin issues leaves us with no way of knowing how this factor clouds interpretation. Offering a geographic distribution in terms of courses offered, not students, as Class Central does, is not very illuminating either -- not if we want to know whether the MOOC and microcredential universe is shrinking gaps in U.S. higher education participation and completion.

For 2014, Canvas.net reported 214,997 enrollments (with age and gender breakouts based on a 20 percent response rate), but we have no idea how much of this was domestic. Not confronting this issue is the most serious data oversight of the American Academy of Arts and Sciences’ recent report on this territory.

As was the case in the universe of boot camps, the only quasi demographic that turns up consistently in the MOOC universe is the proportion of enrollees or credential recipients who had previously earned bachelor’s degrees. Not surprisingly, when that number is either offered or estimated, always in percentages, it’s very high: 64 percent (Udacity.com), 73 percent (www.edX.org), 78 percent (www.turingschool.com), 83 percent (www.startupinstitute.com). What that obviously means is that the alternative credential universe of boot camps and MOOCs (along with IT certifications) is not contributing as much to the national undergraduate completion agenda as some of its cheerleaders would have it.

In fact, it can be argued that since such a huge proportion of participants already hold degrees, these “pathways” are not “alternatives” to degrees, either. That, in turn, bears on the nonexistent race/ethnicity data, much to the sector’s chagrin. Let’s be blunt about the impact: Can you sell something to minority students from which they are largely excluded -- or in which they don’t even appear -- to begin with? And can you sell that same something to minority students if you cannot demonstrate that it is working for them? That takes real, hard data.

How Do We Get Better Data?

Everyone agrees that we’re missing a lot of information from the “alternative credential” universe, a universe that nonetheless claims millions of students. Presumably, we would want to account for these individuals in our national data portraits, if for no other reason than to document the full extent of postsecondary participation and completion, particularly for standard demographic groups that “traditional” higher education accounts for through IPEDS.

What do we do about it? First, we are not going to sweep these providers into the IPEDS system, because they are not Title IV institutions. No federal data collection can touch them. Nor will anyone get the longitudinal studies the AAAS report recommends: they require student-level data that no organization other than the National Center for Education Statistics and the U.S. Department of Labor can produce -- e.g., for grade-cohort studies such as ELS-02 or event-cohort studies such as Beginning Postsecondary Students -- and they would come with a price tag north of $80 million. There are far better ways to spend that kind of public money.

The NSC Solution

So, I propose using the nonfederal National Student Clearinghouse instead, and NSC is open to the idea. But -- and it’s a big but -- one cannot compel these organizations, ranging from small boot camps to the huge MOOC providers to corporations and associations granting certifications, to report anything (let alone the kind of data outlined below) to anyone. So what do we do about it?

First, we bring together a coalition of the major higher education organizations, chaired by the American Council on Education. They, in turn, write a joint public email/letter to the cognizant authorities of every noncollegiate boot camp, MOOC provider, industry certifying authority and mixed course provider, inviting them to submit annual enrollment and completion data to the National Student Clearinghouse and spelling out the explicit benefits of doing so.

The principal benefit for microcredential providers, as NSC’s Doug Shapiro points out, lies in participating in a nationally standardized system for student tracking, hence acquiring a mantle of credibility and recognition, along with a trusted partner. For higher education, through its representative national organizations, a new sector of microcredential providers would be recognized instead of hidden, feared, fantasized about or berated, and its contributions to national policy objectives made explicit. This is, after all, a large set of providers of postsecondary education that lies outside accreditation, Title IV and federal data systems. Everyone wants to see a full picture, and you are part of it, dear microcredential providers. NSC is the best route out of the shadows.

Shapiro also cites the potential benefit to microcredential providers of linking their awards to other labor market indicators in an established system -- instead of, I would add, relying on guesswork, on self-selected “alumni” samples or on counting “hello/goodbye” people as “alumni.” That, too, raises the question of the range of information these providers could add to national accounting, and that a letter from the higher education organizations might suggest.

I’d keep the data collection boundaries simple at the outset, until the organizations that join get their sea legs: domestic enrollments and completions by gender, race/ethnicity, age and prior level of education. If that means the alternative providers have to engage in some institutional research and hire the folks to do it, that’s what it takes to join the universe, receive due recognition, be included in national reporting and see one’s records open to students who otherwise would wonder where their credentials sit in the universe. The opening gambit of the higher education organizations should not frighten the microcredential providers away by asking for more.

Beyond NSC: A Second Critical Piece of Data Collection

A second data territory the American Academy of Arts and Sciences report unintentionally raises is that of credit connections between “alternative” providers and institutions of higher education. In the trade press and blogs we get anecdotes and a host of undocumented claims about such activities, but nowhere can anyone find a comprehensive account of who among postsecondary institutions provides credit, from what and for what.

Tracking the interactions between alternative sources and institutions of higher education requires aggregate numbers, by higher ed sector and “alternative” source, to yield statements such as: 46 community colleges gave additive credit for 291 apprenticeships; 173 community colleges gave additive credit for coding badges; 280 four-year colleges accepted completed MOOC courses for credit; and so on. Furthermore, as this is ultimately a student-level question, each credit-granting statement would be accompanied by the requisite demographics: gender, race/ethnicity, age and highest prior level of education.

This is not the territory of the National Student Clearinghouse; it would require a parallel survey, however specialized, of all accredited institutions of higher education that grant credit or credit equivalents. As in the NSC case, though, there is no nonfederal authority that can compel institutions to report anything. But there is no other way of documenting and detailing the formal connections between the “alternative” and traditional higher education sectors, and of getting a detailed handle on what we are/are not doing for minority students in this sphere.

Thus I take a leap of faith and call for a major higher education organization to step forward, gather others of similar weight and petition NCES for IPEDS to add some very simple data questions to its 2020 survey, questions on the order of “How many of your students were granted credit during the academic year under consideration for course work, or course work-equivalent completion, from each of the following sources: apprenticeships, industry certifications, boot camps, MOOCs?”

Do it once, in 2020, to see whether we get anything truly worth noting, anything beyond mythology. Do it once to see how well these credit grantings are distributed -- or not. And if institutions know now that these data will be requested in 2020, that gives them enough time to prepare.

I admit this is a hope, not an agreement. All we can do is advocate. Let that advocacy start here.
