What if we ranked universities not by inputs but by outputs?

Not by admissions selectivity or even by their contribution to social mobility, but rather by their impact on the growth of knowledge and on technological and scientific advancement?

A recent piece in Forbes echoes an argument that I made several months ago: that the pre-eminence of many of the most highly regarded American universities is fading.

As the Forbes contributor, Matt Symonds, points out, Ivy League admissions may never have been more selective, but their place in the global rankings tells a very different story. There is only one U.S. Ivy in the top five (Harvard) and only three others in the top 10 (Columbia, Princeton and Yale). Brown, despite having this country’s oldest applied mathematics program and the Ivy League’s oldest engineering program, isn’t in the top 30, and Dartmouth isn’t among the top 50 U.S. and Canadian universities.

So which U.S. universities do stand out in terms of research citations and other measures of research productivity? Caltech, MIT, UC Berkeley and the University of Chicago. Then there are the U.S. and Canadian institutions that are rapidly climbing up the rankings: Toronto and the University of Alberta, Emory, Southern Cal and Vanderbilt—not the campuses that automatically come to mind.

The most innovative universities have a distinctive ethos, combined with a rather remarkable humility, that isn’t found, I fear, at the Ivies. That ethos, perhaps most obvious at MIT, involves relentless, disruptive innovation. There’s always a sense of wanting to do more, be better and keep on exploring. Contributing to this ethos is an engineering, hands-on mind-set. But it also stems from not being an “Ivy”: MIT was, historically, a start-up, with its roots in a much more “blue-collar” mentality.

One reason there is so much horror whenever MIT becomes embroiled in the culture wars—see an example in a recent Wall Street Journal article—is that alumni and others are terrified that this will slow MIT down or change its driving culture. So far, of course, those fears have proven unfounded: MIT’s glory days, the past 20 years or so, have coincided with its becoming far more diverse in every way.

I should add that neither U.S. News nor the educated public has much understanding of what a first-tier, cutting-edge English or history or sociology department would look like. Would it be judged on scholarship, influence or some other criteria? It could be that the traditional fields no longer hold sway, or carry as much value in the eyes of the public as they once did, in large part because they are no longer part of the currency of everyday life.

Another recent article, by the Stony Brook finance professor and prolific economics blogger Noah Smith, looks at the leaders in artificial intelligence research, which seems likely to be the cutting-edge technology of the future. At the top of the list is Google, which is followed by Stanford, Carnegie Mellon and MIT, then, following Microsoft, Berkeley, Columbia, Oxford and Tsinghua. Cornell is 11th and Princeton is 13th. Public universities, including Texas, UCLA, Illinois, Georgia Tech and Washington, outstrip the other privates.

Smith’s essay raises troubling questions about whether Google, despite its dominance of AI research, is treating this technology with the urgency it requires. In contrast, OpenAI and its partner Microsoft are striving to transform AI into a viable business, while Google seems to be treating this more like one of its “fun side project[s]”—like Google Books and Ngrams.

All this raises a question: Have our premier institutions become complacent, smug and self-satisfied? Have their audacity and boldness faded, along with the impetus to shatter existing paradigms?

It’s noteworthy, I think, that while U.S. college rankings measure selectivity or reputation or resources or contributions to social mobility, these tools don’t assess program quality or productivity or contributions to breakthrough knowledge.

You may have read a recent article in Nature that claimed that “disruptive science” has declined—that “the proportion of publications that send a field in a new direction has plummeted over the past half-century.” At a moment when new mRNA vaccines, electric cars and text and image generators proliferate, such an argument may seem grossly overstated.

But the Nature article is not alone in claiming that disruptive innovation is in retreat. There are also signs that entrepreneurship and mobility have faded, too.

More broadly, I do worry that certain kinds of breakthrough research may be stagnating. In my own field, the number of pathbreaking, paradigm-shattering U.S. history books appears to have declined precipitously, and fewer new fields of study seem to be opening up. It was a journalist, Nikole Hannah-Jones, not an academic historian, who provoked the biggest debate in my field in recent years. When I think of the books that busted conventional thinking about the history of race in the United States, many that come to mind are by nonacademics, like Isabel Wilkerson’s Caste.

Somewhat similarly, would anyone argue that literary studies has produced as many trailblazing works as it did in the 1970s, or that the field is as dynamic and disruptive as it was then?

Great scholarship continues to appear, but conceptual, methodological, analytical and theoretical breakthroughs seem to have faded—even though exciting opportunities for quantum leaps, for example, in comparative studies, abound.

One might attribute this quiescence in the humanities to an aging professoriate or to shrinking doctoral programs and departments and increasing reliance on adjuncts. Maybe we have a publicity problem, as avant-garde and cutting-edge scholarship fails to get the exposure it deserves. Or, more likely, the humanities fields have gravitated toward what Thomas Kuhn termed “normal science”: scholarship that takes place within a settled paradigm.

I fear that scholarly torpor and a repetition compulsion beset the fields I am most familiar with. After all, the key interpretative and explanatory frameworks, including deconstruction, the new historicism, the new social and cultural history and even critical race theory, are now roughly a half-century old. Much of what strikes journalists as novel and trendsetting is in fact old hat to anyone who has followed the field closely since the early 1970s.

Previous analytical paradigms didn’t last nearly so long. In U.S. history, the early-20th-century Progressive school of historical interpretation, with its stress on class and sectional conflict and the purported struggle between reform and reaction, democracy and special privilege, and agrarianism and capitalism, lasted a couple of decades, with a brief resurgence during the Great Depression. Consensus historiography, rooted in the idea that this country’s political, ideological and economic battles existed within a rather narrow liberal, individualistic and capitalist ideological spectrum, lasted for little more than a decade and a half. As for New Left history, with its emphasis on racism, imperialism, poverty and the underclasses, this school rose and fell over the course of little more than 10 years.

Not so for the schools of interpretation that arose in the 1970s: feminist, neo-Marxist, poststructuralist, postcolonial and postmodern. The cultural turn and the emphasis on discourse linger; indeed, I’d say, they still predominate.

Two decades ago, Jerome McGann, a University of Virginia authority on literary and cultural history, published an essay in Critical Inquiry that spoke of a malaise in humanities scholarship that had persisted for more than a decade, driven, in part, by the pressure on junior scholars to publish quickly, frequently and within recognizable formats. If that was true in 2004, it’s even more true today, as the competition for jobs has intensified exponentially.

McGann also expressed concern about the failure to train junior scholars in the skills that 21st-century research and scholarship and presentation formats demand—and that might drive inquiry and analysis into fresh directions. True then, this is even more obviously the case today. In my field, few doctoral students receive any training in data analysis, demography, family reconstitution, econometrics, or even the latest digital technologies. Let’s not impose blinkers and blinders on future scholars.

Am I wrong to worry that the humanities are treading water? Is the humanities’ capacity for disruptive innovation exhausted? Are its fields in the midst of an existential crisis, like that described in F. Scott Fitzgerald’s poignant words, with all gods dead and all wars fought?

I certainly hope that isn’t the case. But I challenge the new generation of humanities scholars to prove me wrong.

Steven Mintz is professor of history at the University of Texas at Austin.
