The faculty of the Graduate School at Rutgers University in New Brunswick took a stand against Academic Analytics on Tuesday, resolving that administrators shouldn’t use proprietary information about faculty productivity in decisions about divvying up resources among departments, or those affecting the makeup of the faculty, graduate teaching assignments, fellowships and grant writing. They also demanded to view their personal data profiles by Sept. 1. The vote was 114 to 2.

The new resolution is similar to one passed by the faculty of the School of Arts and Sciences in December, in that it expresses concern about the accuracy of the Academic Analytics data and the implications for academic freedom. Rutgers signed a nearly $500,000 contract with the data-mining company in 2013, in exchange for information about the scholarly productivity of individual professors and academic units and how they compare to those at peer institutions. Yet some faculty members who have seen their personal profiles -- an opportunity most professors haven’t had -- say the data are in some cases wrong, under- or overcounting publications. Many faculty critics also say the data lack nuance or accounting for research quality and innovation, and could chill the scholarly inquiry of junior faculty members in particular as they seek to boost their “stats” ahead of applying for tenure.

“The entirely quantitative methods and variables employed by Academic Analytics -- a corporation intruding upon academic freedom, peer evaluation and shared governance -- hardly capture the range and quality of scholarly inquiry, while utterly ignoring the teaching, service and civic engagement that faculty perform,” the graduate faculty resolution says. “Taken on their own terms, the measures of books, articles, awards, grants and citations within the Academic Analytics database frequently undercount, overcount or otherwise misrepresent the achievements of individual scholars,” and those measures “have the potential to influence, redirect and narrow scholarship as administrators incite faculty and departments to compete for higher scores.”

The School of Arts and Sciences’ resolution also demanded that Academic Analytics not be used in promotion and tenure decisions.

Administrators at Rutgers and elsewhere who favor Academic Analytics, meanwhile, say the service is just one tool among many used to track scholarly productivity, and that more information is better information. Even staff members at Academic Analytics say their data shouldn’t ever replace traditional forms of peer evaluation, but rather supplement them with facts, figures and comparisons that institutions might otherwise attempt to gather on their own -- likely less accurately and at greater expense.

“Researchers at Academic Analytics care very much about higher education and we look at ourselves as providing a service,” said Tricia Stapleton, company spokesperson. “We help institutions understand themselves because many are very large, complex beings and it’s not always easy to gather this kind of information.”

Yet Rutgers seems to be acting on some faculty concerns. Richard L. Edwards, chancellor of Rutgers at New Brunswick, said in an email to faculty members last week that he planned on announcing within the next month a mechanism “for individual Rutgers faculty members to review their [Academic Analytics] files and to make corrections if errors are discovered.”

Edwards said he’ll establish a campuswide committee of faculty and administrators charged with monitoring and making recommendations about the program’s use on campus by fall.

Addressing concerns about the cost of the four-year contract, the chancellor said the annual expense is roughly equivalent to hiring a midlevel analyst. But one person “could not possibly provide the information that we get from Academic Analytics, with data from hundreds of universities and thousands of faculty members.”

David Hughes, a professor of anthropology at Rutgers and president of its American Association of University Professors- and American Federation of Teachers-affiliated faculty union, said that beyond promises of access to faculty data, Edwards’s message fell short. In particular, he said that Edwards had “mischaracterized” Academic Analytics as consistent with the Leiden Manifesto, a sort of gold standard for research metrics -- including “Keep data collection and analytical processes open, transparent and simple.”

“I suspect that when he praises [Academic Analytics] for its transparency, involvement of stakeholders and so on, he is referring to its transparency to and involvement of client administrators -- not their faculties,” Hughes said.

It was only after some effort that Hughes was able to view his profile earlier this year; he said he’d been credited for three journal articles in a given period when he’d written only one, and undercredited for other kinds of publications. Beyond issues of transparency and basic accuracy, Hughes said he also wondered how an anthropologist who made a movie instead of publishing an article would be credited -- if at all. Other professors have expressed concern about how Academic Analytics measures interdisciplinary research and credits co-investigators on grants.

Academic Analytics says its approach to interdisciplinary research is among the most generous, but co-principal investigators are not currently included in the default methodology; institutions must order a custom report that includes them.

Zach Hosseini, a Rutgers spokesperson, said via email that the university “is always looking for new ways to add to the many tools we use to measure our productivity and progress. Academic Analytics is the only tool we found that provides major national research universities like ours the chance to do an ‘apples to apples’ comparison of its programs to its peers. We intensively reviewed all products that could allow us to do this needed comparison and determined that the other tools couldn’t provide the accuracy and scope in data mining we required.”

But data comparisons are just part of how Rutgers assesses faculty productivity and institutional progress, Hosseini said. “We take a holistic view of both, looking at community service endeavors, the impact of our scholarly writing and the quality and quantity of the grants we receive.”

Hosseini said Rutgers saw no issue with the tool’s accuracy, and that it’s “very clear on what it measures and what it doesn’t. The university values the criteria it measures and understands that not every citation or grant in the academic universe will appear in the tool’s reports.”

Part of Push Toward ‘Academic Intelligence’?

Academic Analytics, based in New York, was founded by Lawrence Martin, former dean of the Graduate School at the State University of New York at Stony Brook, and Anthony Olejniczak, a fellow anthropologist at Stony Brook. Their premise was that colleges and universities needed a more dynamic set of data, updated annually, than the National Research Council’s periodic rankings of graduate programs provide. Initial institutional reports were released in 2005, with academic unit-level data. The main intention was for clients to be able to compare output on their campuses with that at peer institutions. But over time, staff members said, the service yielded to marketplace demands and began releasing professor-level productivity data. Yet academic unit-level external reviews are still the No. 1 driver of data requests. The company has about 90 Ph.D.-granting institutions as clients and a database of some 380 universities.

Regarding concerns about inaccuracy, Olejniczak said in an interview that Academic Analytics’s data tend to be more accurate and comprehensive than those found in other productivity indexes. That’s because the company matches the names of professors from client personnel rosters against its own databases of publications, research funding by federal agencies, citations, conference proceedings and honorific awards, he said. So a chemist who published in a medical journal would get credit even though she hadn’t published in a chemistry journal, for example.

And because these databases are now so vast -- including up to 30,000 journals -- it’s rare that any peer-reviewed publication, even an interdisciplinary or foreign-language one, goes uncounted in some way, he said. Any discrepancy is usually about how something was counted, not whether it was counted.

Olejniczak said he understood that Academic Analytics had a reputation for opacity or being a “black box,” and said he felt that the “community” aspect of the operation is often neglected by critics. Institutions are welcome to suggest new publications to add to the algorithm, and the enterprise benefits as a result. At the same time, he acknowledged that contracts are negotiated with administrators, not professors, and that the company’s main point of contact at any institution always resides somewhere in the central administration.

Asked whether Academic Analytics was philosophically opposed to being 100 percent open to faculty members, Olejniczak said no, but that the company had to balance transparency with commercial viability. That is, some of the information must remain proprietary.

Additionally, Rutgers’s contract with Academic Analytics, obtained by the union through an open records request, says that the portal may be accessed only by those who hold “a position that involves [strategic] decision making and evaluation of productivity,” as approved by the company. The contract also limits what data may be distributed or shared.

Overall, Olejniczak said he thought that faculty opposition to Academic Analytics stemmed from basic discomfort with being measured.

“There was always inevitably going to be some pushback,” he said. “But I think this is a natural sort of evolution as academic intelligence becomes de rigueur in the U.S.”

Deepa Kumar, vice president of the Rutgers faculty union and a professor of communications, said she remained unswayed by such arguments. She said Tuesday’s meeting, at which a number of faculty members across the arts and sciences spoke out against Academic Analytics, was the start of a greater resistance against the program. Indeed, the national AAUP recently released a statement urging caution against the adoption of Academic Analytics and similar services.

The AAUP statement noted a 2015 report from the Higher Education Funding Council for England, where use of research metrics is now required at public institutions, that found “considerable skepticism among researchers, universities, representative bodies and learned societies about the broader use of metrics in research assessment and management.” Data points can be misused or “gamed,” the study says, and as underlying algorithms remain fragmented, “it is not currently feasible to assess research outputs or impacts … using quantitative indicators alone.”

Kumar said the next step -- likely next year -- is to campaign not just for limited application of Academic Analytics or access to data, but for an end to its use on campus.

“Using this data to make decisions about allocation of resources to departments and schools is something that has serious consequences,” she said. “This is simply not one extra form of measurement.”
