Accountants aren’t known for taking risks. So a new experiment from the Journal of Accounting Research stands out: an upcoming conference issue will include only papers that were accepted before the authors knew what their results would be. That’s very different from the traditional academic publication process, in which papers are published -- or not -- based largely on their results.

The new approach, known as “registered reports,” has developed a following in the sciences in light of the so-called reproducibility crisis. But JAR is the first accounting journal to try it.

At the same time, The Review of Financial Studies is breaking similar ground in business.

“This is what good accountants do -- we make reports trusted and worthy of that trust,” said Robert Bloomfield, Nicholas H. Noyes Professor of Management at Cornell University and guest editor of JAR’s registered reports-based issue.

Beyond registered reports, JAR will publish a paper -- led by Bloomfield -- about the process. The article’s name, “No System Is Perfect: Understanding How Registration-Based Editorial Processes Affect Reproducibility and Investment in Research Quality,” gives away its central finding: that registered reports have their virtues but aren’t a panacea for research-quality issues.

“Registration is a different system that has its benefits, but one of the costs,” Bloomfield said, “is that the quality of the research article does improve with what we call follow-up investment -- or all the stuff people do after they’ve seen their results.”

In the life sciences and some social science fields, concerns about the reproducibility of results have yielded calls for increased data transparency, along with calls to rethink the editorial practices and academic incentives that might encourage questionable research practices. QRPs, as such practices are known, include nudging P values under the arguably arbitrary “P<0.05” threshold that suggests statistical significance and consigning results that don’t support a flashy hypothesis to the trash (the “file drawer effect”).

Some of those calls have yielded results. The American Journal of Political Science, for example, has a Replication & Verification Policy incorporating reproducibility and data sharing into the academic publication process. Science established Transparency and Openness Promotion guidelines regarding data availability and more, to which hundreds of journals have signed on. And the Center for Open Science continues to do important work in this area. Some 91 journals use the registered reports publishing format either as a regular submission option or as part of a single special issue, according to information from the center. Other journals offer some features of the format.

Bloomfield said he’d been following such developments for years and talked to pre-registration proponents in the sciences before launching his project at JAR, where he is a member of the editorial board. To begin, he put out a call for papers explaining the registration-based editorial process, or REP. Rather than submitting finished articles, authors submitted proposals to gather and analyze data. Eight of the most well-designed proposals asking important questions were accepted out of 71 total and guaranteed publication -- regardless of whether the results supported their hypotheses -- as long as the authors followed their plans.

Bloomfield and his co-authors also held a conference on the process and surveyed authors who had published both registered papers and traditional papers. They found that the registered-paper authors significantly increased their up-front “investment” in planning, data gathering and analysis, such as by proposing challenging experimental settings and bigger data sets. Yet, as Bloomfield pointed out, registration tended to reduce follow-up work on data once results were known. That is, a lot of potentially valuable data that would have been explored further in a traditional paper may have been left on the table here.

In all, the editorial process shift makes individual results more reproducible, the paper says, but leaves articles “less thorough and refined.” Bloomfield and his co-authors suggest that pre-registration could be improved by encouraging certain forms of follow-up investment in papers without risking “overstatement” of significance.

Feedback from individual authors is instructive.

“The stakes of the proposal process motivated a greater degree of front-end collaboration for the author team,” wrote one conference participant whose registered paper was accepted by JAR. “The public nature made us more comfortable presenting a widely-attended proposal workshop. Finally, the proposal submission process provided valuable referee feedback. Collectively, this created a very tight theoretical design. In short, the challenges motivated idealized behavior.”

Asked how pre-registration compares to traditional publication, the participant described “a greater degree of struggle to concisely communicate our final study.” Pilot testing everything but the main theory would have been a good idea, in retrospect, the respondent said, since “in our effort to follow the registered report process, I now believe we were overly conservative.”

Bloomfield also asked respondents how researchers choose which measures and analyses to report and highlight, and what effect that discretion has on traditional published research. Over all, participants said this kind of “discretion” was a good thing, in that it was exercised to make research more readable and coherent. But some suggested the pressure to publish was also at work.

“This is a huge problem,” said one respondent. “What does it give the co-author team to provide no-results tests, for example, in the publishing process?” Another said, “Only significant results tend to get published. Potentially meaningful non-results may be overlooked.” Similarly, one participant said, “I find it amazing how just about every study in the top tier has like a 100 percent hypothesis support rate -- not healthy.” Yet another said that “experiments are costly. I think people use this discretion to get something publishable from all of the time and effort that goes into an experiment.”

Bloomfield’s paper poses but doesn’t answer certain logistical questions about what might happen if pre-registration spreads further. Should editors, for example, be more willing to publish short papers that flesh out results left on the table under REP? What about replications of papers whose reproducibility was potentially undermined by traditional publishing? And how should authors be “credited” for publishing under REP, such as when their carefully designed studies don’t lead to positive results?

Over all, the paper says, editors could improve both the registered and traditional editorial processes by identifying studies that are “better suited to each process, allowing slightly more discretion under REP and slightly less under [the traditional process], clarifying standards under REP, and demanding more transparency" in traditional processes.

The Review of Financial Studies has organized two upcoming issues to include registered reports on certain themes: financial technology in 2018 and climate finance in 2019. Financial technology authors will present at Cornell next month.

Andrew Karolyi, associate dean for academic affairs at Cornell’s Samuel Curtis Johnson Graduate School of Management and the journal’s executive editor, has described the registration process as one that transfers academic risk from the researcher to the journal.

Asked if he thought registration would gain a foothold in business, Karolyi said via email that other journals in his field are following RFS’s experiments.

“There is more work curating these initiatives, but I had a great passion for it so I think less about the work than the outcome,” he said. “I want to believe I and my editorial team did our homework and that we designed the experiments well. Time will tell, of course.”

Philip G. Berger, Wallman Family Professor of Accounting at the University of Chicago and an editor of JAR, said that the journal is considering allowing registered report submissions in the future. JAR is thinking about how the approach might be modified to best suit the field, however, he said, calling it a “two-sided coin.”  
 
In addition to avoiding overstating inferences, Berger said, pre-registration in its purest form is so “inflexible” that it “seems to prevent valid inferences that could arise from a flexible approach to the research question.”  
