Science is an inherently social enterprise. Progress occurs only when new results are communicated to, and accepted by, the relevant scientific communities. The major lines of communication run through professional journals and the double-blind peer review process. Journals are also a main currency of scholarly success: publication in a top journal can be a make-or-break career moment for a researcher.

Because of their central role in academic communication and career advancement, journals help set the rules for how research is evaluated and rewarded. At the American Journal of Political Science (AJPS), we work closely with our partners at the Odum Institute for Research in Social Science at the University of North Carolina at Chapel Hill and the Qualitative Data Repository at Syracuse University to promote reproducibility and transparency as cornerstones of high-quality research.

While the political science discipline has long paid lip service to the importance of these issues, the AJPS’s Replication and Verification Policy requires scholars to “practice what we preach” by incorporating reproducibility and data sharing into the academic publication process.[1]

Our goal is to establish a standard for the information that must be made available about the research that appears in our journal. By requiring scholars to provide access to their data and conducting our own replications on those data, we confirm the rigor of, and promote public confidence in, the studies we publish. As one of the top journals in the discipline, we hope to create state-of-the-art standards that others in the field will aim to adopt.[2]

Below are some of our experiences so far and lessons for journals interested in promoting the reproducibility and transparency of the work they publish.

Opening Up Data

In 2012, the AJPS became the first political science journal (to our knowledge) to require authors to make the empirical data underlying their analyses openly accessible online. By making data sharing a requirement for publication, we can guarantee that replication materials for all AJPS articles will be available to the research community in a single, easily accessible location: the AJPS Dataverse, an online data repository.[3]

While the creation of the AJPS Dataverse was a critical first step, it became clear that scholars needed more guidance and support to ensure replication files were reliable and consistent. In 2015, we released guidelines on what information authors are required to share about their quantitative research:

  • the dataset analyzed in the paper;
  • detailed, clear code for using this dataset to reproduce all tables, figures, and exhibits in the paper;
  • documentation, including a README file and codebook, that provides contextual details about the dataset and replication materials;
  • information about the source of the dataset; and
  • instructions for extracting the analysis dataset from the original source data (e.g., recodes, data transformations, and handling of missing observations).

With these materials, any researcher should be able to reproduce the empirical results published in any AJPS article.
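
To make these requirements concrete, below is a minimal sketch of what a compliant replication script might look like, written here in Python. Everything in it is hypothetical: the file names, variables, and recodes are invented for illustration, and actual AJPS submissions use whatever statistical software the authors worked in.

```python
# replicate.py -- a hypothetical, minimal replication script.
# All file names, variables, and recodes are illustrative; they are
# not drawn from any actual AJPS submission.
import numpy as np
import pandas as pd

# Step 1: extract the analysis dataset from the original source data
# (recodes, transformations, and handling of missing observations).
raw = pd.read_csv("source_data.csv")
raw["income"] = raw["income"].replace(-99, np.nan)    # recode missing values
raw["turnout"] = (raw["voted"] == "yes").astype(int)  # construct the outcome
analysis = raw.dropna(subset=["income", "turnout"]).copy()
analysis.to_csv("analysis_data.csv", index=False)     # the dataset to share

# Step 2: reproduce each exhibit in the paper from the analysis dataset.
# Here, a hypothetical Table 1 reports turnout rates by income tercile.
analysis["income_tercile"] = pd.qcut(
    analysis["income"], 3, labels=["low", "mid", "high"]
)
table1 = analysis.groupby("income_tercile", observed=True)["turnout"].mean()
print("Table 1: mean turnout by income tercile")
print(table1.round(2))
```

Paired with a README describing the source data and a codebook defining each variable, a script in this spirit lets a third party run the entire pipeline from raw data to published exhibit.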

In 2016, we extended the guidelines to cover replication materials from qualitative and multi-method research.[4] Our transparency policy is not “one size fits all,” and it respects the diversity of data formats by providing different requirements for different kinds of qualitative data. Yet the ultimate goal of our guidelines is the same regardless of the type of data used and how they are shared: to clarify how the information underlying an article was obtained and how it was used to generate the findings and conclusions reported in the article.

Ensuring Accuracy

At the AJPS, we have used our influence as a key actor in the academic publishing landscape to make research data more open and accessible. However, we believe that part of our responsibility as a leading scientific journal goes beyond simply requiring scholars to make their data available; it requires us to ensure the accuracy of the results we publish.

Therefore, we verify the replication materials that authors provide to guarantee that they do, in fact, reproduce the results described in the corresponding AJPS article.

We believe that verifying results is a central part of our commitment to publishing the highest quality research in political science. We also maintain that, in the interest of objectivity, it is best to have a third party carry out the verifications. We have partnered with the Odum Institute for Research in Social Science to verify quantitative analyses and the Qualitative Data Repository (QDR) to verify qualitative analyses.

Acceptance of a manuscript for publication in the AJPS is contingent on successful replication of any empirical analyses reported in the article.[5] After the author submits his or her final article draft and uploads the replication files to the Dataverse, staff from the Odum Institute or the QDR curate the replication materials to make sure they can be preserved, understood, and used by other researchers.

Staff then carry out all of the analyses using the computer code, instructions, and datasets in the replication files, and compare their results to the contents of the article. If problems occur, the author is given an opportunity to resolve the issues and upload corrected materials to be re-checked.

This process continues until the replication is carried out successfully. At that point the manuscript receives its final acceptance for publication and the replication files are made publicly available on the AJPS Dataverse.
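
As a rough illustration of what this check involves, the sketch below re-runs an author’s replication script and compares a few recomputed numbers against those printed in the manuscript. This is our description of the workflow rendered in code, not the Odum Institute’s or QDR’s actual tooling; the script name, the reported values, and the rounding tolerance are all assumptions made for the example.

```python
# verify.py -- a hypothetical sketch of a third-party verification pass,
# not the Odum Institute's or QDR's actual tooling.
import subprocess
import sys

# Values the verifier reads off the manuscript's tables (hypothetical).
reported = {"turnout_low": 0.41, "turnout_mid": 0.55, "turnout_high": 0.68}
TOLERANCE = 0.005  # allow for rounding in the published tables

# Re-run the author's replication code exactly as documented.
result = subprocess.run(
    [sys.executable, "replicate.py"], capture_output=True, text=True
)
if result.returncode != 0:
    raise SystemExit(f"Replication code failed to run:\n{result.stderr}")

# In practice the verifier extracts the recomputed numbers from the
# script's output; they are hard-coded here to keep the sketch short.
recomputed = {"turnout_low": 0.412, "turnout_mid": 0.548, "turnout_high": 0.681}

mismatches = {
    key: (reported[key], recomputed[key])
    for key in reported
    if abs(reported[key] - recomputed[key]) > TOLERANCE
}
if mismatches:
    # The author corrects and resubmits the materials for re-checking.
    print("Verification failed; discrepancies:", mismatches)
else:
    # The manuscript can now receive its final acceptance.
    print("Verification passed.")
```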

To publicize our efforts and provide some recognition to authors who comply with our policy, we have adopted two of the “Badges to Acknowledge Open Practices” from the Center for Open Science. These badges are designed to reward research that conforms to best practices for scientific openness and transparency.

Any manuscript that has successfully completed the AJPS replication and verification process automatically meets the criteria for the Open Data and Open Materials badges, which we include on the article itself and on the dataset in the AJPS Dataverse.

The Benefits & Costs of Verification

Since the launch of the AJPS Dataverse in 2012, 268 datasets have been uploaded. We began verifying replication materials in 2015, and 95 manuscripts have been successfully verified so far, all involving quantitative research.[6]

Scholars have been highly receptive to our new standards and procedures, despite the additional time they require. We have received complete cooperation, and often enthusiastic support, from our authors. Requests for exemptions have been based on practical considerations, not philosophical objections to our goals.[7]

There are, however, costs associated with the replication and verification process. It adds a median of 53 days to the publication workflow, typically involving one or more rounds of feedback and resubmission of the replication materials.

Much of that turnaround time consists of authors revising their materials: the median time from submission of the replication materials to the initial verification report from the Odum Institute is 20 days. So far, three-quarters of the studies have completed the entire verification process in three months or less.

The verification process is also relatively labor-intensive. On average, it takes 8 person-hours per manuscript to replicate the analyses and curate the materials for public release. The financial cost of the verification process is not trivial, and is covered by the Midwest Political Science Association, the professional organization that owns the AJPS.

The vast majority of authors have to correct and resubmit their replication materials at least once (the mean number of resubmissions is 1.7). Most of the resubmissions involve incomplete replication materials; requests for additional information, such as more detail in codebooks; or minor inconsistencies between the replication results and the manuscript. In virtually all cases, authors have been able to make the necessary corrections and adjustments relatively easily, without requiring major changes to their manuscripts.[8]

Thus far, our verification process has not revealed many major research flaws, but rather has served as a mechanism for ensuring that all the materials necessary for replication are in place and can be executed successfully by a third party. In our experience, active verification of the replication files is necessary in order to guarantee the validity of the final materials and ensure they are ready for review and reuse by others. Most of the problems our process has identified would not be found if we merely checked for the presence of the required replication files without re-running the analyses.

Changing the Conversation

As one of the most prestigious general journals in political science and the broader social sciences, the AJPS has the opportunity to shift the standards for what constitutes high-quality research and for what role journals should play in encouraging reproducibility and transparency.

The Replication and Verification Policy is one of the features that sets the AJPS apart from other publications and provides tangible evidence of the high quality of the work we publish. We, quite literally, open up the research process to full scrutiny from the scientific community. 

We are hopeful that our Replication and Verification Policy will serve as a model for best practices that other journals can adopt. At the same time, we recognize that adapting a journal’s workflow in this way is an expensive proposition. The AJPS is fortunate to be part of an organization with the resources to support this policy, but financial realities may put it out of reach for other journals.

Nevertheless, financial barriers do not mean that journals must abandon the same general principles of transparency and reproducibility entirely. The Center for Open Science’s Transparency and Openness Promotion (TOP) Guidelines outline graduated levels of replication policy, ranging from simple disclosure of data, through data sharing, to full verification.

The TOP guidelines can help other journals adopt the particular mix of data access and research transparency policies that are most appropriate for their own needs and resources, and can be adjusted over time as journals progress toward greater transparency and more stringent standards.[9]

The ideas motivating the AJPS policy embody a central tenet of the scientific method: that results can be tested and reproduced by others. The policy not only guarantees the quality and transparency of the studies we publish but also provides an invaluable resource for teaching and further research. We believe that the benefits for the scientific community outweigh any costs to authors, the editor, or the publisher.

Replication and verification policies promote the integrity of scientific research and, as such, should be made a routine part of the academic publication process. If more journals were to adopt similar policies, we could make real progress toward solving many of science’s current challenges around reproducibility, transparency, and openness.

William G. Jacoby is editor of the American Journal of Political Science and professor of political science at Michigan State University.

Sophia Lafferty-Hess works as a research data management consultant at Duke University, providing instruction and guidance on data management strategies, tools, policies, and best practices. 

Thu-Mai Christian is the assistant director for archives at the Odum Institute for Research in Social Science at the University of North Carolina at Chapel Hill.

[1] While reproducibility and replicability are often used interchangeably, the National Science Foundation defines them differently: Replication is using the same processes and methodology with new data to produce similar results, while reproducibility is using the same processes and methodology on the same dataset to produce identical results. We use the term “replication” frequently in this article because it is the term used in the AJPS’s official policy statement, website, and other communications materials, although we will soon be adjusting our nomenclature to be consistent with the NSF’s definitions.

[2] These policies have been established with the full support of the Midwest Political Science Association, our sponsoring organization.

[3] Authors can apply for an exemption from AJPS’s data-sharing requirements if there are restrictions on making their data public due to contractual arrangements, government regulations, or human subject protections. They are still required to upload other replication materials, such as computer code, to the Dataverse, as well as information on the procedures that interested researchers would need to follow to gain access to the restricted data. Authors cannot receive exemptions from providing replication files in order to “embargo” the data for their own future use. However, they are only required to provide the specific variables and observations that were used in the analysis for their article. Any other variables or observations contained in the source data do not have to be shared.

[4] Works in normative and formal theory are generally exempt from our replication policy because they do not analyze empirical data.

[5] Authors exempted from the AJPS’s data-sharing requirements (see note 3 above) are still strongly encouraged to provide their full datasets to us for replication. In this way, we can assure AJPS readers that the analyses reported in an article can be reproduced, even if the data necessary for doing so cannot be made publicly available.

[6] These are the data as of May 7, 2017. Sixteen manuscripts are currently awaiting final verification. Since 2015, 14 manuscripts did not go through the replication process because they did not contain any empirical data.

[7] Of course, it is possible that there is a selection effect wherein scholars who object to the replication and verification policy do not submit their work to the AJPS.

[8] There has only been one case in which the failure to replicate results may lead to rescinding the acceptance of a manuscript. That verification process is currently in progress.

[9] Over the years, the AJPS has progressed through almost the full range of TOP options, from first encouraging voluntary data sharing, to requiring public access, and eventually verifying the data and code. We expect that other journals – particularly those that strive to be at the forefront of scientific research – will make similar progress over time across the range of openness and transparency options.

This article is part of a series on how scholars are addressing the “reproducibility crisis” by making research more transparent and rigorous. The series was produced by Footnote, an online media company that amplifies the impact of academic research and ideas by showcasing them to a broader audience, and Stephanie Wykstra, Ph.D., a freelance writer and consultant whose work focuses on data sharing and reproducibility. It was supported by the Laura and John Arnold Foundation.