Prioritizing data sharing to support reproducibility and replicability: excerpt from new Research Integrity Toolkit

Within and outside academia, research reproducibility and replicability have become hot-button issues. High-profile failures in recent years to duplicate published research conclusions have sparked increased scrutiny of the data and methods underpinning scholarly reports and spotlighted the uncomfortable truth that we are in the midst of an ongoing reproducibility and replication crisis. This concerning reality spans research disciplines and affects all stakeholders.

What’s needed to change course, and how can peer review be part of the answer? 

This excerpt from Scholastica and Research Square’s new “Research Integrity Toolkit” blog series discusses data transparency as part of the solution and steps journals can take to promote more widespread data sharing.

Reproducibility, replicability, and the role of data transparency

While reproducibility and replicability are closely linked, it’s worth acknowledging the nuances in their definitions. Reproducibility generally refers to the ability to produce the same findings as a published report using the same methods, whereas replicability refers to the ability to reach the same findings as a published report using different methods.

Non-transparent reporting is one of the primary reasons it can be so difficult to reproduce and replicate published research findings. Perhaps the most essential way for journals to increase the reproducibility and replicability of the material they publish is to take proactive measures to maximize data availability and transparency. Doing so can have tangible and far-reaching benefits, including the potential for increased article citability when all community members can freely assess the quality of the underlying data and methodology. Moreover, data sharing can boost data reuse, expanding the scope for generating new insights.

Prioritizing data sharing and transparency

Establishing solid data-sharing policies in line with the needs of journal contributors and readers is a vital first step toward promoting data transparency. These policies can range from asking researchers to agree to share their source data and methods openly upon request to requiring them to submit those resources as a standard part of journal publishing agreements.

Journals can also champion open and comprehensive sharing not just of the raw data underlying research studies but also of the details of the procedures and equipment used to generate them, the data-collection methods employed, the statistical techniques applied in analysis, and the peer-review process. PLOS is a helpful case study in incorporating open methods alongside research findings: the publisher offers scholars four submission options, namely traditional research articles, Registered Reports, study protocols, and lab protocols.

There are also various data and methods transparency initiatives developed in recent years that journals may want to consider adopting, including:

  • FAIR Data Principles: Many publishers are beginning to implement FAIR, a set of guiding “principles for scientific data management and stewardship” first published in Scientific Data in 2016. The principles are widely promoted by GO FAIR, a “stakeholder-driven and self-governed initiative that aims to implement the FAIR data principles.” FAIR stands for making data Findable, Accessible, Interoperable, and Reusable. For further discussion of strategies to achieve this, check out Scholastica’s blog post “3 Ways scholarly journals can promote FAIR data.”
  • Open Science Badges: The Center for Open Science (COS) also features an Open Science Badge scheme journals can adopt to encourage authors to share their research data and methods. There are badges to acknowledge Registered Reports and reports with open data and/or materials. According to COS, “Implementing badges is associated with [an] increasing rate of data sharing (Kidwell et al, 2016), as seeing colleagues practice open science signals that new community norms have arrived.”
  • TOP Guidelines: Another initiative from COS to improve the robustness of research reporting is the Transparency and Openness Promotion (TOP) guidelines, which various publishers now endorse, including the American Psychological Association (APA). The TOP guidelines comprise eight transparent reporting standard areas (e.g., citation standards and data transparency) with three possible compliance levels increasing in stringency. Accompanying the TOP guidelines is TOP Factor, a metric that reports the steps a journal is taking to implement open science practices. COS aims for TOP Factor to be considered alongside other publication reputation and impact indicators, such as the Journal Impact Factor (JIF).

Putting it all together

There are myriad reasons why it’s critical for research to be reproducible and replicable, particularly for publishers, editors, authors, and readers of scholarly journals. Of course, the most fundamental is to avoid the dissemination and perpetuation of misleading results produced either through unintentional errors or, in some cases, by intentional dishonesty. Others include increasing confidence in research, encouraging collaboration, optimizing time and resources, facilitating diversity and inclusion, and accelerating discovery. You can read a collection of articles on these topics in the Harvard Data Science Review (HDSR) Reproducibility and Replicability special issue.

The jury is still out on how well the community has tackled these issues so far, but it's clear there's work ahead to ensure the integrity of research data and methods and foster greater trust in scholarship.

As noted, this blog post is an excerpt from Scholastica and Research Square’s “Research Integrity Toolkit” series in honor of Peer Review Week. You can read the full post on steps journals can take to promote reproducibility and replicability on Scholastica’s blog here. We’ve also released a post on associated best practices for submitting authors on the Research Square blog here.


Attribution: This post was written by Victoria Kitchener, freelance scholarly publishing writer, and edited by Danielle Padula, head of marketing and community development at Scholastica, to create this excerpt.

About the authors:

Victoria Kitchener

Victoria Kitchener is a freelance science editor and writer who frequently contributes to the Scholastica blog. Prior to starting a freelance career, she spent 20 years working as a science editor and writer for Springer Nature Group.

Danielle Padula

Danielle heads up marketing and community development at Scholastica, a scholarly publishing technology provider with peer review, production, and OA journal hosting solutions. Prior to joining Scholastica, Danielle worked in academic book publishing. She enjoys creating resources to help publishers navigate the evolving landscape and is excited to be co-chairing PRW 2022.
