
Peer Review in Service of Open Science


When we think about Open Science, sharing a diverse array of research products, such as data, code, methods, and reagents, immediately comes to mind. More fundamentally, Open Science is about the entire process of conducting and communicating science according to long-established norms. Openness is at the heart of the scientific enterprise: scientists adopt open practices to allow collaboration and critical scrutiny, so that knowledge can be validated and built upon for the common good. The publishing process should be a central element in ensuring that these norms are maintained.

We’ve considered how our submission and peer-review processes at PLOS can be improved to facilitate Open Science: we have made several changes over the past year and a half, and we continue to explore new possibilities.


BUILDING BLOCKS OF AN OPEN PROCESS


Start with a preprint!

A year and a half ago, we introduced a new step in the submission process: we ask authors whether they have deposited a preprint and, if they haven't, we offer to do so on their behalf. We have partnered with bioRxiv to deposit preprints after they have been screened by our staff to confirm that they are in scope and contain no ethical concerns or sensitive information.

Being open about a manuscript when it is submitted, rather than when it is published, can make researchers wary (as some authors indicate in our surveys), but it offers many benefits. It creates opportunities for collaboration and helps researchers show their work before it is published, which is particularly important for those needing to demonstrate productivity for a thesis, job, or grant application. Releasing a preprint has been associated with more attention and citations for the peer-reviewed article. At the moment, close to 20% of manuscripts considered at PLOS have a preprint at submission (compared to 3% across PubMed), with some communities particularly embracing this aspect of Open Science (e.g., over 50% of PLOS Computational Biology submissions are preprinted).


“The ability to share a pre-print of an article via bioRxiv allows for other researchers to start building off your findings…without delay. Furthermore, the open-access nature of the bioRxiv pre-prints opens the peer-review process to the entire scientific community, thus facilitating the incorporation of constructive feedback into your work.”

— Nathan Robinson, PLOS ONE Author


Publish peer reviews

Peer review serves an important function in the culture of objective critique that characterizes Open Science. The back-and-forth between authors and reviewers helps to validate claims and improve their reporting; it is a service to authors and readers alike. Exposing what happens during peer review increases the value of this exchange for all. Volunteer reviewers, along with editors, provide an enormous service to the scientific community; publishing the peer review history (with a DOI for unambiguous citation) is a step towards having this activity recognized and peer review reports viewed as a first-rate academic output.

In May of last year, we launched the option for authors to publish the full peer review history alongside their article. While we hope that the community will gradually move towards more openness, our initial implementation allows reviewers to remain anonymous and lets authors opt in at acceptance, once peer review has concluded. We are heartened by the response: within six months of launching the option, an average of 40% of authors opt in across all journals, with opt-in rates of 60-70% in certain disciplines, and 55% of published peer review packages have at least one named reviewer.


“Open peer review helps build public trust in the process by removing the mystery of secret academic peer review, as well as showing the robustness of the method in action.”

— Thomas Shafee, PLOS Computational Biology Author


CONTINUING TO RETHINK PEER REVIEW

For all its important functions, peer review remains imperfect and faces new challenges. It is increasingly difficult for only two to three reviewers to evaluate all aspects of complex interdisciplinary manuscripts. Expanding the group of experts who provide feedback would likely be beneficial in many cases, but it is difficult in practice. In addition, the process is prone to delays and redundancy when a manuscript goes through multiple rounds of revision, either within a journal or, more commonly, through successive cycles of submission, review, revision, and rejection at different journals.

With these building blocks in place (publishing preprints and peer review history), we have started to experiment with new peer review ideas. Through 2020, we will be tracking the progress of three new initiatives:


1. From a peer review workflow that supports preprints to a preprint workflow that supports peer review


We are already facilitating the deposition of preprints; can we close the loop and bring the attention that other experts are paying to a preprint into the peer review process?

We are now asking this question at four PLOS journals. We alert authors at submission that, when comments are posted on their preprint, we will bring them to the attention of the handling editor. According to a bioRxiv survey, 37% of respondents received comments on their preprints via email. We hope that, with the understanding that their comments may inform and accelerate peer review, commenters will be more willing to make these comments public on bioRxiv. The comments need not be full-blown reviews; they may instead address specific aspects such as statistics, methodology, or code. As we surface a more diverse array of comments to the editors, and also point to comments on Twitter and on PREreview or preLights (two third-party sites providing reviews of preprints), we will examine the impact of this diversity on the experience of authors and editors.


2. Journal-independent peer review to reduce reviewer burden

Five PLOS journals are participating in Review Commons, a new service from EMBO Press and ASAPbio that offers journal-independent peer review. With a round of reviews in hand, authors may be better equipped to select a journal, and all participating journals commit to using the service’s reviews to expedite publication decisions. Authors may also decide that their manuscript is sufficiently validated and use Review Commons to post the peer reviews on bioRxiv. We look forward to learning about new behaviors and understanding how peer review of preprints can be most effective, with or without a journal publication outcome.


3. Peer reviewing the experimental plan

This week, PLOS ONE announced that it will offer Registered Reports as an option for publishing hypothesis-testing studies. This article type, with two stages of peer review, each followed by publication, is particularly important for avoiding publication bias. By splitting peer review into two stages, authors get early feedback on their initial research plan, allowing them to craft a robust and reproducible study design before they begin their investigation, and they also get a guarantee that their results will be published in the journal so long as they adhere to the study protocol. Registered Reports provide greater transparency in assessment criteria, minimize biases, and help make the entire research process more open by revealing key stages of the work earlier.


OPEN QUESTIONS


These three initiatives examine different aspects of peer review. We hope to learn about the effectiveness of different models and attitudes towards them, in order to design improvements.

We are encouraged to be in good company in this experimentation: consider the twelve other community-based journals participating in Review Commons, the Reimagine Review registry of initiatives maintained by ASAPbio, the new initiatives from eLife, and the vision articulated by HHMI leadership about the future of publishing.

An interesting question for all to consider remains how to express the substance of peer review in ways that help readers decide what to read and what to incorporate into their own research, and that inform research assessment. The current overreliance on journal name is problematic because it is a blunt tool, inadequate for representing the richness of evaluation.

There is much more we can learn about how to improve peer review and how we can distill the journal process and reapply it in valuable ways, and it is our duty as publishers to continue this exploration for the benefit of all.

