Guest post by Jason Chin, Ludo Waltman, and Kathryn Zeiler. All authors are editors of MetaROR.
Research assessment reform movements like DORA urge us to focus on what really matters: the intrinsic quality of research, a broad recognition of diverse contributions, and a move away from reductionist metrics like journal impact factors. These principles are gaining steam, but putting them into practice is far from trivial. Legacy publishing models in particular represent a major obstacle to the envisioned reforms.
MetaResearch Open Review (MetaROR), and other publish-review-curate platforms (e.g., Copernicus, eLife, F1000, Open Research Europe, Peer Community In) challenge this state of play. By supporting open, transparent, and more efficient approaches to scientific publishing, based on preprinting, open peer review, and new forms of editorial curation, these platforms offer the infrastructure needed to actually do what DORA has long recommended.
How does MetaROR work?
The publish-review-curate model seeks to upend the scientific publishing hegemony by decoupling publishing from assessment. Last year, we and our colleagues created the MetaROR platform to serve the burgeoning research-on-research ecosystem. We seek to provide a free, transparent way to assess and disseminate research in our field.
In short, authors post their research as a preprint (the ‘publish’ stage). They then share the link to their preprint in MetaROR’s submission system. If the preprint is within scope, a MetaROR editor arranges peer review (the ‘review’ stage). The reviews are then published on the platform along with a short editorial assessment (the ‘curate’ stage). If the authors wish, they can revise and resubmit. Recognising that traditional assessment methods still matter to many authors, we are happy for them to take MetaROR’s review materials to more traditional journals. In fact, we are creating arrangements with these journals that encourage the reuse of MetaROR’s reviews and assessments as a matter of course.
Importantly, MetaROR is community owned. It is led by two research-on-research organisations (the Association for Interdisciplinary Meta-Research and Open Science and the Research on Research Institute) and designed to serve the needs of the field. This includes the promotion of robust, collegial discussion and the rapid dissemination of research so that it can achieve its full impact.
How does MetaROR change the landscape of scientific publishing?
In line with a number of DORA’s recommendations, MetaROR offers various innovations to scientific communication. First, because MetaROR allows and encourages non-anonymous, open reviews, reviewers have the option to publish citable original findings in their reports. A report co-authored by Raphaël Lévy, Maha Said, and Frédérique Bordignon, for example, presents the findings of an empirical test of an unsupported claim in the article they reviewed. The reviewers provided the details of the methods they used to collect the data so that interested parties can attempt to reproduce the dataset and the results. They also used the CRediT system to clarify individual contributions and provided a reference list. This review is now citable, and the reviewers can list it as an original contribution to the literature on their CVs. MetaROR enables reviewers to openly contribute to science and receive credit for it – it’s about time!
A second innovation is MetaROR’s ongoing experiment with the review of blog posts that contain original contributions to science. Traditionally, peer review has been limited to books and book chapters, journal articles, papers in conference proceedings, and the like. Researchers and others have used blogs primarily to communicate opinions and other forms of commentary. MetaROR sees no reason why researchers should not also use blog posts to communicate original scientific contributions. MetaROR is currently running an experiment in which we invite the submission of blog posts and handle them in the same way as traditional forms of scientific communication. Our first blog post submission was reviewed and subsequently revised based on the outcomes of the review process. The author documented on the same blog his positive experience with MetaROR’s publish-review-curate model.
MetaROR has also instituted innovative methods for increasing the efficiency of the review system. All editors know that finding researchers willing to spend precious time reviewing scientific work can be challenging. The traditional publication system is highly inefficient, soliciting fresh reviews each time a paper is submitted. If an author submits to five journals before receiving a publication offer, and journals collect reviews from two to three reviewers on average, a single paper might be reviewed by as many as 15 different researchers. Even worse, once the accept/reject decision is issued, reviews are buried and thus play no role in future decisions and evaluations involving the article. To reduce such inefficiencies, MetaROR publishes all reviews. In addition, MetaROR encourages traditional journals to reuse the reviews. The recent reuse of MetaROR reviews by Prometheus, a diamond open access journal, is an example. The author of a paper reviewed and curated by MetaROR submitted a revised version of the paper to Prometheus. The Prometheus editor reused the MetaROR reviews to evaluate the article and sent them to an additional reviewer of the editor’s choice. The editor was able to move forward with the assessment by tapping just one reviewer instead of two or three.
How does MetaROR promote reform in research assessment?
DORA’s main message is that research should be assessed “on its own merits rather than on the basis of the journal in which the research is published”, a message that is echoed by other influential reform initiatives such as the Coalition for Advancing Research Assessment (CoARA). While the principle of assessing research on its own merits is widely supported, its practical implementation remains challenging.
We believe that publish-review-curate platforms such as MetaROR are crucial to realising the ambition of assessing research on its own merits. In the traditional journal publishing model, the complex process of evaluating the strengths and weaknesses of a research article is reduced to a binary outcome: an article is either accepted or rejected for publication in a journal. No further information is made available about the article’s strengths and weaknesses. This reductionist approach creates a strong push for evaluators to assess research based on the journal in which it is published, violating the philosophy of DORA.
The editorial assessments provided by MetaROR, and the underlying openly available peer reviews, offer a more in-depth and more nuanced perspective on the strengths and weaknesses of a scientific contribution. We expect this to be of great help for evaluators who wish to adopt the DORA philosophy of assessing research on its own merits.
Importantly, the publish-review-curate model can be applied not only to research articles but also to other outputs, as demonstrated by MetaROR’s experiment with peer review of blog posts. In this way, the publish-review-curate model can be used to implement DORA’s recommendation to “consider the value and impact of all research outputs … in addition to research publications”.
As argued elsewhere, reforming research assessment requires new approaches to scientific publishing. We invite all signatories and supporters of DORA to join us in promoting the adoption of innovative publishing models!
Kathryn Zeiler and Ludo Waltman from MetaROR and Ginny Barbour and Rebecca Lawrence from DORA meet at Metascience 2025.