February Debut: DORA Is Represented at Peer Review Meeting

Publishing is a significant part of a researcher’s career and essential for scientific success. However, biases built into peer review and research assessment create an uneven playing field. Too often, important sections of the research community, including women, underrepresented minorities, and non-English speakers, are subject to these biases: not only is their work discounted, but they also have fewer opportunities to participate in peer review. ASAPbio, along with HHMI and Wellcome, hosted a meeting on peer review at HHMI headquarters in Chevy Chase, MD, February 7-9. The meeting covered a range of topics related to the peer review process, including researcher assessment.

Peer review plays a key role in hiring, promotion, and funding decisions. It is no secret that scientists face pressure to publish in top-tier journals, and such papers are widely viewed as a requirement for career advancement. Early-career researchers trying to launch their academic careers feel this pressure, and the push to publish in a certain subset of journals, especially acutely.

DORA Steering Committee Chair Professor Stephen Curry announced DORA’s vision for the future during the meeting Thursday night and unveiled its new website. The transformation of the declaration from a statement of intent to a facilitator of policy change has begun. Many universities, funding agencies, and professional societies have established policies to reduce bias in research assessment and improve evaluation. By curating and sharing these good practices with the scientific community, DORA will give organizations models for improving their own hiring, promotion, and funding policies.

On Friday morning, meeting participants attended a breakout session about peer review and DORA. The session dissected typical researcher evaluation processes and identified steps that are vulnerable to shortcuts and misuse. To review applications in an efficient manner, evaluators often rely on shortcuts to triage large volumes of applications. The Journal Impact Factor is perhaps the best-known such shortcut, but researcher pedigree, institutional reputation (or lack thereof), geographical location, and more can all bias evaluators when they consider individuals or individual pieces of work. The participants agreed that structured narratives and biographical sketches emphasize research contributions and could be one way to reduce dependence on shortcuts. It was also agreed that research assessment should reflect the values of an organization, and that the criteria for judging researchers should include not only the quality and reliability of research output but also contributions to teaching, open science, peer review, and activities that impact society more broadly.

The meeting coincided with a similar gathering in the UK on responsible metrics, the announcement that the UK Research Councils had signed DORA, and the publication of a perspective piece by Prof. Curry calling for robust, efficient, and bias-free assessment methods. Since the UK Research Councils announced their signatures, 17 additional organizations have signed the declaration, including the Open Access Scholarly Publishers Association, the Brazilian Institute of Information in Science and Technology, and the IRCCS Ospedale San Raffaele. In addition, more than 200 individuals have added their names.

Notes from the peer review meeting and a recording of the live stream can be found on the ASAPbio website.