DORA’s ultimate aim is not to accumulate signatures but to promote real change in research assessment. One of the keys to this is the development of robust and time-efficient ways of evaluating research and researchers that do not rely on journal impact factors. We are keen to collect and share resources that will empower individuals and organizations to introduce new policies.
Assessing Scientists for Hiring, Promotion, and Tenure
Six principles for hiring, promotion, and tenure were developed at a one-day workshop in Washington DC in January 2017 to address incentives and rewards in research assessment. These principles were published as part of the perspective piece below.
Center for Open Science
The Center for Open Science has a collection of open science policies at universities and examples of job announcements that mention open science. The Open Science Framework (OSF) also has a project that archives job offers that require or suggest an open science statement from applicants.
DORA badges
The badges for our signatories show support for DORA, raise awareness about research assessment, and serve as a conversation starter with individuals or organizations that have not heard about DORA yet.
DORA slide presentations
We know that slide presentations are one of the primary means of academic communication. To help you talk about research assessment, we created three presentations that are available for download.
Option 1 is a single slide for DORA that can be added to the end of research talks or other presentations for a brief introduction to DORA and the importance of research assessment.
Option 2 is a short presentation that can be used when you have a few minutes available to introduce good practices in research assessment that have been implemented by research institutes, funders, and publishers. The presentation also includes a call to action for individuals and organizations.
Option 3 is the longest presentation available and contains the most information about DORA and the value of responsible assessment. It describes DORA’s 18 recommendations, provides examples of good practice, and highlights DORA’s current activities that align with our roadmap.
INORMS – Research Evaluation Working Group
The International Network of Research Management Societies (INORMS) created a working group in 2018 to promote meaningful, responsible, and effective research evaluation practices. The group is developing briefing materials on responsible research evaluation aimed specifically at senior university leaders, as well as a method for rating university ranking organizations on their approaches to evaluating institutions.
Metrics Toolkit
The Metrics Toolkit is an online resource that provides information about research metrics across scholarly disciplines to help educate individuals in the academic community. For each metric, it summarizes how the metric is calculated, what it can be applied to, and what its limitations are, and it lists appropriate and inappropriate use cases.
The Leiden Manifesto for Research Metrics
The Leiden Manifesto provides ten principles for the appropriate use of metrics in research evaluation. These principles can be used to maintain accountability of both evaluators and the indicators they use in metrics-based research assessment.
Transparency in Author Contributions in Science (TACS)
The National Academy of Sciences in the United States created a webpage to track journals that are engaging in fair authorship practices. The page monitors criteria for authorship, responsibilities of the corresponding author, requirements for ORCID iDs, and adoption of the Contributor Roles Taxonomy (CRediT). A related white paper, which led to the creation of the TACS webpage, provides recommendations for research institutes, funders, and societies to increase transparency in author contributions.
UK Forum for Responsible Research Metrics
In 2014 the British government commissioned an independent expert group (supported by the former Higher Education Funding Council for England, now Research England) to critically examine the prospects for using metrics in the evaluation of research, notably in the Research Excellence Framework (REF), a nationwide exercise run every six years to assess the quality of research being done in UK universities. The resulting report, The Metric Tide, published in July 2015, provides an extensive review of the literature on peer review and on the use of metrics and altmetrics, together with a statistical analysis of the predictive power of various numerical indicators (including the JIF). The expert group concluded that, while some metrics may sometimes be a useful adjunct to peer review, they should always be used carefully and with due consideration of context. The report recommends that institutions of higher education consider signing DORA or applying its principles for the responsible use of metrics.
The expert group recommended that a UK Forum for Responsible Research Metrics be established to advise the UK funding bodies on the use of metrics in the REF, to provide advocacy and leadership in the UK, and to establish links internationally. More information, including the Forum’s advice on REF 2021 and a UK progress report, can be found here.