Resources

DORA’s ultimate aim is not to accumulate signatures but to promote real change in research assessment. One of the keys to this is the development of robust and time-efficient ways of evaluating research and researchers that do not rely on journal impact factors. We are keen to collect and share resources that will empower individuals and organizations to introduce new policies.

ASAPbio

ASAPbio (Accelerating Science and Publication in Biology) tracks preprint policies and practices at journals, funders, and universities.

Assessing Scientists for Hiring, Promotion, and Tenure

Six principles for hiring, promotion, and tenure were developed at a one-day workshop in Washington DC in January 2017 to address incentives and rewards in research assessment. These principles were published as part of the perspective piece below.

Moher D, Naudet F, Cristea IA, Miedema F, Ioannidis JPA, Goodman SN (2018). Assessing scientists for hiring, promotion, and tenure. https://doi.org/10.1371/journal.pbio.2004089

Center for Open Science

The Center for Open Science has a collection of open science policies at universities and examples of job announcements that mention open science. The Open Science Framework (OSF) also has a project that archives job offers that require or suggest an open science statement from applicants.

DORA badges

The badges for our signatories show support for DORA, raise awareness about research assessment, and serve as a conversation starter with individuals or organizations that have not heard about DORA yet.

Download in colour >>
Download in black >>
Download in white >>

DORA slide presentations

We know that slide presentations are one of the primary means of academic communication. To help you talk about research assessment, we created three presentations that are available for download.

Option 1 is a single slide for DORA that can be added to the end of research talks or other presentations for a brief introduction to DORA and the importance of research assessment.

Single DORA slide

Option 2 is a short presentation that can be used when you have a few minutes available to introduce good practices in research assessment that have been implemented by research institutes, funders, and publishers. The presentation also includes a call to action for individuals and organizations.

Short DORA presentation

Option 3 is the longest presentation available and contains the most information about DORA and the value of responsible assessment. It describes DORA’s 18 recommendations, provides examples of good practice, and highlights DORA’s current activities that align with our roadmap.

Full DORA presentation

Helsinki Initiative on Multilingualism in Scholarly Communication

The Helsinki Initiative sets out three tenets to recognize multilingualism in scholarly work. These include the promotion of language diversity in research assessment, evaluation, and funding systems.

INORMS – Research Evaluation Working Group

The International Network of Research Management Societies (INORMS) created a working group in 2018 to promote meaningful, responsible, and effective research evaluation practices. The group is developing briefing materials on responsible research evaluation aimed specifically at senior university leaders, as well as a way of rating university ranking organizations on their approaches to evaluating institutions.

Metrics Toolkit

The Metrics Toolkit is an online resource that provides information about research metrics across scholarly disciplines to help educate individuals in the academic community. It summarizes how each metric is calculated, what it can be applied to, and what its limitations are, and it lists appropriate and inappropriate use cases.

Résumé for Researchers

The Résumé for Researchers is a tool developed by the Royal Society to help support the evaluation and assessment of individuals’ varied research contributions. The Résumé is organized into four modules:

  1. How have you contributed to the generation of knowledge?
  2. How have you contributed to the development of individuals?
  3. How have you contributed to the wider research community?
  4. How have you contributed to broader society?

It also includes space for a personal statement and other additions, where someone can mention career breaks, secondments, volunteering, and other relevant experience. DORA provided feedback and input on the project.

Strategies to improve equity in faculty hiring

Despite the increasing number of underrepresented minorities in trainee positions, the number of underrepresented faculty members in academic science remains low. In this piece, Needhi Bhalla outlines several proven strategies to improve equity in faculty hiring. In addition, these strategies increase the transparency and consistency of faculty searches, which builds trust in the process.

Bhalla N (2019). Strategies to improve equity in faculty hiring. https://doi.org/10.1091/mbc.E19-08-0476

The Hong Kong Principles for Assessing Researchers: Fostering Research Integrity

Institutions and funders can use the Hong Kong Principles to reward and recognize scholars for behavior that contributes to trustworthy research. The principles were developed as part of the 6th World Conference on Research Integrity.

The Leiden Manifesto for Research Metrics

The Leiden Manifesto provides ten principles for the appropriate use of metrics in research evaluation. These principles can be used to maintain accountability of both evaluators and the indicators they use in metrics-based research assessment.

Transparency in Author Contributions in Science (TACS)

The National Academy of Sciences in the United States created a webpage to track journals that are engaging in fair authorship practices. The page monitors criteria for authorship, responsibilities of the corresponding author, requirements for ORCID iDs, and adoption of the Contributor Roles Taxonomy (CRediT). A related white paper, which led to the creation of the TACS webpage, provides recommendations for research institutes, funders, and societies to increase transparency in author contributions.

UK Forum for Responsible Research Metrics

In 2014 the British government commissioned an independent expert group (supported by the former Higher Education Funding Council for England, now Research England) to critically examine the prospects for using metrics in the evaluation of research, notably in the Research Excellence Framework (REF), a nationwide exercise run every 6 years to assess the quality of research being done in UK universities. The resulting report, The Metric Tide, published in July 2015, provides an extensive review of the literature on peer review, the use of metrics and altmetrics, and a statistical analysis of the predictive power of various numerical indicators (including the journal impact factor). The expert group concluded that, while some metrics may sometimes be a useful adjunct to peer review, they should always be used carefully and with due consideration of context. The report recommends that institutions of higher education consider signing DORA or applying DORA principles for the responsible use of metrics.

The expert group recommended that a UK Forum for Responsible Research Metrics be established to provide the UK funding bodies with advice on the use of metrics in the REF, to provide advocacy and leadership in the UK, and to establish links internationally. More information, including the Forum’s advice on REF 2021 and a UK progress report, can be found here.