DORA’s ultimate aim is not to accumulate signatures but to promote real change in research assessment. One of the keys to this is the development of robust and time-efficient ways of evaluating research and researchers that do not rely on journal impact factors. We are keen to collect and share resources that will empower individuals and organizations to introduce new policy.
Rethinking Research Assessment
DORA is developing a toolkit of resources to help academic institutions improve their policies and practices. So far, it includes two briefing documents: one offers principles to guide institutional change, and the other offers strategies to counter common cognitive biases, and their infrastructural consequences, in order to increase equity.
- Ideas for Action outlines five common myths about research evaluation to help universities better understand barriers to change and provides analogous examples to illustrate how these myths exist inside and outside of academia. It also offers five design principles to help institutions experiment with and develop better research assessment practices.
Download (PDF): Rethinking Research Assessment: Ideas for Action
- Unintended Cognitive and Systems Biases identifies seven personal biases that can influence hiring, promotion, and tenure decisions. It also reveals four institutional and infrastructural implications of these biases and provides strategies to develop new institutional conditions that reduce bias.
Download (PDF): Rethinking Research Assessment: Unintended Cognitive and System Biases
Assessing Scientists for Hiring, Promotion, and Tenure
Six principles for hiring, promotion, and tenure were developed at a one-day workshop in Washington DC in January 2017 to address incentives and rewards in research assessment. These principles were published as part of the perspective piece below.
Moher D, Naudet F, Cristea IA, Miedema F, Ioannidis JPA, Goodman SN (2018). Assessing scientists for hiring, promotion, and tenure. https://doi.org/10.1371/journal.pbio.2004089
Center for Open Science
The Center for Open Science has a collection of open science policies at universities and examples of job announcements that mention open science. The Open Science Framework (OSF) also has a project that archives job offers that require or suggest an open science statement from applicants.
DORA badges
The badges for our signatories show support for DORA, raise awareness about research assessment, and serve as a conversation starter with individuals or organizations that have not yet heard of DORA.
DORA slide presentations
We know that slide presentations are one of the primary means of academic communication. To help you talk about research assessment, we created three presentations that are available for download.
Option 1 is a single slide for DORA that can be added to the end of research talks or other presentations for a brief introduction to DORA and the importance of research assessment.
Option 2 is a short presentation that can be used when you have a few minutes available to introduce good practices in research assessment that have been implemented by research institutes, funders, and publishers. The presentation also includes a call to action for individuals and organizations.
Option 3 is the longest presentation available and contains the most information about DORA and the value of responsible assessment. It describes DORA’s 18 recommendations, provides examples of good practice, and highlights DORA’s current activities that align with our roadmap.
Good Practice in Researcher Evaluation: Recommendation for the Responsible Evaluation of a Researcher in Finland
A working group set up by the Federation of Finnish Learned Societies produced guidelines to improve how researchers are assessed in Finland. The report provides a set of general principles (transparency, integrity, fairness, competence, and diversity) that apply across 13 recommended good practices, which address four aspects of researcher evaluation:
- Building the evaluation process
- Evaluation of research
- Diversity of activities
- Researcher’s role in the evaluation process
The other participating organizations include Academy of Finland, Advisory Board on Research Ethics, Association of Finnish Foundations, Finnish Association of Research Managers and Advisors, Finnish Union of University Professors, Finnish Union of University Researchers and Teachers, National Library of Finland, Network of Sectoral research institutes, Rectors’ Conference of Finnish Universities of Applied Sciences, the joint library network, Universities Finland and Young Academy Finland.
Helsinki Initiative on Multilingualism in Scholarly Communication
The Helsinki Initiative sets out three tenets for recognizing multilingualism in scholarly work, including the promotion of language diversity in research assessment, evaluation, and funding systems.
How Journals and Publishers Can Help to Reform Research Assessment
Editorial decisions influence research assessment. This article provides 10 concrete actions that journals and publishers can take to help improve how research is assessed.
- Cease the promotion of journal impact factors
- Provide article metrics and indicators
- Adopt the CRediT taxonomy for author contributions
- Ensure that all reference data deposited with Crossref is open
- Require authors to make all key data available according to FAIR principles
- Follow the data citation principles
- Encourage the use of unique identifiers (e.g., RRIDs)
- Require authors to use ORCID iDs
- Publish peer review reports and author responses along with the article
- Examine ways to increase diversity, equity, and inclusion in the publishing process
Hatch A and Patterson M (2019). How journals and publishers can help to reform research assessment. https://www.csescienceeditor.org/article/how-journals-and-publishers-can-help-to-reform-research-assessment/
INORMS – Research Evaluation Working Group
The International Network of Research Management Societies (INORMS) created a working group in 2018 to promote meaningful, responsible, and effective research evaluation practices. The group is developing briefing materials on responsible research evaluation aimed specifically at senior university leaders, as well as a method for rating university ranking organizations on their approaches to evaluating institutions.
Metrics Toolkit
The Metrics Toolkit is an online resource that provides information about research metrics across scholarly disciplines to help educate the academic community. It summarizes how each metric is calculated, what it can be applied to, and what its limitations are, and it lists appropriate and inappropriate use cases for each one.
Résumé for Researchers
The Résumé for Researchers is a tool developed by the Royal Society to help support the evaluation and assessment of individuals’ varied research contributions. The Résumé is organized into four modules:
- How have you contributed to the generation of knowledge?
- How have you contributed to the development of individuals?
- How have you contributed to the wider research community?
- How have you contributed to broader society?
It also includes space for a personal statement and other additions, where someone can mention career breaks, secondments, volunteering, and other relevant experience. DORA provided feedback and input on the project.
Room for Everyone’s Talent
Dutch public knowledge institutions and research funders published a position paper, ‘Room for Everyone’s Talent’, rethinking their academic reward and recognition systems to:
- Enable the diversification and vitalization of career paths
- Acknowledge the independence and the individual qualities and ambitions of academics, while also recognizing team performance
- Emphasize quality of work over quantitative results (such as number of publications)
- Encourage all aspects of open science
- Encourage high quality academic leadership
The paper outlines specific actions the organizations are taking, including training assessment committees, increasing emphasis on collaborative contributions, no longer requesting bibliometric indicators, and more. Universities involved will also create a committee to initiate a campus-wide discussion about the adoption of the new rewards and recognition system.
The participating organizations include the Association of Universities in the Netherlands (VSNU), Netherlands Federation of University Medical Centers (NFU), Royal Netherlands Academy of Arts and Sciences (KNAW), Dutch Research Council (NWO), and Netherlands Organization for Health Research and Development (ZonMw).
Strategies to improve equity in faculty hiring
Despite the increasing number of underrepresented minorities in trainee positions, the number of underrepresented faculty members in academic science remains low. In this piece, Needhi Bhalla outlines several proven strategies to improve equity in faculty hiring. In addition, these strategies increase transparency and consistency of faculty searches, which builds trust in the process.
Bhalla N (2019). Strategies to improve equity in faculty hiring. https://doi.org/10.1091/mbc.E19-08-0476
The Hong Kong Principles for Assessing Researchers: Fostering Research Integrity
Institutions and funders can use the Hong Kong Principles to reward and recognize scholars for behavior that contributes to trustworthy research. The principles were developed as part of the 6th World Conference on Research Integrity.
The Leiden Manifesto for Research Metrics
The Leiden Manifesto provides ten principles for the appropriate use of metrics in research evaluation. These principles can be used to maintain accountability of both evaluators and the indicators they use in metrics-based research assessment.
Transparency in Author Contributions in Science (TACS)
The National Academy of Sciences in the United States created a webpage to track journals that are engaging in fair authorship practices. The page monitors criteria for authorship, responsibilities of the corresponding author, requirement for ORCID iDs, and adoption of the Contributor Roles Taxonomy (CRediT). A related white paper, which led to the creation of the TACS webpage, provides recommendations for research institutes, funders, and societies to increase transparency in author contributions.
UK Forum for Responsible Research Metrics
In 2014, the British government commissioned an independent expert group (supported by the former Higher Education Funding Council for England, now Research England) to critically examine the prospects for using metrics in the evaluation of research, notably in the Research Excellence Framework (REF), a nationwide exercise run every six years to assess the quality of research being done in UK universities. The resulting report, The Metric Tide, which was published in July 2015, provides an extensive review of the literature on peer review, the use of metrics and altmetrics, and a statistical analysis of the predictive power of various numerical indicators (including the JIF). The expert group concluded that, while some metrics may sometimes be a useful adjunct to peer review, they should always be used carefully and with due consideration of context. The report recommends that higher education institutions consider signing DORA or applying its principles for the responsible use of metrics.
The expert group recommended that a UK Forum for Responsible Research Metrics be established to advise the UK funding bodies on the use of metrics in the REF, to provide advocacy and leadership in the UK, and to build links internationally. Further information is available from the Forum, including its advice on REF 2021 and a UK progress report.