2021 in Review: List of new developments in research assessment

Each year, DORA reflects on progress toward responsible research assessment. How is research valued in different communities and how might that have changed in 2021? What tools are the community creating to support policy development? What types of research assessment policies are being developed to reduce the influence of journal-based metrics and recognize a broad range of contributions? How are communities coming together to improve practice and support culture change?

The following list of new developments and recommended reading, viewing, and listening was created with input from the DORA Steering Committee. While our search was extensive, it was not exhaustive, and we may have missed something along the way. Please let us know what other advances we should consider adding to the resource library (email info@sfdora.org).

New Developments
  • This summer, DORA was awarded a 3-year, $1.2M grant from Arcadia – a charitable fund of Lisbet Rausing and Peter Baldwin – to support Tools to Advance Research Assessment (TARA), a project to facilitate the development of new policies and practices for academic career assessment. The grant provides DORA with crucial support to create an interactive online dashboard, a toolkit of resources, and a survey of U.S. academic institutions.

    The dashboard aims to capture progress in responsible research assessment and to provide counter-mapping to common proxy measures of success (e.g., the Journal Impact Factor (JIF), the h-index, and university rankings). Two community calls in the fall helped shape the vision for the dashboard by identifying contexts for use and providing early-stage input on its functionality and the types of source material it should draw on.
  • DORA released SPACE, a tool to facilitate academic assessment reform at universities by examining the infrastructure needed to support the development of new policies and practices. A commentary outlining key lessons from the design process and the initial piloting phase was published in September. More than 70 individuals in 26 countries across 6 continents informed the development of SPACE!

  • In collaboration with the Luxembourg National Research Fund, DORA released a new video: Balanced, broad, responsible: A practical guide for research evaluators. This short film is accompanied by a one-page brief that research funders can use to promote a more holistic approach to the evaluation of funding proposals.

  • DORA organized a workshop for research funders in September with the Funding Organizations for Gender Equality Community of Practice (FORGEN CoP) focused on optimizing the use of narrative CVs for grant evaluation and mitigating bias. More than 120 people from 22 countries and 40 funding organizations took part. As a next step, DORA is organizing a workshop for funders in February 2022 with the FORGEN CoP, Science Foundation Ireland, the Swiss National Science Foundation, and UK Research and Innovation to explore a shared approach to monitoring the effectiveness of narrative CVs for grant evaluation.

  • Reimagining Academic Career Assessment: Stories of innovation and change, a collection of case studies from universities and national consortia examining their work to develop new policies and practices, was published at the end of 2020 by DORA, the European University Association, and SPARC Europe. In January, the three organizations published a policy paper analyzing the initial set of case studies. Three new case studies were added by DORA in 2021: the Latin American Forum for Research Assessment (FOLEC), European Molecular Biology Laboratory (EMBL), and the Open University in the United Kingdom. Community members joined us for webinars in February and December to hear from case study organizations and learn about specific elements of their change processes.

From elsewhere

  • Funding organizations continue to adopt and optimize the use of narrative CV formats for grant evaluation to recognize a broader range of research achievements. For example:
    • The Luxembourg National Research Fund introduced a narrative CV format in January for all funding programs where a CV is requested.
    • In July, seven research funders in the UK made a public commitment to explore a shared approach to adoption of the Résumé for Researchers, a narrative CV format developed by the Royal Society.
    • The Health Research Board in Ireland is iteratively improving the use of narrative CVs through systematic feedback from applicants and evaluators.
    • Ten recommendations to improve the content and structure of academic CVs were inspired by the CV Harmonization Group (H-group) and published in October.

  • Indiana University–Purdue University Indianapolis (IUPUI) created a path to promotion and tenure recognizing contributions to diversity, equity, and inclusion, showing how research assessment can encapsulate an organization’s values.

  • The Latin American Forum for Research Assessment (FOLEC) released new tools to support collective action for responsible research assessment in the region by making diverse forms of knowledge production visible and promoting new modalities of evaluation, as well as encouraging bibliodiversity and defending multilingualism in academia.

  • The Global Research Council established an international working group on responsible research assessment in September to provide its member organizations with guidance and support in embedding new policies and practices.

  • The Dutch Research Council (NWO) released a toolkit on inclusive assessment for reviewers and committee members. It provides information about implicit bias and offers practical, research-based suggestions to optimize the evaluation process, with the goal of expanding “the often limited ideal image of what a good researcher or a good proposal is.” In November, NWO published four interviews with successful grant applicants whose applications prominently featured non-traditional research outputs, sending an important message to early-career researchers that papers in high-impact-factor journals are not required to succeed in academia. This is a powerful strategy funding organizations can use to signal that they are serious about looking beyond journal-based metrics and recognizing broader contributions to science and society.

  • In 2021, China issued guidelines to improve the evaluation system for scientific and technological achievements and to accelerate their translation into productive use for the community. More than 40 universities, research institutes, hospitals, societies, industry groups, and other representative institutions have been selected to pilot new research assessment practices.

  • The UNESCO Recommendation on Open Science was adopted by 193 countries in November. As part of the recommendation, Member States are asked to align incentives for Open Science by reviewing evaluation and assessment systems and developing new ones that focus on the quality of research outputs rather than quantity and that value all relevant research activities and outputs.

  • Also in November, the European Commission announced a plan to create a coalition of research funding and research performing organizations committed to implementing changes in research assessment practices.
  • A 2021 report from the African Center for Economic Transformation (ACET) advises policymakers on the importance of rewarding regional collaboration in research, such as the Songhaï Center in Benin, which “conducts training, production, & research, combining modern & traditional methods.”
  • In April, the Linguistic Society of America published recommendations for the fair review of Open Scholarship in hiring, tenure, promotion, and awards, demonstrating how scholarly societies can encourage and promote responsible research assessment.
  • Universities Norway published NOR-CAM – A toolbox for recognition and rewards in academic careers to make assessment processes more transparent and predictable for both individuals and institutions in the country.
  • The Latin American Observatory of Research Assessment Indicators (OLIVA) is a collaborative, regional, and interdisciplinary project to create a common framework of indicators for examining the production and circulation of knowledge in Latin America. In 2021, OLIVA continued developing a database of articles from journals indexed in SciELO and Redalyc, with the aim of identifying research assessment indicators relevant to the region.

Recommended reading, viewing, and listening
