Good Practices


DORA’s ultimate aim is not to accumulate signatures but to promote real change in research assessment. One of the keys to this is the development of robust and time-efficient ways of evaluating research and researchers that do not rely on journal impact factors. We are keen to gather and share existing examples of good practice in research assessment, including approaches to funding and fellowships, hiring and promotion, and awarding prizes, that emphasize research itself and not where it is published.

If you know of exemplary research assessment methods that could provide inspiration and ideas for research institutes, funders, journals, professional societies, or researchers, please contact DORA.

EMBO Long-Term Fellowships

The application process for EMBO Long-Term Fellowships emphasizes the most important outcomes and impact of the applicant’s work rather than where it is published and specifically states that journal impact factors should not be provided.

From “Helpful Notes for Applicants”
In the ‘achievements of PhD’ section please provide your own summary of what you consider the most important outcome(s) of your PhD work and the impact of your work on and beyond the respective scientific field.

In your publication list, you should indicate your three most important publications, i.e. the three primary research papers that in your view provided the most important and original contributions to scientific knowledge irrespective of journal name and impact factor. Do NOT add the journal impact factor. Citations to the article or other article level metrics with source may be listed, but are not essential.

From the FAQs
Are journal impact factors or journal name taken into account when evaluating applications?

Journal impact factors or names are not used in the evaluation and selection of applications. EMBO encourages evaluation of the quality of the scientific work and its impact on the field, rather than the Impact Factor of the journal in which it was published.

European Commission

Evaluation of Research Careers fully Acknowledging Open Science Practices, a report released by the European Commission in 2017, recognizes that the emerging Open Science movement creates an opportunity to develop an evaluation system for hiring and promotion that is focused on the equal treatment of applicants. The report finds that the Journal Impact Factor does not accurately describe all articles in a particular journal and ‘makes no sense’ for evaluation purposes. DORA is listed in the report as one of the main initiatives calling for change in the scientific community. Yet the report also shows that some institutions have signed DORA without implementing its principles in their faculty hiring and promotion procedures. In a survey conducted in conjunction with the report, 14% of respondents from funding agencies said they had signed DORA, and 7% said they would not.

From Section 3.2 Beyond the Impact Factor

In terms of metrics, evaluation is mainly based on researchers’ prestige, which, very often, is inferred from the prestige of the journals in which researchers publish their works. The journals’ prestige is in turn based mainly (if not only) on the Journal Impact Factor (JIF). Several works demonstrate clearly the disruptive value of the JIF: the vast majority of authors are taking advantage of the citations gathered by a small minority. Due to the shape of the frequency distribution of the number of citations (an over-dispersed distribution, where a few articles have a very high number of citations, and the vast majority of articles have few or even zero), calculating an ‘average’ figure and attributing it to all articles makes no sense.

Higher Education Funding Council for England

In 2014 the British government commissioned an independent expert group (supported by the Higher Education Funding Council for England (HEFCE)) to critically examine the prospects for using metrics in the Research Excellence Framework (REF), a nationwide exercise run every 6 years to assess the quality of research being done in UK universities. The resulting report, The Metric Tide, published in July 2015, provides an extensive review of the literature on peer review, the use of metrics and altmetrics, and a statistical analysis of the predictive power of various numerical indicators (including the JIF). The expert group concluded that, while metrics may sometimes be a useful adjunct to peer review, they should always be used carefully and with due consideration of context. The report recommends that higher education institutions consider signing DORA or applying its principles for the responsible use of metrics.

U.S. National Institutes of Health

The U.S. National Institutes of Health has revised the format of the CV or “biosketch” in grant applications. The addition of a short section to the biosketch, in which applicants concisely describe their most significant scientific accomplishments, may help discourage grant reviewers from focusing on the journal in which previous research was published.

U.S. National Science Foundation

The U.S. National Science Foundation has modified its instructions to grant applicants to recognize that the outputs of scientific research include more than just publications, an idea endorsed by DORA. Instructions for preparation of the Biographical Sketch have been revised to rename the “Publications” section to “Products” and amend terminology and instructions accordingly. This change makes clear that products may include, but are not limited to, publications, data sets, software, patents, and copyrights.


Wellcome

Wellcome has developed guidance for members of its advisory panels, stressing that when assessing applicants’ CVs they should:

  • Focus on the content and quality of publications, rather than their number or the impact factors of the journals in which they were published;
  • Take into account the diverse range of possible research outputs. Outputs vary between disciplines, and may include not just research articles but also data, reagents, software, intellectual property and policy changes;
  • Be sensitive to legitimate delays in research publication, and personal factors (parental or other types of leave, part-time working and disability) that may have affected the applicant’s record of outputs.

Wellcome has also modified its grant application forms so that it no longer specifically asks researchers to cite their research publications, but instead asks them to list their outputs, which may include (but are not limited to):

  • Peer-reviewed publications and preprints
  • Datasets, software and research materials
  • Inventions, patents and commercial activity