Good Practices

Funders

DORA’s ultimate aim is not to accumulate signatures but to promote real change in research assessment. One of the keys to this is the development of robust and time-efficient ways of evaluating research and researchers that do not rely on journal impact factors. We are keen to gather and share existing examples of good practice in research assessment, including approaches to funding and fellowships, hiring and promotion, and awarding prizes, that emphasize the research itself rather than where it is published.

If you know of exemplary research assessment methods that could provide inspiration and ideas for research institutes, funders, journals, professional societies, or researchers, please contact DORA.

ANR, the French National Research Agency

Applicants understand how funding decisions are made, because the Agency openly shares a general summary of its decision-making process on its website, in both English and French. Research impact analysis is based on the principles outlined in DORA and the Leiden Manifesto. For the 2019 Generic Call for Proposals, an awareness and training program was announced to educate program chairs and to encourage consideration of research outputs beyond research articles. Close attention is paid to gender parity, geography, and affiliation when the Scientific Panels for each thematic call are assembled.

For its main call for proposals, the Agency breaks the review process into two stages, which rely on two types of independent experts: those serving on review Panels and those conducting external independent reviews. In the initial stage, pre-proposals reduce the administrative burden on applicants and reviewers alike. Summaries of the Panels’ decisions are sent to the coordinators of each project. Before the full proposals are reviewed in the second stage, the composition of the Panels can be adjusted to match the scientific themes of the proposals. At least two external independent reviews are solicited for each proposal. The reviewers’ comments are then sent to the scientific coordinators, who have the opportunity to respond if they find inaccuracies in the reviews. The Panel members assigned to each proposal review both the external reviews and the coordinators’ comments, and present a summary at a plenary meeting, where decisions are made collectively. The Panel ranks the best projects and also draws up a supplementary list of projects eligible for funding in case co-funding becomes available or budgets are reallocated.

The ANR has also made a public pledge to reduce gender bias in its selection procedures. It began with statistical analyses of gender representation in proposals submitted versus those funded in 2014–2016, and then analyzed the issue of unconscious gender bias in its own review panels. The results of a literature review and an internal review indicated that parity in Scientific Evaluation Panels is “not enough to reduce this bias.” To address the root causes of biased selection of grantees, the ANR has set up a “training and awareness process for committee chairpersons” that focuses on the question of gender in selection bias and on parity within committees and consortia.

Austrian Science Fund (FWF)

The FWF recognizes a diverse range of research outcomes and requires grant applicants to provide information on various research achievements. The FWF asks for a list of no more than ten of the most important published or accepted academic publications (journal articles, monographs, edited volumes, contributions to edited volumes, proceedings, etc.). Furthermore, the FWF asks for a list of up to ten of the most important scientific/scholarly research achievements – apart from academic publications – such as awards, conference papers, keynote speeches, important research projects, research data, software, codes, preprints, exhibitions, knowledge transfers, science communication, licenses, or patents (see Application guidelines for FWF projects, page 10). Applicants are explicitly advised that journal-based metrics like the impact factor should not be included in those lists.

The international reviewers are informed that the assessment of research proposals submitted to the FWF should not be based on the actual age of applicants but on individual circumstances related to the duration of their academic career and previous research achievements. International reviews based exclusively on journal metrics will not be considered in the FWF’s funding decisions.

Cancer Research UK (CRUK)

CRUK recognizes the value of all research outputs, including publications. CRUK has modified its grant application process to ask candidates to describe the significance and impact of 3-5 key research achievements, which can include preprints, training delivered, contributions to consortia, patents, and the sharing of key datasets, software, novel assays and reagents, as well as research publications.

The expert reviewers who evaluate research grants for CRUK also receive guidance on the organization’s commitment to DORA and on how the charity has strengthened the way it assesses research to focus on the scientific contribution. This guidance is delivered in multiple ways, including inductions for new committee members, panel briefings, written guidance, and modified peer-review comment forms. Reviewers are asked to consider the value and impact of all research outputs when assessing an applicant’s record of outputs, their key research achievements, and their suitability to deliver the proposed research. Career breaks are also taken into consideration, and appropriate adjustments are made when considering the record and impact of outputs. Reviewers are asked to recognize that the content of a scientific paper and its influence in the field hold more significance than publication metrics or where it was published.

CRUK encourages its network of research institutions to adopt and embed DORA principles in hiring, tenure, and promotion decisions. CRUK summarized its new approach to assessing research and researcher quality in a blog post.

EMBO

The application process for EMBO Long-Term Fellowships emphasizes the most important outcomes and impact of the applicant’s work rather than where it is published and specifically states that journal impact factors should not be provided.

From “Helpful Notes for Applicants”
In the ‘achievements of PhD’ section please provide your own summary of what you consider the most important outcome(s) of your PhD work and the impact of your work on and beyond the respective scientific field.

In your publication list, you should indicate your three most important publications, i.e. the three primary research papers that in your view provided the most important and original contributions to scientific knowledge irrespective of journal name and impact factor. Do NOT add the journal impact factor. Citations to the article or other article level metrics with source may be listed, but are not essential.

From the FAQs
Are journal impact factors or journal name taken into account when evaluating applications?

Journal impact factors or journal names are not used in the evaluation and selection of applications. EMBO encourages evaluation of the quality of the scientific work and its impact on the field, rather than the Impact Factor of the journal in which it was published.

European Commission

Evaluation of Research Careers fully Acknowledging Open Science Practices, a report released by the European Commission in 2017, recognizes that the emerging Open Science movement creates an opportunity to develop an evaluation system for hiring and promotion that is focused on the equal treatment of applicants. The report finds that the Journal Impact Factor does not accurately describe all articles in a particular journal and ‘makes no sense’ for evaluation purposes. DORA is listed in the report as one of the main initiatives calling for change in the scientific community. Yet the report also shows that some institutions have signed DORA without implementing its principles in their faculty hiring and promotion procedures. In a survey conducted in conjunction with the report, 14% of respondents from funding agencies had signed DORA, and 7% said they would not.

From Section 3.2 Beyond the Impact Factor

In terms of metrics, evaluation is mainly based on researchers’ prestige, which, very often, is inferred from the prestige of the journals in which researchers publish their works. The journals’ prestige is in turn based mainly (if not only) on the Journal Impact Factor (JIF). Several works demonstrate clearly the disruptive value of the JIF: the vast majority of authors are taking advantage of the citations gathered by a small minority. Due to the shape of the frequency distribution of the number of citations (an over-dispersed distribution, where a few articles have a very high number of citations, and the vast majority of articles have few or even zero), calculating an ‘average’ figure and attributing it to all articles makes no sense.
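To make the report’s statistical point concrete, here is a minimal Python sketch (purely illustrative; the distribution and its parameters are hypothetical, not drawn from the report) that simulates an over-dispersed citation distribution and compares the journal-level mean with the median article:

```python
# Minimal illustrative sketch (hypothetical data): simulate an
# over-dispersed citation distribution and compare the journal-level
# "average" (a JIF-style mean) with the typical (median) article.
import random

random.seed(42)  # reproducible hypothetical data

# Hypothetical journal of 200 articles. Log-normal counts are heavily
# right-skewed, like real citation data: a few highly cited papers,
# many with few or zero citations.
citations = [int(random.lognormvariate(1.0, 1.5)) for _ in range(200)]

mean = sum(citations) / len(citations)
median = sorted(citations)[len(citations) // 2]
uncited = sum(1 for c in citations if c == 0)

print(f"mean (JIF-style average):     {mean:.1f}")
print(f"median article:               {median}")
print(f"articles with zero citations: {uncited} of {len(citations)}")
```

Because the data are skewed, the mean lands well above the median, so a single ‘average’ attributed to every article describes almost none of them.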

Open Research Funders Group (ORFG)

The ORFG released guidance for funders called Incentivizing the sharing of research outputs through research assessment: a funder implementation blueprint. The group created the document to assist funders in encouraging researchers to maximize the impact of their work by openly sharing research outputs. The blueprint identifies three goals for success:

  1. change the perception that publication in high-impact journals is the only metric that counts;
  2. provide demonstrable evidence that, while journal articles are important, we value and reward all types of research outputs; and
  3. ensure that indicators like the venue of publication or journal impact factor are not used as surrogate measures of quality in researcher assessment.

To achieve these goals, the blueprint provides three steps with concrete actions for funders: 1) policy development and declarations, 2) implementation, and 3) engagement. Template language for funders is included in the document to promote easy uptake.

U.S. National Institutes of Health

The U.S. National Institutes of Health has revised the format of the CV, or “biosketch,” used in grant applications. The addition of a short section to the biosketch in which applicants concisely describe their most significant scientific accomplishments may help discourage grant reviewers from focusing on the journals in which previous research was published.

U.S. National Science Foundation

The U.S. National Science Foundation has modified its instructions to grant applicants to recognize that the outputs of scientific research include more than just publications, an idea endorsed by DORA. Instructions for preparation of the Biographical Sketch have been revised to rename the “Publications” section to “Products” and amend terminology and instructions accordingly. This change makes clear that products may include, but are not limited to, publications, data sets, software, patents, and copyrights.

Wellcome

As part of its 2020 open access policy, Wellcome asks funded organizations to publicly commit to the principle that when assessing research outputs – for example in hiring, promotion and tenure committees – they will consider the intrinsic merit of the work, not the title of the journal, its impact factor, or the publisher.

Wellcome has developed guidance for members of its advisory panels, stressing that when assessing applicants’ CVs they should:

  • Focus on the content and quality of publications, rather than their number or the impact factors of the journals in which they were published;
  • Take into account the diverse range of possible research outputs. Outputs vary between disciplines, and may include not just research articles but also data, reagents, software, intellectual property and policy changes;
  • Be sensitive to legitimate delays in research publication, and personal factors (parental or other types of leave, part-time working and disability) that may have affected the applicant’s record of outputs.

Wellcome has also modified its grant application forms so that they no longer specifically ask researchers to cite their research publications, but instead ask researchers to list their outputs, which may include (but are not limited to):

  • Peer-reviewed publications and preprints
  • Datasets, software and research materials
  • Inventions, patents and commercial activity