DORA’s ultimate aim is not to accumulate signatures but to promote real change in research assessment. One of the keys to this is the development of robust and time-efficient ways of evaluating research and researchers that do not rely on journal impact factors. We are keen to gather and share existing examples of good practice in research assessment, including approaches to funding and fellowships, hiring and promotion, and awarding prizes, that emphasize research itself and not where it is published.
If you know of exemplary research assessment methods that could provide inspiration and ideas for research institutes, funders, journals, professional societies, or researchers, please contact DORA.
CIRAD, the French Agricultural Research Centre for International Development
CIRAD developed the ImpresS method to assess the environmental and societal impact of its research. ImpresS, which stands for Impact of research in the South, is based on two themes:
- ImpresS ex post – analyzes past experiences and interventions
- ImpresS ex ante – incorporates impact pathways while an intervention is being developed
The ImpresS method is used to guide the development of future projects as well as evaluate the impact of current ones. Special attention is paid to the different types of impact that a research intervention can have.
Ghent University
Ghent University published a vision statement for evaluating research based on eight principles agreed on by the University’s Board of Governors:
- The choice of an appropriate evaluation method for research is in line with the objective of the evaluation.
- The evaluation takes into account the intended impact of the research: strictly academic, economic, societal, or a combination of these.
- The evaluation takes into account the diversity between disciplines.
- For each chosen evaluation method, the simplicity of the procedure is weighed up against the complexity of the research.
- The evaluation criteria are drawn up and communicated to all stakeholders in advance.
- There are sufficient experts on the evaluation committee who are in a position to adequately assess the quality of the research.
- The above principles are implemented by means of a smart choice of evaluation indicators and by adopting a holistic approach to peer review.
- Any committee or policy measure evaluating research makes a best-effort commitment to translate the above principles into practice.
In addition to these principles, the university made three major changes to faculty evaluation procedures. First, faculty members are now evaluated every five years, instead of the traditional two- or four-year cycle. Second, the university is using committees as a career coaching tool for faculty members. Third, evaluations will be based on reports from faculty members themselves rather than on output metrics.
Howard Hughes Medical Institute (HHMI)
In keeping with HHMI’s commitment to basic scientific discovery, the Institute employs outstanding researchers for renewable seven-year terms as HHMI Investigators. HHMI expects its Investigators to be talented and productive scientists who, in HHMI’s judgment, demonstrate a combination of specific attributes that clearly distinguish them from other highly competent researchers in the field. Those attributes, along with a brief description of key aspects of HHMI’s selection and renewal process, are available online. A major component of both the selection and review process is the consideration of research findings reported in the five most significant articles from the most recent 5- to 7-year period, which are selected by the applicant or current Investigator.
The University of St Andrews
The Open Research Working Group, on behalf of the Research Committee, developed a set of five principles (expertise, diversity, data, integrity, and transparency) to guide the use of bibliometric indicators in research assessment. A new working group has been established to implement these principles.
University of California, Berkeley
The Office for Faculty Equity & Welfare at the University of California, Berkeley helps faculty search committees evaluate candidate contributions to advancing diversity, equity, and inclusion (DEI), including guidance on search committee composition, application requirements, rating criteria, and more. The office also supports faculty candidates by explaining how contributions to advancing equity and inclusion will be assessed, providing examples that can be used to demonstrate contributions to DEI, and informing candidates how they might be asked to incorporate information about advancing DEI into a campus interview.
University of California, Berkeley
Department of Molecular and Cell Biology & Helen Wills Neuroscience Institute
Applications for assistant professor positions were designed to highlight the significance of an applicant’s accomplishments rather than default to using journal-based metrics as a substitute for research quality. The advertisement asked applicants to summarize their major research accomplishments, ongoing and planned research program, and contributions to diversity. Applicants were also asked to select three significant articles from their list of publications and describe the impact of each.
University of California, Irvine
The University of California, Irvine developed guidance “Identifying Faculty Contributions to Collaborative Scholarship” to help faculty describe and assess team science. It includes a non-exhaustive list of collaborative contributions for faculty to recognize in reflective research statements for merit or promotion review.
University College London
University College London (UCL) released its Academic Careers Framework, which provides information about its promotion processes. On page five, the framework notes that UCL is a signatory of DORA and, as such, rejects the use of journal-based metrics to quantify the quality of research.
University of Colorado School of Medicine
The Faculty Promotions Committee at the School of Medicine advises candidates against using journal-based metrics in their promotion or tenure dossiers, as these metrics do not accurately capture the significance of specific research contributions. Instead, the emphasis is on writing a narrative statement that clearly conveys the significance of the work. While metrics such as the h-index or total citation counts can help evaluate impact, they cannot replace a clear description of the work’s value. In addition, the committee recognizes the value of all outputs and outcomes generated by research.
My career focuses on scientific research. How should I document my research accomplishments in my promotion or tenure dossier?
The Faculty Promotions Committee discourages the use of journal-based metrics (i.e., journal impact factors), since it is the quality and importance of the research contribution itself that matter. Research importance can also be measured by its impact on policy, practice, or the scientific discipline. Other outputs from scientific research, such as intellectual property, databases, or software, may also be highlighted.
University Medical Center Utrecht
To create a culture of research assessment that does not rely on metrics such as the Journal Impact Factor, the university held a series of meetings to discuss with researchers the best ways to create change. Representatives from multiple career stages were invited to discuss policies for defining and measuring societal impact as well as research excellence. Central to the policy’s success was the inclusion of researchers and faculty in these conversations, because their approval signified an agreement to be judged by the criteria. The university now favors the use of biosketches, in which scientists summarize the impacts of their contributions.
Universitat Oberta de Catalunya
The Universitat Oberta de Catalunya (UOC) signed DORA in December 2018 as a fundamental element of its Open Knowledge Action Plan. The Plan recalls the important role that knowledge and institutions of higher education play in contributing to a sustainable and fair future. Recognizing the major challenges societies now face, as set out in the 2030 Agenda for Sustainable Development and the Sustainable Development Goals, as well as the obstacles to the free circulation of knowledge, the UOC is committed to taking action as a knowledge hub. Reflecting on current research assessment practices and introducing changes in how research is evaluated at the UOC play an important role in the Action Plan.
In preparation for signing DORA, the UOC assembled an interdisciplinary task force for the transformation of research assessment at the university, which, following a participatory process, produced a report on the changes needed to conform to the DORA principles. The UOC is currently promoting an institution-wide reflection and discussion to generate consensus on DORA-aligned changes to institutional evaluation practices.
UT Southwestern Medical Center
Department of Cell Biology
The cover letter plays a significant role in the initial assessment of candidates for assistant professor positions in the Department of Cell Biology at UT Southwestern Medical Center and is used to reduce some of the biases associated with CVs, such as research pedigree or whether candidates have published in brand-name journals. In the cover letter, applicants describe the significance of their work and what they envision for their research program. The department then uses short Skype interviews as an intermediate screening stage, interviewing as many as 30 individuals. Applicants are asked to prepare answers to two key questions for the Skype interview: 1) Where will your research program be in five years? and 2) How can UT Southwestern help you get there? Providing the questions beforehand allows the department to identify thoughtful candidates in addition to those who can think quickly.