Introduction to DORA: a short presentation at the Global Research Council’s virtual Responsible Research Assessment Conference

DORA Chair Prof. Stephen Curry gave a short introduction to DORA at the Global Research Council conference on Responsible Research Assessment, held online during the week of 23-27 November 2020. In the presentation, he briefly explains the origins of DORA, the meaning of the declaration, and how DORA developed into an active initiative campaigning for the worldwide reform of research assessment.

Science Foundation Ireland takes an iterative approach to develop a narrative CV

The Dutch Research Council, Science Foundation Ireland, Swiss National Science Foundation, and UKRI are all experimenting with narrative CV formats as part of their evaluation of grant proposals. At DORA’s virtual funder discussion on September 23, 2020, Rochelle Fritch and Laura Mackey presented the ongoing work of Science Foundation Ireland (SFI) to develop a narrative CV format for its funding mechanisms.

The intersections between DORA, open scholarship, and equity

The San Francisco Declaration on Research Assessment (DORA), published in May 2013, does not mention the term ‘open scholarship.’ And yet DORA and open scholarship are becoming increasingly entwined. DORA’s ambition is to improve research evaluation practices, but the practicalities of implementation make it impossible to separate the evaluation of research from questions about who and what research is for, who gets to be involved, and how it is best carried out. All of these questions must take account of the power dynamics that shape the scholarly landscape.

Academic research culture influences learned behaviors in graduate students

For the past eight years, DORA has advocated that research institutions reevaluate their research assessment practices for recruitment, promotion, and funding decisions. To inform the evaluation of scientific productivity, DORA encourages the use of explicit criteria beyond popular bibliometrics like the Journal Impact Factor (JIF) or h-index. These criteria include a range of output measures, such as the generation of new software and datasets, research impact on a field, transparency, the training of early-career researchers, and influence on policy.