Research assessment can be extremely useful as a tool to evaluate the quality and the impact of the activities needed to achieve the Sustainable Development Goals (SDGs). The SDGs, adopted in 2015 by the General Assembly of the United Nations, envision a transformed world by the year 2030.
Current discourse on research assessment places high emphasis on “impact.” However, there are many different concepts of impact, and many different concepts of how research achieves impact. The resulting ambiguity and confusion confound efforts to improve research assessment. To reliably assess research, we need clarity about what it is we want to assess.
Human-centered design is well-positioned to supplement the ongoing activity of sharing best practices and specific, successful examples of new research assessment strategies, contributing a deep understanding of what matters to individuals and entities, and a perspective on realigning incentives, social norms, and points of leverage where we might redefine and reward what’s valued in the future.
Much of the emphasis of DORA’s initiatives has revolved around appropriate metrics and assessments. Equally important is designing mechanisms that employ those assessments at decision-making steps.
On most campuses, the coalition (faculty, research officers, administrators, librarians, and department chairs) needed to move the needle on research assessment reform has yet to come together. Librarians have every reason to take an active role in helping to make this happen.
DORA community interviews provide an opportunity for supporters to discuss innovation in research assessment and to better understand how to initiate effective change in local communities. This time we are talking with the executive director of the Belmont Forum, Dr. Erica Key, about recognizing data in research assessment. Please join our discussion at 11:00AM EDT on Wednesday, Aug. 28.
In order to fully realise the benefits of open research, we must fundamentally change the way research is assessed. Wellcome is seeking to implement the DORA principles in our own funding processes and support research institutions in changing their own assessment practices.
The retention, promotion, and tenure (RPT) process is a critical part of a faculty member’s career, during which they are ostensibly evaluated on scholarship, teaching, and service. In practice, however, a faculty member’s funding and publication track records are typically weighted more heavily as indicators of productivity. As a result, flawed metrics of teaching and service persist.
On Wednesday April 3, 2019, we hosted a #sfDORA community interview with Janet Halliwell to learn more about the 2017 report from the Federation of the Humanities and Social Sciences in Canada, Approaches to Assessing Impacts in the Humanities and Social Sciences. We also wanted to hear about a new project she is working on to improve assessment in the Humanities and Social Sciences (HSS) by creating a standardized vocabulary of terms related to research assessment.
DORA turns 6 years old this week. Or, as we like to say, this year DORA reached 14,000—that’s how many people have signed DORA, and they come from more than 100 countries! Each signature represents an individual committed to improving research assessment in their community, in their corner of the world. And 1,300 organizations in more than 75 countries, in signing DORA, have publicly committed to improving their practices in research evaluation and to encouraging positive change in research culture.