DORA community interviews provide an opportunity for supporters to discuss innovation in research assessment and to better understand how to initiate effective change in local communities. This time we are talking with Dr. Erica Key, Executive Director of the Belmont Forum, about recognizing data in research assessment. Please join our discussion at 11:00 a.m. EDT on Wednesday, August 28.
On Wednesday April 3, 2019, we hosted a #sfDORA community interview with Janet Halliwell to learn more about the 2017 report from the Federation of the Humanities and Social Sciences in Canada, Approaches to Assessing Impacts in the Humanities and Social Sciences. We also wanted to hear about a new project she is working on to improve assessment in the Humanities and Social Sciences (HSS) by creating a standardized vocabulary of terms related to research assessment.
DORA turns six years old this week. Or, as we like to say, this year DORA reached 14,000: that is how many people have signed DORA, and they come from more than 100 countries! Each signature represents an individual committed to improving research assessment in their community, in their corner of the world. And 1,300 organizations in more than 75 countries, in signing DORA, have publicly committed to improving their practices in research evaluation and to encouraging positive change in research culture.
There is certainly no magic bullet when it comes to comprehensive and efficient research assessment, whether in the humanities or in STEM fields. Publisher prestige currently influences tenure assessments in the humanities, much as journal names do in the life sciences.
DORA community interviews provide an opportunity for supporters to discuss innovation in research assessment and to better understand how to initiate effective change in local communities. This time we are talking with Janet Halliwell about the 2017 report from the Federation of Humanities and Social Sciences in Canada that looks at assessing impact in those disciplines.
Researchers should not be evaluated based on factors outside of their control. While hiring, promotion, and tenure decisions in academia are in principle based on the merit of scholarly contributions, implicit biases influence who we hire and who we promote. Even if they are unintentional, biases have consequences. At the AAAS meeting in Washington, DC, this February, we explored approaches to addressing bias and increasing the diversity of the academic workforce in the session, “Academic Research Assessment: Reducing Biases in Evaluation.”
DORA community interviews provide an opportunity for supporters to discuss good practices in research assessment and to better understand how to initiate effective change in local communities. Our first interview of 2019 is with the Open Access Advisor at CLACSO, Dominique Babini, and the Executive Director of Redalyc, Arianna Becerril García.
As 2018 comes to a close, we reflect on the past year and the progress the community made to improve the way we evaluate research and researchers.
Implicit evaluation criteria can heavily influence researchers’ decisions. But as a global academic community, we can reconcile our priorities with how we evaluate researchers by modifying the standards we use in research evaluations.
To mark DORA’s fifth anniversary, we are celebrating with an interview series that focuses on implementing good practices in research assessment. We are pleased to announce that our fourth interview is with Prof. Christopher Jackson, Equinor Professor of Basin Analysis in the Department of Earth Science & Engineering at Imperial College. He will answer questions about how researchers can help change the culture of research assessment. Participants will have the opportunity to ask questions at the end of the interview.