August 1, 2024
The following post originally appeared on the Templeton World Charity Foundation blog. It is reposted here with their permission.
Emerging policies to better recognize preprints and open scholarship
Research funding organizations play an important role in setting the tone for what is valued in research assessment through the projects they fund and the outputs they…
December 1, 2020
DORA chair Prof. Stephen Curry gave a short introduction to DORA for the Global Research Council conference on Responsible Research Assessment, which was held online over the week of 23-27 November 2020. He briefly explains the origins of DORA, the meaning of the declaration, and how DORA developed into an active initiative campaigning for the worldwide reform of research assessment.
June 18, 2020
DORA seeks nominations and self-nominations from North and South America to fill two open positions on our international Advisory Board.
May 19, 2020
We are pleased to announce a new briefing document from DORA and colleagues, “Rethinking Research Assessment: Ideas for Action,” which provides five design principles to help universities and research institutions improve their research assessment policies and practices.
April 6, 2020
The COVID-19 pandemic has upended daily life around the world, forcing individuals and organizations to adapt rapidly to unanticipated circumstances. The changes in universities and research institutions have been dramatic.
March 24, 2020
The emergence of COVID-19 has drastically upended the academic enterprise. Because of physical distancing, many non-tenured faculty members are facing additional, unexpected obstacles in their promotion and tenure trajectory. Transitioning classes to online learning environments will detract from research efforts, and winding down laboratory operations will result in a more direct reduction in research output. While trying to stay healthy themselves, many faculty members are also balancing job responsibilities with kids at home, adapting to telework, etc.
March 12, 2020
As Alison Mudditt described in her Scholarly Kitchen post last month, efforts to reform research assessment have met with significant challenges. We agree with her that culture change is often a slow process. However, as DORA demonstrates, it is possible to identify tangible progress on the path to large-scale research assessment reform.
December 19, 2019
As 2019 winds down, the DORA steering committee and advisory board wanted to highlight the ways research assessment reform has advanced in the last year. From new data on assessment policies to the development of new tools, the scholarly community is taking action to improve research assessment in concrete ways.
May 24, 2019
On Wednesday, April 3, 2019, we hosted a #sfDORA community interview with Janet Halliwell to learn more about the 2017 report from the Federation of the Humanities and Social Sciences in Canada, Approaches to Assessing Impacts in the Humanities and Social Sciences. We also wanted to hear about a new project she is working on to improve assessment in the Humanities and Social Sciences (HSS) by creating a standardized vocabulary of terms related to research assessment.
May 16, 2019
DORA turns 6 years old this week. Or, as we like to say, this year DORA reached 14,000—that’s how many people have signed DORA, and they come from more than 100 countries! Each signature represents an individual committed to improving research assessment in their community, in their corner of the world. And 1,300 organizations in more than 75 countries, in signing DORA, have publicly committed to improving their practices in research evaluation and to encouraging positive change in research culture.
April 2, 2019
There is certainly no magic bullet when it comes to comprehensive and efficient research assessment, whether in the humanities or STEM fields. Publisher prestige currently influences tenure assessments in the humanities, as do journal names in the life sciences.
March 12, 2019
Researchers should not be evaluated based on factors outside of their control. While hiring, promotion, and tenure decisions in academia are in principle based on the merit of scholarly contributions, implicit biases influence who we hire and who we promote. Even if they are unintentional, biases have consequences. At the AAAS meeting in Washington, DC, this February, we explored approaches to addressing bias and increasing the diversity of the academic workforce in the session “Academic Research Assessment: Reducing Biases in Evaluation.”
December 18, 2018
As 2018 comes to a close, we reflect on the past year and the progress the community made to improve the way we evaluate research and researchers.
October 25, 2018
Implicit evaluation criteria can heavily influence researchers’ decisions. But as a global academic community, we can reconcile our priorities with how we evaluate researchers by modifying the standards we use in research evaluations.
October 19, 2018
To mark DORA’s fifth anniversary, we are celebrating with an interview series that focuses on implementing good practices in research assessment. We are pleased to announce our fourth interview is with Prof. Christopher Jackson, Equinor Professor of Basin Analysis, Department of Earth Science & Engineering at Imperial College. He will answer questions about how researchers can help change the culture of research assessment. Participants will have the opportunity to ask questions at the end of the interview.
August 27, 2018
In celebration of DORA’s fifth anniversary, we are hosting a four-part interview series that focuses on the implementation of good practices in research assessment. We are pleased to announce our third interview is with Sir Philip Campbell, Editor-in-Chief of Springer Nature. He will answer questions about the role of publishing in research evaluation. There will be an opportunity for participants to ask questions.
July 18, 2018
We are pleased to announce our second interview is with Dr. Shahid Jameel, Chief Executive Officer of the Wellcome Trust/DBT India Alliance. He will answer questions about his experience with research evaluation in the life sciences and about making funding decisions at the India Alliance. Participants will have an opportunity to ask questions at the end of the interview.
July 6, 2018
Charité asks candidates not just to list their top papers, but also to explain their role in each one and justify why each constitutes a significant advance in research or clinical practice. This helps committees to see beyond brand-name journals and into the specific skills individuals offer. The portal requires applicants to provide a short summary describing how they have pursued the objectives of Open Science, including preregistration of studies and publication of negative results. They also outline important collaborative projects and interactions with stakeholders in industry, patient care, and policy. Questions like these provide insight into individuals’ abilities and personal context that neither publication lists nor journal impact factors can offer.
May 2, 2018
To celebrate a successful five years, we are kicking off a live interview series to hear firsthand from individuals who are improving research assessment in their communities.
March 9, 2018
Publishing is a significant part of a researcher’s career and essential for scientific success. However, biases built into peer review and research assessment create an uneven playing field.