By Anna Hatch and Stephen Curry (DORA)
Universities cannot achieve their missions and visions if their stated values are out of line with research assessment policies and practices. Although most university mission statements specify research, teaching, and public service as their central commitments, contributions to research are often valued at the expense of teaching and public service. How serious is this misalignment and what can be done about it?
Mission and vision statements often also convey other valued aspects of scholarly work. For example, Cornell University explicitly mentions its collaborative culture.
“Cornell aspires to be the exemplary comprehensive research university for the 21st century. Faculty, staff and students thrive at Cornell because of its unparalleled combination of quality and breadth; its open, collaborative and innovative culture; its founding commitment to diversity and inclusion; its vibrant rural and urban campuses; and its land-grant legacy of public engagement.”
The University of California, Los Angeles places an emphasis on open access, respect, and inclusion.
“UCLA’s primary purpose as a public research university is the creation, dissemination, preservation and application of knowledge for the betterment of our global society. To fulfill this mission, UCLA is committed to academic freedom in its fullest terms: We value open access to information, free and lively debate conducted with mutual respect for individuals, and freedom from intolerance. In all of our pursuits, we strive at once for excellence and diversity, recognizing that openness and inclusion produce true quality.”
Excerpt from the University of California, Los Angeles mission statement
The public dimensions of scholarly work directly relate to the public missions of many universities, but they are still commonly undervalued in review, promotion, and tenure policies. At DORA we believe that the clearest route for research institutions to enhance their research assessment policies and practices is to build them on the solid foundation of their institutional values.
Closing the gap
This is easier said than done, but there are excellent examples of practical steps that can be taken. For instance, working groups can help institutions co-create how their values are to be embodied in research assessment policies and practices. In particular, by bringing together a diverse and representative group of university members, standards and processes for evaluation can be developed that have buy-in from the staff who are most likely to be called upon to conduct assessments, for example in recruitment or promotion processes.
Working groups can operate in different ways. For example, the University Medical Center (UMC) Utrecht hosted a series of meetings to collect input on research assessment from the academic community on campus. Policies were developed based on the feedback that was received. The Universitat Oberta de Catalunya (UOC) used a different strategy and assembled a formal task force to consider how to improve their research evaluation processes prior to signing DORA. This led to the creation of a multi-year action plan for the university to implement DORA, which includes recognizing the value of a broad set of outputs and outcomes from their research.
Journal-based evaluation is so deeply rooted in academic culture that new policies alone are not guaranteed to bring about change in how researchers are assessed. Key to that change is building trust that new, values-based policies will be enacted.
Community engagement is essential for building that trust – and for aligning policies and practice. While workshops are an excellent tool for involving staff, they cannot involve everyone, so communication with the wider academic community at the institution is vital. UMC Utrecht made sure that reports from their workshop discussions, along with interviews with participants, were published on the university’s internal website to engage the whole academic community on campus.
There are other ways to engage academics in research assessment reform. UOC is currently building support for its policy changes through presentations and training sessions on campus. Imperial College London hosted a half-day workshop in 2018 to discuss how the landscape of research assessment is changing.
Transparency is another key element for building trust in research assessment policies and practices. While there are many ways to increase transparency, rubrics (i.e., criteria of assessment) offer a versatile option. They can be shared as information with applicants at the beginning of the process, or used to provide individualized feedback when the assessment is complete. The University of California, Berkeley created a Rubric to Assess Candidate Contributions to Diversity, Equity, and Inclusion that departments can use. To increase consistency in its teaching evaluations, the Center for Teaching Excellence at the University of Kansas developed criteria that spanned seven dimensions of teaching practice.
Departments and institutions may also choose to openly share information about the integrity of the assessment process with applicants. For example, are applicants de-identified at any of the steps? Do applicants have faculty advocates (as happens in the Cell Biology Department at the University of Texas Southwestern Medical Center)? Or is an independent observer present during the decision-making discussions (a practice followed at Berlin’s Charité University hospital)? The more information that is shared about evaluation processes, the greater their credibility among those who are evaluated.
Approaching research assessment reform can be daunting – a mountain to climb. There are many areas where we could and should do better, but where to start? We believe that leveraging institutional values to drive change is the most natural route forward, harnessing principles that are shared by most scholars and researchers. And we hope that the examples given above show some of the steps that others have already taken. All it takes to climb a mountain is to proceed one step at a time.