Case Study

University Medical Center Utrecht

Interview conducted 29 September 2020

University Medical Center (UMC) Utrecht changed its approach to academic assessment through the development and implementation of a new evaluation framework. The purpose of the framework is to move beyond bibliometrics-based evaluations; it formally requires qualitative indicators and a descriptive portfolio when making hiring and promotion decisions. While change was stimulated at the top by the Dean of Research, faculty members provided input into the development of the new evaluation criteria. In 2019, UMC Utrecht published the results of its new research evaluation approach, covering the period 2013-2018.

UMC Utrecht carried out its evaluation within the framework of the national research evaluation plan, anticipating and adding to changes (e.g., an emphasis on Open Science practices) that the Association of Universities in the Netherlands introduced in 2021. From that date, all universities in the Netherlands emphasize Open Science in their periodic research evaluations, in accordance with the Strategy Evaluation Protocol.

Interview originally conducted September 2020. Case study updated August 2023.

Who: Organization profile

Country: Netherlands
Profile of institution: specialized university or equivalent
Number of FTE researchers: > 1,000
Organization of research evaluation: faculty/department level; institutional/university level
Who is involved? Academic leadership; academic researchers; policy staff

What: What changed and the key elements of change

UMC Utrecht changed its evaluation framework at multiple levels: institutional, faculty, and PhD candidate. In addition, changes were made to the evaluation procedure itself. This framework, a locally adapted version of the national Standard Evaluation Protocol (SEP), can be used to judge groups or teams, not just individuals.

  • At the institutional level, research was organized into trans-divisional, multidisciplinary, disease-based programs, and the evaluation of research focused on scientific output in relation to patient stakeholders and societal impact.
  • At the level of individual faculty, UMC Utrecht replaced CVs with portfolios for promotion decisions. Within the portfolio, applicants are asked to discuss achievements in relation to five UMC Utrecht mission-based domains.
  • At the level of PhD candidates, UMC Utrecht developed and implemented annual evaluation forms that make visible a broad range of possible predoctoral accomplishments.1

As part of this reformed evaluation framework, UMC Utrecht formally requires and enforces consideration of qualitative indicators and a descriptive portfolio in order to broaden the discussion around each candidate.

Recently, UMC Utrecht defined six academic career profiles at the associate professor level. Each profile consists of a summary, expected outcomes and activities, and evaluation criteria, including suggested indicators. The career profiles are: Academic educator, Clinical researcher, Exploratory researcher, Implementation researcher, Methodology & Technology researcher, and Valorization researcher. Aspiring associate professors choose a career profile and fill out a related qualification portfolio, which is then judged by a promotion committee.

Why: Motivation for change

The purpose of the reform was to “create policies that ensured individual researchers would be judged on their actual contributions and not on the counts of the publications” and to encourage “research programmes […] geared towards creating societal impact and not just scientific excellence.”2 As UMC Utrecht is a university medical center (UMC), there was also early discussion regarding the “mismatch between the mission of UMCs and the incentive and reward system for researchers.”3

The new academic career profiles aim to acknowledge and describe a wide range of academic research activities and research contexts, not limited to quantifiable output. Currently dominant research evaluation criteria are often unintentionally reductive and formative, as they reduce researchers to the quantifiable output that makes them ‘valuable’. By introducing these new roles, UMC Utrecht is attempting to stimulate diversity and steer away from often implicit universalist notions of quality.

How: Processes and dynamics for developing, implementing and managing change

Change at UMC Utrecht was “stimulated at the top” but the “criteria were influenced by the faculty members who expect to be judged by those standards.”2 Academic assessment is now under the jurisdiction of the Research Office and Human Resources.

Before the formal policy changes at UMC Utrecht, informal discussions about the state of science were organized by a group called Science in Transition (SIT). The group was formed by four senior researchers, one of whom was the Dean of Research. SIT began holding workshops attended by diverse populations of researchers at UMC Utrecht, ranging from PhD students to department heads.

The workshops opened up topics for discussion that led to the development of new academic assessment policies. For example, participants questioned whether dominant measures of research quality should apply to all of the disciplines in the center, and debated whether all research papers should be counted and valued equally in evaluations. Top-down support from the Dean was instrumental in early efforts to promote discussion of reform and to give visibility to the policy development process. Additionally, top-down support made room for attitudes toward the new evaluation framework to evolve.

An autonomous committee of ~15 mid-career scientists and clinicians then came up with a revised definition of excellence and new evaluation policies.4 Using the UK Research Excellence Framework (REF) as a guide, the committee developed a “suite of semi-qualitative indicators that include conventional outcome measurements, evaluations of leadership and citizenship across UMC Utrecht and other communities, as well as assessments of structure and process, such as how research questions are formed and results disseminated.”1

Specific obstacles included: resistance to research assessment reform among researchers; the absence of incentivizing policies or guidelines (e.g., from national/regional governments or research funding organizations); and the difficulty of aligning institutional assessment procedures with nationally and internationally dominant procedures.

When: Timeline for development and implementation

Science in Transition (SIT) was formed in 2012 by a group of researchers invested in the creation of a better research culture. In 2013, SIT hosted workshops and published a position paper in which they concluded that “bibliometrics were overemphasized and societal impact was undervalued.”5

In two successive institutional research evaluations in 2013 and 2019, UMC Utrecht increasingly emphasized societal impact and Open Science. The portfolio to facilitate the evaluation of professors and associate professors was introduced from 2016 onward. A new PhD evaluation form has been under development since 2019.

Deeper analysis of the impact of the new evaluation framework is underway in collaboration with the Centre for Science and Technology Studies at Leiden University. Going forward, UMC Utrecht will adhere to the new national evaluation framework, the Strategy Evaluation Protocol, effective 2021-2027 across the Netherlands.6

The six academic career profiles were developed and tested between 2021 and 2022. In December 2022, the first associate professors were appointed at UMC Utrecht in the new profiles.

References

  1. Algra, A., Koopman, I., & Snoek, R. How young researchers can re-shape the evaluation of their work. Nature Index (2020). Retrieved 22 November 2020 from: https://www.natureindex.com/news-blog/how-young-researchers-can-re-shape-research-evaluation-universities
  2. Benedictus, R., Miedema, F. & Ferguson, M. W. J. Fewer numbers, better science. Nat. News 538, 453 (2016).
  3. Benedictus, R. Changing the academic reward system, the UMC Utrecht perspective. Open Working (2018). Retrieved 22 November 2020 from: https://openworking.wordpress.com/2018/06/24/changing-the-academic-reward-system-the-umc-utrecht-perspective/
  4. Guide for reviewers/evaluators that use the UMC Utrecht indicators for impact. Retrieved 22 November 2020 from: https://assets-eu-01.kc-usercontent.com/546dd520-97db-01b7-154d-79bb6d950a2d/a2704152-2d16-4f40-9a4b-33db23d1353e/Format-Impact-indicator-evaluation-pilot-incl-introduction.pdf
  5. Dijstelbloem, H., Miedema, F., Huisman, F. & Mijnhardt, W. Why Science Does Not Work as It Should And What To Do about It. (2013). Retrieved 23 November 2020 from: http://scienceintransition.nl/app/uploads/2013/10/Science-in-Transition-Position-Paper-final.pdf
  6. Strategy Evaluation Protocol 2021-2027. VSNU, KNAW, & NWO (2020). Retrieved 22 November 2020 from: https://www.nwo.nl/sites/nwo/files/documents/SEP_2021-2027.pdf