Case Study

University of Maryland Department of Psychology

In 2022, the Department of Psychology at the University of Maryland underwent a complete overhaul of its evaluation policies, motivated by a desire to address reproducibility issues in the field and to develop more transparent and inclusive evaluation practices. The department chair began this process with DORA’s principles in mind, working alongside administration, faculty, and a small committee to develop new guidelines that codify open science as a core criterion in tenure and promotion review. In the revised policies and guidelines, the department emphasizes supporting scientists in doing the research they find most valuable to their field and giving them opportunities to discuss their work when under review for tenure, promotion, hiring, or merit review. Authority to change review policy lies with the department chair, and the support of the dean, associate dean for faculty affairs, and staff empowered the chair to reform the department’s evaluation guidelines.

Interview originally conducted March 2024. Case study published June 2024.

Who: Organization profile

Country: United States
Profile of institution: Comprehensive university or equivalent
Number of FTE researchers: < 100
Organization of research evaluation: Faculty/department level
Who is involved? Academic leadership, institutional administrative staff, research department staff

What: What changed and the key elements of change

The Psychology Department changed its tenure and promotion, annual review, and hiring procedures. The changes made to these policies were intended to promote transparent research practices, research rigor, inclusivity, and societal impact, while eliminating reputation-based metrics such as the Journal Impact Factor (JIF) and citation counts.

The working group rewrote the policies from scratch, noting that the existing policies lacked guidelines on equitable research assessment and open science. As the group made modifications and consulted with faculty, it proved difficult to persuade some faculty to accept removing the Journal Impact Factor from the criteria. To address this, the chair maintained routine and open communication with faculty, emphasizing that “when you incentivize Impact Factor, you disincentivize research on big issues that impact small communities.” After discussion and collaborative editing of the newly written criteria, the Journal Impact Factor and citation counts were eliminated.

The new evaluation policies focus on the substance of researchers’ work, moving away from quantitative measures. In the future, the department hopes to place an even greater emphasis on quality of work and to reward people for a selection of their most relevant, high-quality work that they are proud of.

New funds were also established for members of the department, including graduate students, who are working on projects that reflect the values of the new promotion and tenure policies. Additionally, a new awards system is being established to reward faculty for pursuing projects that advance diversity, equity, and inclusion and that showcase open science practices.

Why: Motivation for change

The main motivation for reform in the Department of Psychology was the need to address issues of rigor and reproducibility in the social sciences. As in other disciplines, reproducibility failures have become increasingly common in the social sciences, in part because of external pressures to publish in journals with high Impact Factors. To address this, the Department of Psychology reformed its tenure and promotion policies to consider a wider range of research metrics, giving recognition to a scientist’s work rather than to Impact Factor or journal prestige.

How: Processes and dynamics for developing, implementing and managing change

This thorough reform was initiated by the department chair as part of a comprehensive review and overhaul of their evaluation processes for hiring, annual review, promotion, and faculty awards systems. “The steps involved building awareness, seeking permissions from administrators, developing the policy, editing it, and voting it in. The most important part of it was building awareness and socializing the key principles of reform.”

To build awareness, the chair began emailing “Friday updates” in which he discussed open science, research assessment, and related topics. This exposed all members of the department (e.g., faculty, graduate students, and postdoctoral researchers) to the topic of research assessment and showed them what responsible research assessment looks like. In these emails, the chair also shared information on DORA and data on the appropriate and inappropriate uses of Impact Factors. He found that data sharing was the most helpful aspect of this socialization, noting that faculty were much more receptive to considering the problems with the Impact Factor when they were presented with actual data on the matter. This encouraged faculty to get on board with reforming department policies, and keeping faculty engaged in these topics was crucial to the reform’s success. The chair referenced and drew inspiration from both his own and others’ publications when discussing the potential for reform within the department.

A working group was formed consisting of an assistant professor, an associate professor, and a full professor from within the department. These members were selected not for their complete support of reform, but because they were not entirely convinced. The working group’s range of opinions facilitated conversations that anticipated potential pushback against the criteria, and each member brought a unique perspective from a different career stage and provided feedback on what they believed would be valuable components of the policy.

After the chair wrote the new criteria, he convened the working group to make edits. From there, the working group held open calls with faculty to review the newly written policy, receive feedback, and edit the policy. Once the criteria were more or less complete, the group conducted a final round of review and presented the criteria to faculty at different career levels separately. They first collaborated with the assistant professors, who offered recommendations. They then met with the associate professors, who built on the assistant professors’ recommendations and added their own. Finally, the revised policy was shared with the full professors for their input. This stepwise approach to gathering faculty feedback helped ensure that the department took early-career scientists’ opinions into consideration from the start of the process.

Throughout the reform process, the department used a combination of materials to shape their new policies and compiled them into this repository.

When: Timeline for development and implementation

This reform effort was initiated in 2017 by the department chair and was implemented in the 2022-2023 academic year. Though COVID-19 interrupted the process, the timeline did not waver; the chair even noted that the pandemic gave him an opportunity to step back, review the progress, and consider what still needed to be done. He also highlighted that even when everything turned virtual during the pandemic, the working group was able to continue editing the criteria through online meetings and discussions. The previous guidelines had not been revised since 2006, so the department was eager to implement a refreshed version that served the interests of scientists and their work.

Over the course of five years, the aspect that took the longest was the socialization and education of faculty. Interestingly, the chair noted that the administration was entirely supportive of his efforts and that the faculty were the group that required the most convincing. He found that the data he presented was most influential in gaining their support: it showed that “prestigious” journals do not necessarily publish “better” research than smaller, less well-known journals, and that the quality of research is independent of the journal in which it is published.