Annual report: a recap of activities of the San Francisco Declaration of Research Assessment (DORA) in 2019

Over the past year, it has become apparent that the declaration represents just one part of DORA’s portfolio of activities. In 2019, DORA added resources and examples of good practice to its website, organized sessions at academic conferences, published perspective pieces, hosted virtual events, and co-sponsored our first meeting with the Howard Hughes Medical Institute.

A total of 1,991 new individuals and 713 new organizations signed DORA in 2019. As DORA continues to grow, the declaration remains a key tool to drive institutional action. Signatures help hold institutions accountable: when an advertisement for a postdoc position at ETH Zurich in June called for candidates who had published in select journals, the Twitter community used the institution’s commitment to DORA as leverage to change the position’s requirements. ETH Zurich issued an apology, and the advertisement was updated to reflect DORA principles.

Signing DORA also prompts universities to develop concrete plans to improve their research assessment policies and practices. For example, the Universitat Oberta de Catalunya demonstrated its commitment to DORA by releasing an action plan that provides target deadlines for specific actions.

The strategic goals outlined in the DORA Roadmap guide our activities as well as our vision of advancing practical and robust approaches to research assessment:

  • Build awareness of the issues,
  • Catalyze reform and improved behavior, and
  • Extend the disciplinary and geographic reach of DORA.

This recap summarizes DORA’s activities in 2019 and the progress we have made to meet these goals and improve research assessment.

Building awareness of the issues

Many signers seek guidance on how to implement policies and practices that are considered to be “DORA-compliant.” While some individuals view signing DORA as a holistic commitment to responsible research assessment, others remain focused on the declaration’s primary recommendation:

“Do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in hiring, promotion, or funding decisions.”

This narrow interpretation leaves many signers asking what they should do instead. So in January, we published the blog post, “You’ve signed DORA, now what?,” to provide clarification and offer next steps. This was followed by a blog post in May, “DORA: accentuating the positive,” to highlight the aspirational recommendations outlined in the declaration.

DORA hosted two conference sessions this year. In February, DORA used its session at the American Association for the Advancement of Science (AAAS) meeting in Washington, DC, to bring attention to biases in research assessment. At the ASCB|EMBO meeting in December, participants at DORA’s interactive session used a mock-review exercise to informally test different approaches to triaging applications for a faculty search.

Conference sessions help DORA raise awareness of the issues and surface new ideas, but participation is limited to meeting attendees. So the DORA blog and other forms of communication, including perspective pieces and social media, help spread information more broadly. For example, the summary of the career enhancement programming session, “How to improve research assessment for hiring and funding decisions,” at the 2018 ASCB|EMBO meeting was published in April as a blog post for Inside eLife. During the session, participants worked in small groups to provide practical feedback on applications for grant funding and faculty positions.

In December, Nature published a World View column written by DORA’s program director, Anna Hatch, “To fix research assessment, swap slogans for definitions,” that encouraged the academic community to use conceptual clarity as a mechanism to improve researcher evaluation. Broad terminology like “world-class” or “high-impact” increases room for misinterpretation during research evaluation, permits uneven comparisons, and helps perpetuate the status quo.

The number of DORA’s Twitter followers doubled in 2019, bringing the total above 6,000. Social media continues to be a primary way that DORA interacts with members of the scholarly community. New policies and practices are shared through DORA’s Twitter account (@DORAssessment), as are research findings and other information.

Catalyzing reform

A meta-research study published in January revealed gender inequalities among co-first authors on research papers and remarked on their downstream effects on the ways researchers are evaluated, underscoring the power that publishers hold in the academic reward and incentive system. The majority of DORA’s organizational signers are journals and publishers. To offer ideas about what else journals can do to support research assessment reform beyond abandoning the impact factor, Steering Committee member Mark Patterson and program director Anna Hatch wrote a feature published in Science Editor in April, “How Journals and Publishers Can Help to Reform Research Assessment.” The piece includes a call to action with 10 recommendations that are achievable by most journals and publishers.

In October, DORA co-sponsored a meeting with the Howard Hughes Medical Institute (HHMI): Driving Institutional Change for Research Assessment Reform. This was DORA’s first major convening, bringing together a diverse group of 60 stakeholders to discuss new practices in research assessment and explore different approaches to culture change. More than 30 research institutions in North America and Europe were represented. While the meeting focused on institutional change in the United States, the webcast and resources generated are intended for a global audience. The meeting webpage includes links to the recordings of the sessions webcast on YouTube, participant commentaries, and a curated selection of background reading. Based on discussions at the meeting, DORA collaborated with Ruth Schmidt, assistant professor at the Illinois Institute of Technology, to develop a briefing document that highlights five persistent myths in research evaluation and offers five design principles to help institutions experiment with new policies and practices.

DORA collaborates with other organizations too. For example, DORA collected community feedback on the Résumé for Researchers that was released by the Royal Society in the United Kingdom in October. This type of structured narrative format is designed to support the evaluation of varied academic contributions in addition to traditional peer-reviewed research articles.

One core DORA activity is curating examples of good practices and other resources that stakeholders can use to improve their policies and practices. At the end of the year, there were 28 examples of good practice, including 13 from funders, 2 from societies, and 12 from research institutions, and a list of 15 resources on the web page.

Extending DORA’s reach

Research assessment is a global challenge that is not limited to the misapplication of journal-based metrics. The DORA interview series explores specific challenges related to research assessment and surfaces new practices. In 2019, the series featured thought leaders from Mexico, Argentina, Canada, and Uruguay, who discussed how publishing and open data relate to research assessment and explored responsible research assessment practices in the humanities and social sciences.

The declaration continues to be translated into other languages. There are 20 translations in total; the ones added in 2019 include Arabic, Turkish, Ukrainian, Greek, Serbian, Japanese, Indonesian, Slovenian, and Korean.

DORA’s international Advisory Board continued its work, meeting regularly in association with the Steering Committee. A key role of this board is to ensure that DORA stays in touch with relevant international initiatives.

New research and policy changes

Scholarly communications research reveals intervention points in review, promotion, and tenure policy and practice. Steering Committee member Erin McKiernan and her colleagues published an article in July titled “Use of the Journal Impact Factor in academic review, promotion, and tenure evaluations.” They found that 40% of research-intensive universities mention the Journal Impact Factor in review, promotion, and tenure documents, providing an important benchmark for future studies tracking trends in its use. This was a follow-up to an earlier study that used the same dataset to probe how the public dimensions of faculty work were acknowledged in review, promotion, and tenure.

The report of the expert group to the European Commission, “Future of scholarly publishing and scholarly communication,” was released in January; it recommends that researchers and universities incorporate recommendations from DORA into assessment practices. In the Netherlands, three organizations have banded together to spur change. The KNAW (Royal Netherlands Academy of Arts and Sciences), NWO (Netherlands Organization for Scientific Research), and ZonMw (Netherlands Organization for Health Research and Development) signed DORA in April and issued a joint statement outlining follow-up actions to ensure that DORA principles become rooted in their assessment processes. The group followed this with a position paper in November, “Room for everyone’s talent,” that references DORA and identifies five goals and specific next steps for stakeholders. As part of this effort, NWO piloted a narrative CV format in its major funding stream for early-career researchers, the Veni scheme. Because of the pilot’s success, the narrative CV was expanded to the Vici funding scheme.

In Latin America, Redalyc, Latindex, and CLACSO published a joint open letter in January urging broad support for DORA principles. In November, CLACSO also sponsored a meeting with the National Research and Technology Council of Mexico (CONACYT) for FOLEC, the Foro Latinoamericano sobre Evaluación Científica (Latin American Forum on Scientific Evaluation), which aims to develop local indicators to improve research evaluation in the humanities and social sciences.

Looking ahead

DORA capped off the year by publishing our annual list of the top 10 advances in research assessment, as selected by the Steering Committee and Advisory Board. While progress is being made and new practices are emerging, widespread cultural change is going to take time. A major accomplishment in 2019 was the development of governance procedures for DORA, which clarify DORA’s relationship with its supporting organizations. The governance procedures are freely accessible on the DORA homepage.

In addition to continuing DORA’s core activities in 2020, we seek to strengthen stakeholder relationships and increase communication about new policies and practices. For example, DORA is organizing a virtual discussion series for public and private funders to discuss the progress and outcomes of new policies or pilot initiatives.

DORA is also reaching out to the library community through a series of webinars. Libraries are not specifically mentioned in the declaration, but librarians provide critical bibliometric expertise to academic institutions and are well positioned to initiate change on campus through relationship building.

By building stronger relationships with and actively engaging stakeholders, DORA aims to spread good practices more widely.