Updates from Research England and the Open Research Funders Group

Communication is an important mechanism to increase the uptake of responsible research assessment practices at universities and funding agencies. To help, DORA brings together public and private research funders for a virtual meeting each quarter to discuss new policies, practices, and pilot experiments for assessing research. On Thursday, June 25, 2020, Claire Fraser, senior policy advisor at Research England, and Greg Tananbaum, director of the Open Research Funders Group (ORFG), provided updates on their latest efforts to improve research assessment.

Research England is part of UK Research and Innovation (UKRI). It awards ~£2 billion per year to higher education providers, the majority of which is allocated as block grants. Research block grant funding is informed by the outcomes of a national research assessment exercise, called the Research Excellence Framework (REF), which occurs about every six years. Panels made up of senior academics, international members, and research users are responsible for assessing submissions from higher education providers. For each submission, three distinct elements are assessed: the quality of outputs, the impact of research beyond academia, and the environment that supports research and impact generation. Fraser noted that an underpinning principle is that research outputs will be assessed on a transparent, fair, and equal basis. For peer-reviewed articles, panels are instructed not to use journal-based metrics or journal prestige in their evaluation. Instead, panels are asked to establish assessment criteria formed around the three generic criteria of originality, significance, and rigour. Research outputs that can be submitted are not limited to articles, monographs, and conference contributions. Fraser said that submitted outputs must meet the REF’s definition of research, which is “a process of investigation leading to new insights, effectively shared.” For example, panels also review preprints, software, performance, composition, exhibition, and datasets.

Panel members receive training on how to conduct the REF review process. They are required to adhere to the agreed criteria and to apply assessment standards consistently. To foster a sense of personal accountability, the burden of calling out bad practice during the review, such as relying on journal-based indicators, is largely placed on panel members. Four main panels oversee the assessment carried out by the sub-panels to ensure adherence to the criteria, working methods, and equality and diversity guidance. Sixty percent of the quality profile developed for each submitting unit during REF 2021 is based on outputs; the remainder is split between impact (25%) and environment (15%). The environment narrative statement includes open research strategy and integrity. Fraser noted that the role of research culture could be strengthened in future REF exercises.
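To make the weighting concrete, here is a minimal sketch of how an overall quality profile can be combined from the three assessed elements. The 60/25/15 weights come from REF 2021 as described above; the star-level sub-profile percentages below are hypothetical, invented purely for illustration.

```python
# Illustrative only: combining REF 2021 sub-profiles into an overall
# quality profile as a weighted average. Weights (60/25/15) are from
# REF 2021; the sub-profile percentages are hypothetical examples.

WEIGHTS = {"outputs": 0.60, "impact": 0.25, "environment": 0.15}

# Each sub-profile gives the percentage of work judged at each star level.
sub_profiles = {
    "outputs":     {"4*": 30, "3*": 50, "2*": 15, "1*": 5},
    "impact":      {"4*": 40, "3*": 40, "2*": 20, "1*": 0},
    "environment": {"4*": 25, "3*": 60, "2*": 15, "1*": 0},
}

overall = {
    star: sum(WEIGHTS[element] * profile[star]
              for element, profile in sub_profiles.items())
    for star in ("4*", "3*", "2*", "1*")
}
print(overall)  # {'4*': 31.75, '3*': 49.0, '2*': 16.25, '1*': 3.0}
```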

Through its work with the National Academies of Sciences, Engineering, and Medicine Roundtable on Aligning Incentives for Open Science in the United States, the ORFG is working to make academic assessment reflective of open scholarship practices. There is growing understanding of why open science is important, Tananbaum said, so the Roundtable is now using working groups to produce actionable guidance that addresses how to increase the adoption of open scholarship practices.

One working group has developed signalling language for research assessment policies. The language shows that open scholarship is valued by the institution and incentivizes open practices, such as depositing data, protocols, and code in open repositories. Tananbaum has found that organizations are more willing to use the signalling language because it is phrased in the form of requests rather than requirements; sixteen funders have adopted or committed to adopt the suggested wording to date. The signalling language encompasses both retrospective activities (asking grant applicants how they have made their work openly available in the past, and how it has been used) and prospective activities (asking grant applicants how they plan to share their work in the future). Variations of these templates have also been created for grant reporting, faculty hiring, and annual faculty reporting. Tananbaum believes a major reason the signalling language has been successful is that there is less bureaucracy involved in adding a request to a policy than in adding a requirement. Additionally, the “request” approach can serve as an interim step toward the adoption of more formal policies.

Another ORFG working group is making connections with university provosts and department chairs to discuss open scholarship practices and academic assessment. So far, it has reached out to 54 academic departments in the United States, across 36 institutions and 14 disciplines. These discussions have often included department chairs and clusters of researchers who are already engaging in open science activities. The working group has also found that scholarly societies are good partners for identifying needs and promoting good open science practice within a particular field.

Overall, the Roundtable wants coordinated action to develop open science plans that are appropriate for departments, institutions, disciplines, agencies, and funders. It is prepared to coordinate the effort to understand what open science language, practices, and policies can realistically be implemented at scale across a range of disciplines, organizational types, and geographies.

DORA’s funder discussion group is a community of practice that meets virtually every quarter to discuss policies and topics related to fair and responsible research assessment. If you are a public or private funder of research interested in joining the group, please reach out to DORA’s Program Director, Anna Hatch (info@dora.org). Organizations do not have to be a signatory of DORA to participate.
