PLOS Response to the HEFCE RFI on Metrics in Research Assessment


The Higher Education Funding Council for England (HEFCE), the body that manages the UK's Research Excellence Framework (REF), recently announced an enquiry into the use of metrics in research assessment. HEFCE's views on research assessment matter a great deal to UK universities because the REF distributes a substantial proportion of the UK's research funding as block grants on the basis of that assessment. As part of this process the enquiry committee issued a call for evidence. The covering letter and summary of the PLOS submission are provided below; the full PLOS RFI response is available on Figshare.

Dear Committee Members,

Thank you for the opportunity to respond to your call for evidence. PLOS has been at the forefront of experimenting with and advocating for new modes of research assessment for a decade. Recent developments such as DORA and your own enquiry suggest that the time is right for a substantive reconsideration of our approaches to, and tools for, research assessment.

Our ability to track the use of research through online interactions has increased at an unprecedented rate, providing new forms of data that might be used to inform resource allocation. At the same time the research community has profound misgivings about the "metrication" of research evaluation, as demonstrated by submissions to your enquiry by, for example, David Colquhoun or by Meera Sabaratnam and Paul Kirby (although see also a response by Steve Fuller). These disparate strands need not, however, be in direct opposition.

As a research community we are experienced in working with imperfect and limited evidence. Neither the uncritical adoption of data nor the wholesale rejection of potentially useful evidence should be countenanced. Rather, we should bring all the critical faculties that we apply to research itself to gathering and critiquing the evidence relevant to the question being asked. We would argue that the usefulness of any given indicator or proxy, whether qualitative or quantitative, depends on the question or decision at hand.

In establishing the value of any given indicator or proxy for answering a specific question, we should therefore bring a critical scholarly perspective to the quality of the data, the appropriateness of any analysis framework or model, and the way the question itself is framed. Such considerations may draw on approaches from the quantitative sciences, the social sciences or the humanities, or ideally a combination of all three, and in doing so must adhere to scholarly standards of transparency and data availability.

In summary, therefore, we will argue in our answers to the questions you pose that there are many new (and old) sources of data that can provide valuable quantitative and qualitative evidence to support the evaluative and resource allocation decisions associated with research assessment. To date, the application and analysis of this data has been naive, and has been limited by problems of access to the underlying data and by proprietary control. Enabling rich critical analysis requires that we work to ensure that the data is openly available, that its analysis is transparent and reproducible, and that its production and use are subject to full scholarly critique.

Yours truly,
Cameron Neylon
Advocacy Director
PLOS

Summary of Submission

  1. The increasing availability of data on the use and impact of research outputs, as a result of the movement of scholarship online, offers an unprecedented opportunity to support evidence-based decision-making in the allocation of research resources.
  2. The use of quantitative or metrics-based assessment across the whole research enterprise (e.g. in a future REF) is premature: our access to data, our understanding of its quality, and the tools for its analysis are all still limited. In addition, it is unclear whether any unique quality of research influence or impact is sufficiently general to be measured.
  3. To support improvements in data quality, sophisticated and appropriate analysis, and scholarly critique of how the data is analysed and applied, it is crucial that the underlying usage data used to support decision making be open.
  4. To gain acceptance of the use of this evidence in resource allocation decisions, it is crucial that the various stakeholder communities be engaged in a discussion of the quality, analysis and application of such data. Such a discussion must be underpinned by transparent approaches and systems that support the community engagement that will lead to trust.
  5. HEFCE should take a global leadership position in supporting the creation of a future data and analysis environment in which a wide range of indicators, acting as proxies for many diverse forms of research impact (in its broadest sense), are openly available for community analysis, use and critique. HEFCE is well placed, alongside other key stakeholders, to support pilots and community development towards trusted community observatories of the research enterprise.

PLOS Response to the HEFCE RFI on Metrics in Research Assessment by PLOS Opens, unless otherwise expressly stated, is licensed under a Creative Commons Attribution 4.0 International License.
