Can Metrics Be Used Responsibly? Why Structural Conditions Push Against This

Not waving, exactly, but…

Today, the long-awaited ‘Metric Tide’ report from the Independent Review of the Role of Metrics in Research Assessment and Management was published, along with appendices detailing its literature review and correlation studies. The main take-away is: IF you’re going to use metrics, you should use them responsibly (NB NOT: you should use metrics, and use them responsibly). The findings and ethos are covered in the Times Higher and summarised in Nature by the Review Chair James Wilsdon, and further comments have been published by Stephen Curry (Review team) and Steven Hill (HEFCE). I highly recommend this response to the findings by David Colquhoun. You can also follow #HEFCEMetrics on Twitter for more snippets of the day; comments by Cambridge lab head Professor Ottoline Leyser were a particular highlight.

I was asked to give a response to the report at today’s launch event, following up on the widely endorsed submission that Pablo and I made to the review. I am told that the event was recorded on video and audio, so I will add links when they appear. Until then, here is a short summary of the main points I made:

Why Metrics Cannot Measure Research Quality: A Response to the HEFCE Consultation

Pacioli Euclid Measurement

Update 24th June: 7,500+ views, 100s of shares, 200+ signatories! And a new post with some responses to further issues raised.

The Higher Education Funding Council for England is reviewing the idea of using metrics (such as citation counts) in research assessment. We think using metrics to measure research quality is a terrible idea, and we will be sending them the response below explaining why. The deadline for responses is 12pm on Monday 30th June (to metrics@hefce.ac.uk). If you would like to endorse this paper and have your name added to what we send to HEFCE, please leave your name, role and institutional affiliation in the comments below, or email either ms140[at]soas.ac.uk or p.c.kirby[at]sussex.ac.uk before Saturday 28th June. If you want to write your own response, please feel free to borrow as you like from the ideas below, or append the PDF version of our paper available here.


Response to the Independent Review of the Role of Metrics in Research Assessment
June 2014

Authored by:
Dr Meera Sabaratnam, Lecturer in International Relations, SOAS, University of London
Dr Paul Kirby, Lecturer in International Security, University of Sussex

Summary

Whilst metrics may capture some partial dimensions of research ‘impact’, they cannot be used as any kind of proxy for measuring research ‘quality’. Not only is there no logical connection between citation counts and the quality of academic research, but the adoption of such a system could systematically discriminate against less established scholars and against work by women and ethnic minorities. Moreover, citation counts are highly vulnerable to gaming and manipulation. The overall effect of using citations as a substantive proxy for either ‘impact’ or ‘quality’ could be extremely deleterious to the standing and quality of UK academic research as a whole.

Why metrics? Why now?