Auditing the Auditors: An evaluation of the REF2021 output results for UoA 16

The UK has recently concluded the Research Excellence Framework (REF2021) exercise, involving 157 universities, 76,132 academic staff and 185,594 research outputs. The nationally constituted panels reviewed the submissions according to three factors: the quality of outputs (60%), the impact beyond academia (25%), and the environment that supports research (15%).

This evaluation has both financial and reputational consequences: the government allocates the block research grant according to both the quality and the quantity of the units’ submissions, while the REF team publishes league tables and assorted statistics on the performance of different universities.

The Stern report (not the climate one) estimated that REF2014 cost £246 million to conduct. Given the high cost devoted to the evaluation exercise, one might ask: how informative are these REF results? Do they deliver value for money? Is there a better way of evaluating research?

Our paper evaluated the REF2021 evaluation process for Economics (Unit of Assessment 16, Economics and Econometrics), focusing primarily on the output component, which is by far the largest part of the REF and also the one most amenable to analysis.

A total of 25 universities submitted to UoA 16, with 973 staff and 2,232 outputs. Outputs are classified as 4*, 3*, 2*, 1*, or Unclassified (although the last category was tiny in Economics).

The REF panel's paper-specific evaluations were not disclosed, so, unlike the peer-review process of journals, there is no feedback and no transparency. To infer the average quality of journals from the REF2021 output results, we used an algorithm first proposed by Hole (2017) that finds the journal quality scores that best match the output quality scores at the institution level. Our results show that REF outcomes and the standard journal rating system, the Chartered ABS Academic Journal Guide, are closely related, and that the inferred quality at the top is a close match. The four journals with the highest implied quality (Quarterly Journal of Economics, Econometrica, Journal of Political Economy and Quantitative Economics) agree with the ABS ranking, which itself agrees with the wider profession's evaluations.
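The idea behind this kind of inference can be illustrated with a minimal sketch. The sketch below uses entirely made-up data (three hypothetical journals, four hypothetical institutions); Hole's actual method is more involved, but the core is a least-squares problem: choose journal scores so that each institution's implied average best reproduces its observed REF GPA.

```python
import numpy as np

# Illustration of the idea behind inferring journal quality from
# institution-level REF scores. All data below are hypothetical.

journals = ["Journal A", "Journal B", "Journal C"]

# counts[i, j] = number of outputs institution i published in journal j
counts = np.array([
    [5, 3, 2],   # institution 1
    [1, 6, 3],   # institution 2
    [2, 2, 6],   # institution 3
    [4, 4, 2],   # institution 4
])

# Observed REF output GPA for each institution (hypothetical)
ref_gpa = np.array([3.4, 2.9, 2.5, 3.2])

# Share of each institution's outputs appearing in each journal
shares = counts / counts.sum(axis=1, keepdims=True)

# Least-squares fit: journal scores s minimising ||shares @ s - ref_gpa||
scores, *_ = np.linalg.lstsq(shares, ref_gpa, rcond=None)

# Rank the journals by their implied quality score
for name, s in sorted(zip(journals, scores), key=lambda t: -t[1]):
    print(f"{name}: implied quality {s:.2f}")
```

With real REF data there are many more institutions than journals of interest, so the system is overdetermined and the fitted scores summarise how the panel implicitly valued each journal.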

To better understand the relationship between the REF outcomes and the Chartered ABS journal ranking, we converted the quality percentages into an overall Grade Point Average (GPA). Our results show that the REF and the Chartered ABS AJG GPAs are a close match, with a correlation of 91%, as shown in the graphic.
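The GPA conversion is simply a weighted average of the star ratings, with each n* output contributing n points and Unclassified contributing zero. A small sketch (the example profile is made up):

```python
def ref_gpa(pct_4, pct_3, pct_2, pct_1, pct_unclassified=0.0):
    """Convert a REF quality profile (percentages summing to 100)
    into a Grade Point Average: each n* output contributes n points,
    Unclassified contributes 0."""
    return (4 * pct_4 + 3 * pct_3 + 2 * pct_2 + 1 * pct_1) / 100.0

# Hypothetical profile: 45% at 4*, 40% at 3*, 12% at 2*, 3% at 1*
print(ref_gpa(45, 40, 12, 3))  # → 3.27
```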

We found that other factors, such as the average level of citations, the average age of the papers, and the number of submissions by the institution, provide no additional explanatory power in the presence of the Chartered ABS GPA.
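A claim of this kind is typically checked by comparing a regression of REF GPA on the ABS GPA alone against one that adds the extra regressors and asking whether the fit improves. The sketch below uses synthetic data constructed so that only the ABS GPA matters; it is an illustration of the comparison, not our actual estimation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 25  # same number of institutions as in UoA 16

# Synthetic data (illustration only): REF GPA is driven by ABS GPA,
# while citations and paper age carry no independent information.
abs_gpa = rng.uniform(2.0, 3.8, n)
citations = rng.normal(50.0, 10.0, n)
age = rng.uniform(2.0, 8.0, n)
ref_gpas = 0.2 + 0.95 * abs_gpa + rng.normal(0.0, 0.05, n)

def r_squared(X, y):
    """R-squared from an OLS fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_base = r_squared(abs_gpa, ref_gpas)
r2_full = r_squared(np.column_stack([abs_gpa, citations, age]), ref_gpas)
print(f"R² with ABS GPA only: {r2_base:.3f}")
print(f"R² adding citations and age: {r2_full:.3f}")
```

On data like these the incremental R² from citations and age is negligible, which is the pattern we report for the real institutions.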

When we compare the university rankings according to the REF-implied GPA and the Chartered ABS-implied GPA, the association is high: UCL ranks first on both measures, while Oxford and Cambridge rank 10th and 11th on the REF-implied GPA, with the order reversed in the Chartered ABS ranking.

The very high correlation between the Chartered ABS-implied ranking and the REF-implied ranking could arise from two scenarios: either the REF panel spends an enormous amount of time reviewing outputs and comes to the same conclusions as the journals, or the panel implicitly follows the journal hierarchy without much reinterpretation. Either way, an automated evaluation based on journal labels can deliver almost identical results at far lower cost.

Presumably, the cost of delivering REF2021 will turn out to be much greater than the £246 million that the Stern report estimated for REF2014. Universities formed large management teams to deliver their submissions, and those teams are already preparing for the next exercise. The estimated figure presumably did not include the time of the academics involved, only management hours. For example, OUP guidelines for referees suggest a minimum of two hours' work per paper, per round, per referee, per journal. The REF process ignores the huge investment of time by academic referees during the peer review that leads to journal publication, and asks universities and panels to do this work all over again for 185,594 outputs!

In disciplines such as Economics, where the journal hierarchy is widely perceived to be informative (imperfectly so, of course), this seems like overkill. The alternative adopted in most countries, including the United States, is to fund research through individual and team grants rather than at the whole-university level: if you have a good idea, you ask for funds to support it. At a time when the Government is looking for efficiency savings, perhaps the REF process is a good place to start! Only, please don't replace it with KPMG.


Professor Oliver Linton is Professor of Political Economy and Director of Research for the Faculty of Economics, University of Cambridge; Emily Xu is an undergraduate student in Economics at the University of Cambridge. 



Battistin, E. and Ovidi, M. (2022), Rising Stars: Expert Reviews and Reputational Yardsticks in the Research Excellence Framework. Economica, 89: 830-848.

Chartered Association of Business Schools, Academic Journal Guide (commonly known as the ABS list).

Hole, A.R. (2017), Ranking Economics Journals Using Data From a National Research Evaluation Exercise. Oxford Bulletin of Economics and Statistics, 79: 621-636.

Linton, O.B. and E. Xu (2022). Auditing the Auditors: An evaluation of the REF2021 Output Results. Cambridge Working Papers in Economics, CWPE 2266.

Stern, N. (2016). Building on Success and Learning from Experience: An Independent Review of the Research Excellence Framework.