Times Higher Education

REF 2021: Times Higher Education’s table methodology

How we analyse the results of the Research Excellence Framework

Published on
May 12, 2022
Last updated
July 16, 2024

The data published today by the four UK funding bodies present the proportion of each institution’s Research Excellence Framework submission, in each unit of assessment, that falls into each of five quality categories.

For output and overall profiles, these are 4* (world-leading), 3* (internationally excellent), 2* (internationally recognised), 1* (nationally recognised) and unclassified (below nationally recognised or fails to meet the definition of research).

For impact, they are 4* (outstanding in terms of reach and significance), 3* (very considerable), 2* (considerable), 1* (recognised but modest) and unclassified (little or no reach or significance).

For environment, they are 4* (“conducive to producing research of world-leading quality and enabling outstanding impact, in terms of its vitality and sustainability”), 3* (internationally excellent research/very considerable impact), 2* (internationally recognised research/considerable impact), 1* (nationally recognised research/recognised but modest impact) and unclassified (“not conducive to producing research of nationally recognised quality or enabling impact of reach and significance”).


For the overall institutional table, Times Higher Education aggregates these profiles into a single institutional quality profile based on the number of full-time equivalent (FTE) staff submitted to each unit of assessment. This reflects the view that larger departments should count for more in calculating an institution’s overall quality.
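As a minimal sketch of this FTE-weighted aggregation (the function name and all figures are illustrative, not drawn from THE’s actual pipeline):

```python
def aggregate_profiles(submissions):
    """FTE-weighted aggregation of unit-of-assessment quality profiles.

    Each submission is (fte, profile), where profile maps star levels
    to percentages summing to 100.
    """
    total_fte = sum(fte for fte, _ in submissions)
    return {
        level: sum(fte * profile[level] for fte, profile in submissions) / total_fte
        for level in ("4*", "3*", "2*", "1*", "u/c")
    }

# Hypothetical institution with two units of assessment:
units = [
    (40.0, {"4*": 50, "3*": 30, "2*": 15, "1*": 5, "u/c": 0}),
    (10.0, {"4*": 20, "3*": 40, "2*": 30, "1*": 10, "u/c": 0}),
]
print(aggregate_profiles(units))
```

The larger 40-FTE unit dominates the combined profile, which is exactly the intended effect of weighting by submitted staff numbers.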

Institutions are, by default, ranked according to the grade point average (GPA) of their overall quality profiles. GPA is calculated by multiplying an institution’s percentage of 4* research by 4, its percentage of 3* research by 3, its percentage of 2* research by 2 and its percentage of 1* research by 1; those figures are added together and then divided by 100 to give a score between 0 and 4.
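That arithmetic can be sketched in a few lines (the helper name and sample profile are hypothetical):

```python
def gpa(profile):
    """Grade point average of a quality profile given as percentages
    of 4*, 3*, 2* and 1* research (unclassified contributes nothing)."""
    weights = {"4*": 4, "3*": 3, "2*": 2, "1*": 1}
    return sum(w * profile.get(level, 0) for level, w in weights.items()) / 100

# e.g. a profile of 44% 4*, 32% 3*, 18% 2*, 6% 1*:
print(gpa({"4*": 44, "3*": 32, "2*": 18, "1*": 6}))  # 3.14
```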


We also present research power scores. These are calculated by multiplying the institution’s GPA by the total number of full-time equivalent staff submitted, and then scaling that figure such that the highest score in the ranking is 1,000. This is an attempt to produce an easily comparable score that takes into account volume as well as GPA, reflecting the view that excellence is, to some extent, a function of scale as well as quality. Research power also gives a closer indication of the relative size of the research block grant that each institution is likely to receive on the basis of the REF results.
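A sketch of that scaling, with made-up GPA and FTE figures:

```python
def research_power(institutions):
    """institutions maps name -> (gpa, fte). Returns GPA x FTE scores
    rescaled so that the highest score in the ranking equals 1,000."""
    raw = {name: g * fte for name, (g, fte) in institutions.items()}
    top = max(raw.values())
    return {name: 1000 * v / top for name, v in raw.items()}

# Hypothetical institutions: a small high-GPA one and a large one.
scores = research_power({
    "Applemouth": (3.2, 500.0),
    "Dayby": (3.5, 1200.0),
})
print(scores)
```

Note that Dayby tops this measure despite Applemouth potentially outranking it on GPA alone, illustrating how research power rewards scale.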

However, block grants are actually calculated according to funding formulas that currently take no account of any research rated 2* or below. The formula is slightly different in Scotland, but in England, Wales and Northern Ireland, the “quality-related” (QR) funding formula also accords 4* research four times the weighting of 3* research. Hence, we also offer a market share metric. This is calculated by using these quality weightings, along with submitted FTEs, to produce a weighted volume score; each institution’s market share is the proportion of all UK quality-related volume accounted for by that institution.
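Under the stated weightings (4* counts four times 3*; 2* and below count for nothing), the market share calculation might look like this; all names and figures are invented:

```python
def qr_volume(profile, fte):
    """Quality-related weighted volume: 4* research carries four times
    the weight of 3*, and 2* and below contribute nothing."""
    return fte * (4 * profile["4*"] + profile["3*"]) / 100

def market_share(institutions):
    """institutions maps name -> (profile, fte). Returns each
    institution's percentage of the total weighted volume."""
    volumes = {n: qr_volume(p, f) for n, (p, f) in institutions.items()}
    total = sum(volumes.values())
    return {n: 100 * v / total for n, v in volumes.items()}

# Two hypothetical institutions of equal size but different quality:
shares = market_share({
    "Applemouth": ({"4*": 50, "3*": 30}, 100.0),
    "Dayby": ({"4*": 20, "3*": 40}, 100.0),
})
print(shares)
```

Because 4* research is weighted so heavily, Applemouth takes nearly two-thirds of this two-institution “market” despite identical FTE counts.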


The 2014 figures are largely taken from THE’s published rankings for that year, although research power figures have been retrospectively indexed.

Note that a small number of institutions may have absorbed other institutions since 2014. In these cases, rather than attempting to calculate a 2014 combined score for the merged institutions, we list only the main institution鈥檚 2014 score.

We exclude from the main tables specialist institutions that entered only one unit of assessment (UoA); these are listed instead in the relevant unit of assessment table.

Note that the figure for number of UoAs entered by an institution counts multiple submissions to the same unit of assessment separately.

For data on the share of eligible staff submitted, note that some institutions have a figure greater than 100 per cent. According to the UK funding bodies, this is because some research staff are not registered in official statistics, owing to internal employment structures at certain institutions.


The separate tables for outputs, impact and environment are constructed in a similar way, but they take account solely of each institution’s quality profiles for that specific element of the REF; this year, those elements account for 60, 25 and 15 per cent of the main score, respectively. These tables exclude a market share measure.
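The stated element weights imply a simple weighted sum; this sketch assumes the three element GPAs are already known (function name and figures hypothetical):

```python
def overall_gpa(output, impact, environment):
    """Combine element GPAs using the REF 2021 element weights:
    outputs 60%, impact 25%, environment 15%."""
    return 0.60 * output + 0.25 * impact + 0.15 * environment

# e.g. element GPAs of 3.2 (outputs), 3.6 (impact), 3.0 (environment):
print(overall_gpa(3.2, 3.6, 3.0))
```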


The subject tables rank institutional submissions to each of the 34 units of assessment based on the GPA of the institution’s overall quality profiles in that unit of assessment, as well as its research power. GPAs for output, impact and environment are also provided.

Where a university submitted fewer than four people to a UoA, the funding bodies suppress its quality profiles for impact, environment and outputs, so it is not possible to calculate a GPA. This is indicated in the table by a dash.


As before, 2014 scores for the subject tables are taken from THE’s published 2014 rankings. However, there are a small number of cases where UoAs from 2014 have merged or split in 2021. Whereas there were four separate UoAs for engineering in 2014, there is only one in 2021. For reasons of comparability, we list 2014 scores for the engineering table based on the combined scores of the engineering UoAs in 2014, weighted according to the FTE submitted to each.
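That FTE-weighted combination amounts to a weighted mean of the 2014 scores; a minimal sketch, with hypothetical GPA and FTE figures:

```python
def combined_gpa(submissions):
    """FTE-weighted mean GPA across several UoA submissions,
    each given as (gpa, fte)."""
    total_fte = sum(fte for _, fte in submissions)
    return sum(g * fte for g, fte in submissions) / total_fte

# e.g. two 2014 engineering UoAs, one three times the size of the other:
print(combined_gpa([(3.4, 30.0), (3.0, 10.0)]))
```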

Contrariwise, while geography, environmental studies and archaeology was a single UoA in 2014, archaeology is a separate UoA in 2021. Since it is not possible to separate out archaeology scores from 2014, the 2014 scores listed for both the archaeology UoA and the geography and environmental studies UoA are the same.

Where an institution did not submit to the relevant unit of assessment in 2014, the relevant fields are marked “n/a”. Where an institution made multiple submissions to a UoA in 2014 and only one in 2021, the 2014 fields are also marked “n/a”.

In some UoAs, single institutions have made multiple submissions. These are listed separately and are distinguished by a letter: eg, “University of Applemouth A: Nursing” and “University of Applemouth B: Pharmacy”.

Where two universities have made joint submissions, these are listed on separate lines and indicated accordingly: eg, “University of Applemouth (joint submission with University of Dayby)”. By default, the institution with the higher research power is listed first.

On the landing page for each subject table, we also give GPA and FTE submission figures for the UoA as a whole, based on the “national profile” provided by the funding bodies.

