
Evaluating Research Quality:
The UK approach
David Otley
Distinguished Professor of Accounting & Management (Emeritus)
Lancaster University Management School
Main Panel chair RAE 2008
The UK RAE – now REF 2014

System designed by the UK government funding agency (HEFCE in England)
Objective is to allocate baseline funding to universities, similar in amount to that provided by research grants
Based on peer review using subject panels, covering a five-year(-ish) period
Emphasis on the best 4 pieces of published work by each person submitted
Outcomes

Result is a profile of work for each submitting unit (e.g. department) graded as:
  4*  top international quality
  3*  high international quality
  2*  reputable international quality
  1*  national quality only
  0*  lesser quality or not research work
Funding is allocated on this basis, with a formula decided after the results are determined
Heavily biased towards top-end work
Consequences (in Business & Management)

Amount of funding is relatively low and has been decreasing
For 2014, only 4* and 3* work will attract funding, and in a ratio of 7:1
Equally, if not more, important is the effect of outcomes on league tables, where a weighting of 4/3/2/1 is generally used (both weightings are illustrated in the sketch after this list)
Institutions are not compelled to enter all research-active staff, and may select whom to submit to achieve some desired outcome
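As a purely illustrative sketch (the quality profile below is invented, and this is not the official REF funding formula, only the 7:1 and 4/3/2/1 weightings mentioned above), the following Python snippet shows how the same quality profile produces a league-table GPA and a funding weight:

```python
# Illustrative only: how one quality profile feeds a league-table GPA
# (weights 4/3/2/1) versus a funding weight that counts only 4* and 3*
# work in a 7:1 ratio. The profile shares are invented.

profile = {"4*": 0.20, "3*": 0.45, "2*": 0.25, "1*": 0.10, "0*": 0.00}

# League-table GPA: each star level contributes its star value times its share.
gpa = sum(int(grade[0]) * share for grade, share in profile.items())

# Funding weight: only 4* and 3* shares count, with 4* worth seven times 3*.
funding_weight = 7 * profile["4*"] + 1 * profile["3*"]

print(f"GPA (4/3/2/1 weighting): {gpa:.2f}")            # 2.75
print(f"Funding weight (7:1):    {funding_weight:.2f}")  # 1.85
```

The point of the contrast is that the funding calculation rewards only the top of the profile, whereas the GPA averages across all of it.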
Institutions operate more stringent rules internally

Most commonly to focus on journal articles in preference to other forms of output
These tend to be evaluated by reference to some journal list (e.g. the ABS list) rather than being peer reviewed
It is assumed that an article in an N* journal (however determined) is itself of N* quality
  Tendency to follow US norms, given the location of many 'top' journals
  Neglect of good work published in lesser journals
Choice of staff submitted

Odd quirk that allows only selected staff to be submitted
  Submitting fewer staff will reduce funding (if it means some good work is not submitted because it is accompanied by work of a lesser standard)
  But it will increase the GPA used in the league tables (see the sketch below)
  Unclear whether the proportion of staff submitted will be reported (due to technical difficulties in counting!)
    Implies that many reported grades will be of little value in assessing departmental quality, as they are affected by the proportion of staff submitted
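A hypothetical illustration of this selection effect, using invented per-person scores (a simplification; real submissions are graded per output, not per person):

```python
# Hypothetical illustration of staff selection; all numbers are invented.
# Each value stands for one person's average output quality on the 0-4 scale.
all_staff = [3.8, 3.5, 3.2, 3.0, 2.8, 2.5, 2.0, 1.5]

def gpa(scores):
    return sum(scores) / len(scores)

# Submitting everyone: more work returned, but a lower average grade.
print(f"All 8 staff submitted:  GPA = {gpa(all_staff):.2f}")   # 2.79

# Submitting only the strongest half: less volume, but a higher GPA.
selected = sorted(all_staff, reverse=True)[:4]
print(f"Best 4 staff submitted: GPA = {gpa(selected):.2f}")    # 3.38
```

Without knowing what proportion of eligible staff was submitted, the higher GPA cannot be compared fairly with the lower one, which is the concern raised above.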
New feature for 2014 – Impact

Previous exercises had concentrated on academic relevance and quality
REF 2014 adds the idea of usefulness to users of research, in a measure of 'impact' weighted at 20% of the overall grade
Impact will be assessed on the basis of case studies, documented with external evidence, at a rate of one case per 10 members of staff submitted (a rough sketch of the arithmetic follows this list)
  Another potential area of manipulation
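A rough sketch of the arithmetic stated above, assuming one case study per 10 staff (rounding up here is an assumption) and a simple blend in which impact contributes 20% of the overall grade; the staff number and scores are invented, and the real REF grading is more detailed:

```python
import math

# Rough sketch using the rules stated on this slide:
#   - impact counts for 20% of the overall grade,
#   - one impact case study per 10 staff submitted (rounding up is an assumption).
# The staff number and the example scores are invented.

staff_submitted = 23
cases_required = math.ceil(staff_submitted / 10)
print(f"Case studies required for {staff_submitted} staff: {cases_required}")  # 3

output_score = 2.9   # invented average grade for submitted outputs (0-4 scale)
impact_score = 3.5   # invented average grade for the impact case studies (0-4 scale)

# Simplified blend: impact at 20%, everything else treated as the output score.
overall = 0.8 * output_score + 0.2 * impact_score
print(f"Overall grade with impact at 20%: {overall:.2f}")  # 3.02
```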
Conclusions

By their actions, institutions are imposing a more mechanistic regime than that intended by the funding bodies
This impacts on the type of research conducted and the places in which it is published
Although the evaluation methods may have been valuable initially, they are becoming increasingly problematic
The impact of impact remains to be seen!