STERN Review Outcomes
REF preparations and the Future of RAPG
Professor Pam Thomas

Timeline
• July 2016: Stern Review published.
• By end of December 2016: HEFCE proposals for REF 2021 circulated for consultation with HEIs.
• The Stern Review recommends that the decisions made following the consultation should be published by Summer 2017.
• If the cycle follows the same pattern as REF 2014, the Guidance on Submissions document would be published in July 2018.
• Submissions should be made in 2020, with assessment occurring in 2021 and results published by the end of that year.

Similarities with REF 2014
• Weightings are likely to remain the same:
  Outputs – 65%
  Impact – 20% (this may increase to 25%, with a concomitant decrease in Environment)
  Environment – 15%
• The UoA subject taxonomy is unlikely to change.
• The quality profiles (star ratings system) are also very unlikely to change.

Stern's 12 Recommendations 1 – 2: Outputs
Recommendation 1. All research-active staff should be returned in the REF.
Change: In REF 2014, institutions could select staff so as to maximise their GPA.
Recommendation 2. Outputs should be submitted at UoA level with a set average number per FTE, but with flexibility for some faculty to submit more and others fewer than the average.
Change: Four outputs needed to be submitted per person unless there were special circumstances. Under the new proposal, more outputs could be submitted by one individual.

Stern's 12 Recommendations 3 – 4: Outputs
Recommendation 3. Outputs should not be portable.
Change: In REF 2014, outputs stayed with the author even if they moved institutions prior to the census date. Now, it is proposed that outputs must have been authored at the submitting institution in order to be eligible. This measure is designed to limit the effect of the 'transfer market', although it may simply move it to earlier in the cycle.
The University will not be able to recruit late in the cycle to boost particular subject areas just prior to submission.
Recommendation 4. Panels should continue to assess on the basis of peer review. However, metrics should be provided to support panel members in their assessment, and panels should be transparent about their use.
Change: No real change other than transparency of usage, although it is not clear how this will be achieved.

Stern's 12 Recommendations 5 – 6: Impact
Recommendation 5. Institutions should be given more flexibility to showcase their interdisciplinary and collaborative impacts by submitting 'institutional' level impact case studies, as part of a new institutional level assessment.
Change: Institutional level case studies are proposed for the first time. Interdisciplinary review panels will need to be formed, or other sufficiently robust methodologies developed, to cover the different research areas.
Recommendation 6. Impact must be based on research of demonstrable quality. However, case studies could be linked to a research activity and a body of work as well as to a broad range of research outputs.
Change: In REF 2014, case studies needed to be based on research outputs of 2* quality or higher. The proposal lacks detail but is designed to capture more of universities' impacts. What is meant by a broad range of outputs?

Stern's 12 Recommendations 7: Impact
Recommendation 7. Guidance on the REF should make it clear that impact case studies should not be narrowly interpreted and need not solely focus on socio-economic impacts, but should also include impact on government policy, on public engagement and understanding, on cultural life, on academic impacts outside the field, and impacts on teaching.
Change: Impact rules and guidelines are likely to cover in more detail impacts on public engagement and teaching in particular, which were treated with caution by institutions in REF 2014.

Stern's 12 Recommendations 8 – 9: Environment
Recommendation 8.
A new, institutional level Environment assessment should include an account of the institution's future research environment strategy and a statement of how it supports high-quality research and research-related activities, including its support for interdisciplinary and cross-institutional initiatives and impact. It should form part of the institutional assessment and should be assessed by a specialist, cross-disciplinary panel.
Change: Similar to REF 2014 but with cross-disciplinary input.
Recommendation 9. Individual UoA environment statements should be condensed, made complementary to the institutional level environment statement, and include those key metrics on research intensity specific to the UoA.
Change: Environment statements are likely to be shorter.

Stern's 12 Recommendations 10 – 11: Wider Context
Recommendation 10. Where possible, REF data and metrics should be open, standardised and combinable with other research funders' data collection processes, in order to streamline data collection requirements and reduce the cost of compiling and submitting information.
Change: Hopefully, this will reduce the data burden, but it is difficult to see how this will be achieved.
Recommendation 11. Government and UKRI could make more strategic use of REF to better understand the health of the UK research base, our research resource and areas of high potential for future development, and to build the case for strong investment in research in the UK.
Change: Previous exercises have not been widely used or explored other than for funding allocation purposes or league tables.

Stern's 12 Recommendations 12: Wider Context
Recommendation 12. Government should ensure that there is no increased administrative burden on Higher Education Institutions from interactions between the TEF and REF, and that together they strengthen the vital relationship between teaching and research in HEIs.
Change: Wishful thinking, perhaps?
Changes to RAPG and REF preparations
• Staff selection will no longer be required by institutions, but there will still be a need to ensure we optimise the quality of outputs and staff volume.
• RAPG was conceived to monitor individual performance and optimise performance in the RAE/REF. How should this process change?
• Are development, reward and retention policies and strategies optimal post-Stern?
• RAPG was very focused on impact in early 2016, which revealed a wide range of stages of development between departments. The detail of impact case studies is not necessarily a strategic focus; is a RAPG meeting the most appropriate forum to review case studies? Should case studies be developed more openly, with a wider range of input, review and scrutiny?
• Given that outputs are no longer likely to be portable, should RAPG be much more focused on the future recruitment aspect of the department's research strategy?

Changes to RAPG and REF preparations
• Internal (and in some cases external) review of outputs prior to REF 2014 submission generated estimated star ratings that were somewhat higher than the actual star ratings awarded in REF 2014. How can we be more confident in the quality of our pre-submission review(s)?
• Can the institution do more to foster interdisciplinary research, and how should we manage the development of University-wide interdisciplinary case studies?
• We need to consider carefully how we can support early career researchers.
• The aim of Stern has been to design out special staff circumstances.