ICYP (Improving Choices for Young People) Group feedback on the Key Stage 4 (KS4) and Key Stage 5 (KS5) Destination Measures

1. Introduction

1.1 The ICYP Group is part of the formal governance structure of Young People's Education and Skills (YPES) at London Councils. The remit of the ICYP Group includes a responsibility to share expertise relating to the progression of young people through their education, especially from a local authority perspective. The YPES team consequently discussed the Department for Education's (DfE) presentation on their KS4 and KS5 destination measure work at the ICYP Group meeting on 10 June 2010.

1.2 Below is a summary of the responses received from ICYP Group members. They are intended to offer practitioner-level context and insight into this work, and have been organised around the following themes:
• Added value offered by destination measures
• Potential areas of difficulty and risks
• Suggestions to enhance the measures

1.3 For further elaboration or details on any of the points covered in this paper, contact: [email protected].

2. Added value offered by destination measures

2.1 The ICYP Group welcomed the announcement in the White Paper, The Importance of Teaching, that destination measures were being introduced, and the opportunity to view the emerging details of the work being undertaken to deliver these measures.

2.2 Young people's progress in learning and training has been a notoriously difficult area to measure, yet it is a vital indicator of the effectiveness of their educational experience. 14-19 and London: an evidence base (jointly produced by the Young People's Learning Agency and YPES) highlighted that high dropout at 17 pointed to a potential issue concerning the progression of young people through education. There was, however, a lack of further hard evidence to allow the issue to be unpicked.

2.3 The creation of KS4 and KS5 destination measures is therefore welcomed for several reasons. Not only does it create greater accountability and a stronger incentive for institutions to steer their students effectively on to positive post-16 destinations, it also has the potential to enhance understanding in an important area of performance where evidence has historically been absent.

3. Potential areas of difficulty and risks

3.1 The example of what a KS5 measure might look like (slide 6) makes a specific reference to the percentage of students with an Oxford or Cambridge destination. One member of the ICYP Group expressed the concern that creating a specific category for Oxbridge students could be divisive and problematic:
• It has the potential to skew the real value of courses. Courses offered by other universities are often ranked higher than the equivalent Oxbridge courses (engineering at Imperial, for example).
• Certain courses may be unavailable at Oxbridge, such as specialist art or drama courses or the wide range of vocational courses including nursing.
• Specific reference to Oxbridge destinations could reinforce the intense focus many schools place on Oxbridge as a preferable destination; this has the potential to hinder informative and impartial advice on the full range of options available to young people.

3.2 It was also felt that the KS5 approach was focussed too narrowly on higher education (HE), and did not represent the destinations of the majority of 18 year olds who do not progress to HE.
3.3 For both the KS4 and KS5 measure examples (slides 5 and 6), it was queried at what point the percentage of students progressing to a positive destination within one year would be measured, as this only records the destination 'entered'. Concern was expressed over whether the measure would therefore capture whether a student had left or transferred during the year. There is a risk that the measure would not show how young people fared in year 12, i.e. whether they had completed a one-year course or had dropped out.

3.4 It was also queried whether actual or intended destinations would be counted as part of the measure. It was suggested that, for the measure to be robust, actual destinations would need to be used.

3.5 One local authority felt that the wrong measure was being considered. The measure proposed in the presentation assumes a straight three A level route is taken; however, in a significant number of post-16 providers for years 12 and 13, many students do not follow this pattern. The example of the KS5 measure only looks at what KS5 leavers in provision do next, which does not necessarily give meaningful data about how well learners have been supported. Colleges supporting learners who complete year 11 with grades significantly below 5 GCSEs at A*-C are likely to have a much 'poorer' profile than providers offering the straight three A level route. There appears to be no recognition of the different learning requirements of young people with special educational needs and/or disabilities and the positive destinations that can come from this learning, e.g. independent living.

3.6 Some issues regarding the integrity of data were highlighted by the Group:
• Schools' data can be high risk, as common standards are not always applied to the collection of data by all schools.
• If post-16 enrolment data and Apprenticeship starts are to be gathered from the YPLA and NAS, there is a risk that delays could occur, as the data requires ministerial validation and it can be difficult to match NAS data.
• There was a feeling that there needs to be tighter control on who counts what and on collecting 'actual' data.

4. Suggestions to enhance the measures

4.1 One proposal was to include the number of students who gained top UCAS points (for example, over 360 points), which would demonstrate that students had been equipped with the qualifications to apply for 'top' universities, whilst also listing the number of students who successfully completed the first year of study. This would also overcome the risk of showing a preference towards certain universities such as Oxford and Cambridge.

4.2 Another recommendation put forward was that the base cohort used in the measure should be those who left year 11 two years before, and not those in a particular provision at the end of year 13. It was suggested that the measure should demonstrate whether a provider's leavers have developed since year 11 by the destination they arrive at, i.e. illustrate the distance travelled in qualifications (which will therefore be indicative of the opportunities open to that learner).
Below is an example of what this measure could look like in practice. After two years of post-16 education, employment or training (EET):
o number of students who have attained level 3 and are going to HE or taking a gap year
o number of students who attained additional qualifications beyond their year 11 profile and intend to stay in further education
o number of students who attained additional qualifications beyond their year 11 profile and moved into employment

4.3 Several members of the Group highlighted the added value that could be offered by using National Client Caseload Management Information System (NCCIS) data to support the development of the measure:
• NCCIS provides the data from the annual Activity (destinations) Survey, which has built-in data controls to ensure integrity via the DfE's upload system.
• NCCIS data is timely (data is sent to the DfE in November, with final sign-off in January/February).
• The DfE manages cross-border matching by residency.
• NCCIS is based on individual learners, so it captures all EET destinations, including work if known.
• NCCIS can track back to show the destinations of school leavers by two or three years, with an accuracy of about 85%.
• NCCIS data should make it possible to track learners across providers.

4.4 It was also suggested that attainment after three years of post-16 education would provide a valuable perspective on learner progression.

4.5 One borough has been using detailed individual school reports highlighting destinations to encourage schools to consider improvements in teaching, learning, support etc. where necessary. The reports use CCIS data to provide summary and individual-level details of pupils currently aged 16-22 who became NEET after finishing their statutory schooling at a particular school. Examples of the reports can be found in Appendix 1 and Appendix 2.

Appendix 1: NEET Churn summary (based on leavers aged 16-22 as at 5 April 2011)
Appendix 2: NEET Churn detailed report (based on leavers aged 16-22 as at 5 April 2011)