

Research Report DFE-RR236b

The evaluation of the raising the participation age locally-led delivery projects (RPA) 2011 to 2012: survey findings Simon Day, Leigh Sandals, Kelly Kettlewell, Claire Easton & Ben Durbin ISOS Partnership & National Foundation for Educational Research


The views expressed in this report are the authors’ and do not necessarily reflect those of the Department for Education.


Published in September 2012 by Isos and the National Foundation for Educational Research, www.isospartnership.com and www.nfer.ac.uk © Isos and National Foundation for Educational Research 2012 How to cite this publication: Day, S., Sandals L., Kettlewell, K., Easton, C., and Durbin, B. (2012). The evaluation of the Raising the Participation Age (RPA) Locally-Led Delivery Projects 2011 to 2012: survey findings (DfE Final Report). Isos/NFER


Contents

Executive summary

Summary of key findings

1. Introduction

1.1 Background to RPA

1.2 Focus of the LLDPs

1.3 Evaluation methodology

1.4 Final reports and updated tools

2. Findings from the surveys

2.1 Overall confidence levels

2.2 Levels of ambition

2.3 Priority cohort groups

2.4 Achieving their targets

2.5 Scale and range of activity

2.6 Local areas’ expenditure

2.7 Activity costs

2.8 Success measures

3. Measuring value for money (VfM)

3.1 Approach to VfM

3.2 VfM conclusions

4. Conclusions and recommendations


Acknowledgements

We are grateful to the Department for Education (DfE) officials and all of the project areas for their support and involvement in this evaluation. In particular, we are grateful to the project leads for their engagement with the research.


Executive summary

Background

1. The Education and Skills Act (2008) increased the minimum age at which young people in England can leave learning. From 2013, young people will be required to continue in education or training until the end of the academic year in which they turn 17 and from 2015 they will be required to continue until their 18th birthday.

2. Raising the Participation Age (RPA) does not mean young people must stay in school; they will be able to choose one of the following options:

• full-time education, such as school, college or home education;
• an Apprenticeship; or
• full-time work with part-time education or training.

3. Work has already been undertaken to prepare for RPA at a local level. The previous phases of local work – the RPA trials – have already shown some of the steps that areas can take to prepare for full participation. In Phase 1 (September 2009 – March 2010) 10 local authorities (LAs) and one sub-regional group (SRG) focused on one of three specific themes: Information Advice and Guidance; Re-engagement of 16- and 17-year-olds; and the development of area-wide local solutions. In Phase 2 (April 2010 – March 2011) four new LAs and one SRG joined the programme. Areas were asked to maintain an in-depth focus on specific trial models in order to establish best practice on implementation of RPA.

4. This phase of RPA Locally-Led Delivery Projects (LLDPs) (April 2011 – March 2012) is different from the previous trials. It has focused on local determination of the challenges to be addressed and the actions local areas1 could take to develop their approaches to increasing the numbers of young people continuing in education or training in the run-up to 2013 and 2015. Areas were asked to identify their priorities and to develop and test their own approaches to address these, rather than focusing on a specific theme. Nineteen individual LAs and three SRGs (comprising 16 individual LAs) have participated in the LLDPs.

Methodology

5. DfE commissioned Isos Partnership (Isos) and the National Foundation for Educational Research (NFER) to undertake an evaluation of the LLDPs in August 2011. The evaluation team used a mixed-method approach to explore the impact of the LLDPs, including:

• a baseline survey (carried out during September and October 2011);

• case study visits (carried out from December 2011 to March 2012); and

• a follow-up survey (carried out during March and April 2012).

1 The term ‘area/s’ is used to refer to the LAs and/or SRGs collectively.


6. The baseline and follow-up surveys aimed to measure local areas’ aspirations, levels of confidence, costs, activities and measures of success. This report presents the survey findings. Evidence collected from 18 case study visits completed between December 2011 and March 2012 can be found in the separate report of case study findings published alongside this report.

7. The report is structured around the six RPA priorities identified in earlier evaluations of the RPA trials. These are:

• Priority one: Understanding the Cohort
• Priority two: Determining Local Priorities
• Priority three: Managing Transitions and Tracking
• Priority four: Establishing Support Mechanisms
• Priority five: Identifying and Meeting Provision Needs
• Priority six: Communicating the RPA Message.

Summary of key findings

Overall progress, levels of confidence and ambition

• There has been a greater range and scale of activity seen in the LLDPs than in the previous phases of the RPA trials.

• At the point of the baseline survey, in October 2011, local areas reported feeling confident about how prepared they were for achieving RPA. Across the six priorities, between 18 and 23 areas2 reported feeling ‘quite’ or ‘very’ confident.

• This confidence appears to have increased between the baseline and final surveys. The final survey showed that, in April 2012, between 15 and 24 areas3 for each of the six priorities reported feeling ‘more confident’ than they did in October 2011.

• Although areas felt confident, some had set ambitious targets for 2014/15. Some needed to raise participation rates by up to 19 percentage points for 17-year-olds and up to 10 percentage points for 16-year-olds to achieve their targets.

• Twenty-eight out of 35 areas had identified their priority groups in the follow-up survey. These groups included some of the most vulnerable young people, including learners with learning difficulties and/or disabilities (LLDD), looked-after children (LAC) or care leavers and teenage parents.

2 All but one area responded to this question in the baseline survey. The data reported is based on the collective response from one SRG and individual responses from 24 LAs, giving a total of 25 responses. 3 A total of 26 areas responded to this question, at least in part, in the final survey. One SRG responded on behalf of their LAs; individual LA responses were received from all other LAs.


Overview of activities and costs

• Overall, local areas were undertaking the greatest number of activities4 in priorities one and two5: understanding the cohort and determining local priorities (61 activities out of a total of 225 activities across all six priorities). Areas were undertaking the lowest number of activities in priority three: ‘managing transitions and tracking’ (30 activities).

• By priority, the highest median expenditure was on priorities one and two: ‘understanding the cohort and determining local priorities’ (£42,692), and the lowest was on priority three: ‘managing transitions and tracking’ (£15,228). Areas spent the most on individual activities categorised as ‘improving access’ under priority five, with a median of £50,624.

• In contrast, areas spent the smallest amount of money on individual activities categorised as ‘understanding local needs’ (priority four), with a median of just £2,500, and ‘support for vulnerable groups’ (priority three), with a median of £6,125.

• In order to deliver these activities, typically local areas spent just over half of grant funding on LA staffing. The majority of the remainder was spent on school/college staffing and commissioning. Most used existing budgets to supplement the grant.

Measuring impact and value for money

• Areas found it difficult to measure impact within the evaluation timescales (October 2011 to April 2012). Providing costs on a clear and consistent basis was also a challenge.

• Firm conclusions about impact and value for money have been difficult to draw. The evaluation team has provided its interpretation of the data based on an assessment of the relevance, economy, efficiency, effectiveness and sustainability of RPA activity. In lieu of detailed evidence of impact, the evaluation team used the number of completed activities as a measure of success.

• Overall, areas were carrying out 225 activities to increase participation rates. Twenty-three areas had completed at least one activity; completed activities (150) accounted for two-thirds of all activities across the six priorities.

4 The term ‘activities’ relates to an overarching aim or collection of tasks local areas undertook to enable them to increase participation. Areas typically undertook several activities in each priority. 5 For the purpose of the research, priority one and two activities are reported together because of the close link between the use of data to understand the cohort and then determine local priorities.


Priorities one and two: Understanding the Cohort and Determining Local Priorities

• Within these priorities, the most commonly reported activities in the final survey related to identifying young people at risk, with 15 areas carrying out 25 activities.

• Local areas’ understanding of the cohort was much stronger than previously seen in the RPA trials. Evidence from the surveys shows that 21 out of 25 areas had now set goals for participation in 2013 and 2015.

• The surveys showed that half of the areas were undertaking activities to develop or amend their RPA plan or priorities.

Priority three: Managing Transitions and Tracking

• Survey responses showed that nine areas were undertaking 10 activities related to ‘managing transitions’, while considerably more activity was taking place to improve tracking: 14 areas were undertaking 17 tracking activities.

Priority four: Establishing Support Mechanisms

• The survey data showed that a small number of local areas were supporting vulnerable groups of young people (with nine areas undertaking 14 activities). Activities included the creation of support mechanisms and packages for vulnerable groups, and developing a particular role to help support vulnerable young people.

Priority five: Identifying and Meeting Provision Needs

• The survey responses showed that 10 areas were undertaking 23 activities to develop provision to meet local needs, including developing new provision or particular types of provision, e.g. Apprenticeships or other work-related learning routes.

Priority six: Communicating the RPA Message

• Most activity within this priority related to the ongoing dissemination of information and awareness raising around RPA (with 22 areas undertaking 43 activities).


1. Introduction

1. The DfE commissioned Isos Partnership and NFER to carry out an evaluation of the Raising the Participation Age (RPA) Locally-Led Delivery Projects (LLDPs) during 2011/12. Building on two previous evaluations of the RPA trials, the overall aim of the third evaluation was to explore the implementation of local RPA projects, to measure their impact and assess projects’ value for money. The research objectives were to develop a clear understanding of local areas’:

• processes to support RPA and to identify what works well;

• baseline position and their success measures for assessing change; and

• issues, barriers and solutions to achieving RPA.

2. This report summarises findings from the final survey carried out in April 2012, which measured local areas’ ambition, costs and success measures against their baseline survey response from October 2011. The evidence from 18 case study visits completed between December 2011 and March 2012 is presented in a separate report.

1.1 Background to RPA

3. The Education and Skills Act (2008) increased the minimum age at which young people in England can leave learning. From 2013, young people will be required to continue in education or training until the end of the academic year in which they turn 17 and from 2015 they will be required to continue until their 18th birthday.

4. RPA does not mean young people must stay in school; they will be able to choose one of the following options:

• full-time education, such as school, college or home education;
• an Apprenticeship; or
• full-time work with part-time education or training.

5. Under the Education and Skills Act (2008), local authorities (LAs) will be required to promote the effective participation in education or training of the young people in their area and make arrangements to identify young people not participating. Learning providers will be required to promote good attendance of 16- and 17-year-olds and inform local authority support services if a young person has dropped out of learning, so that the young person can be contacted swiftly and offered support.

6. Additionally, the Education and Skills Act (2008) places duties on employers who are employing young people full time6, where they are not providing accredited training themselves. These duties include checking the young person’s evidence that they are enrolled in part-time accredited learning, for the equivalent of a day a week, before they start work, and agreeing reasonable hours of work so that the young person can access training elsewhere, for the equivalent of a day a week. On 2 July 2012 the Government announced that it would not be commencing these duties on employers in 2013.

6 ‘Full time’ means for 20 hours or more per week, and for eight or more weeks in a row.

7. There has already been a range of work to help local areas prepare for RPA. The previous phases of local work – the RPA trials – have already shown some of the steps that areas can take to prepare for full participation. In Phase 1 (September 2009 – March 2010), areas focused on one of three specific themes: Information Advice and Guidance (IAG); re-engagement of 16- and 17-year-olds; and the development of area-wide local solutions. In Phase 2 (April 2010 – March 2011), four new LAs and another sub-region joined the programme, and existing areas were asked to maintain a more in-depth focus on their specific trial models in order to establish best practice on implementation of RPA.

1.2 Focus of the LLDPs

8. This phase of RPA LLDPs is different from the previous trials. It focused on local determination of the challenges to be addressed and the actions local areas could take to develop their approaches to increasing participation in the run-up to 2013 and 2015. Areas were asked to identify their priorities and to develop and test specific approaches to address these, rather than focusing on a prescribed theme. A full list of the LAs leading projects is shown in Figure 1.1 below, including whether they had previously participated in RPA trials.

Figure 1.1 Local Authorities Leading RPA LLDPs

9. The other major difference between the LLDPs and the previous RPA trials is that the DfE identified six areas to act as ‘local leaders’. These areas have provided support and challenge to other local areas and led on disseminating learning (both regionally and nationally). There has been no National Participation Adviser involved with the LLDPs. The ‘local leader’ areas are highlighted in bold in Figure 1.1 above.


1.3 Evaluation methodology

10. The evaluation team adopted a mixed-method, three-staged approach, which included:

• a baseline survey (September and October 2011);

• a follow-up survey (March and April 2012); and

• case study visits (December 2011 to March 2012).

1.3.1 Baseline survey

11. The aim of the baseline survey was to collect information from local areas on their local context and ambition; their activities and anticipated expenditure to increase participation; and how local areas planned to measure success and assess impact. The evaluation team piloted the survey data collection form with two LAs and one SRG and it was given Star Chamber approval during September 2011. Later the same month, the survey was despatched to 35 areas (comprising three SRGs (16 LAs) and 19 LAs). Respondents had four weeks in which to complete their data. During this time, the evaluation team sent two reminder emails and offered telephone support to respondents to discuss their survey returns and offer help in setting success measures. A 100 per cent response rate was achieved. The baseline survey data presented in this report is based on responses from three SRGs and 19 LAs.7

1.3.2 Follow-up survey

12. A follow-up survey was undertaken in March and April 2012. Building on the baseline survey, the follow-up survey explored: whether local areas’ ambition and trajectories had changed; areas’ actual expenditure (to inform which activities offered value for money); whether planned activities had been delivered; and whether the activities had been successful in raising participation rates. The evaluation team piloted the follow-up survey with two LAs before despatch in March 2012. Areas were given five weeks in which to respond, during which time the evaluation team offered support via email and telephone. Each survey was pre-populated using the baseline survey data as appropriate. A 100 per cent response rate was achieved.

1.3.3 Case studies

13. Isos completed 18 case study visits to local areas between December 2011 and March 2012. Through the case studies, the evaluation team explored local areas’ ambition and trajectories; explored activities and associated costs; and looked at the impact of the areas’ work. Case study areas represented a mix and spread of areas including those in urban and rural locations, small and large authorities, and one SRG. A separate report presents the case study findings. This report presents the survey findings only.

7 Throughout the report the term ‘area/s’ is used to refer to the survey returns of LAs and SRGs.


1.4 Final reports and updated tools

14. At the end of the Phase 2 RPA trials the evaluation team developed the framework (see Figure 1.2 below) to draw together learning from the trials. This was intended to help other areas think about their preparations for RPA. Feedback from the LLDPs confirmed that areas have found this a useful framework for thinking about RPA planning. Some have used it as a guide for their own RPA plans, whilst others have used the content but have developed their own headings for themes of work. The evaluation has used the six priorities to allow areas to report on their activities in a consistent way. This final report is therefore structured around these headings.

Figure 1.2 Framework for Planning for Raising Participation:

15. This report presents evidence from the areas’ survey returns (both baseline and final surveys) about their ambition, activities, costs and success measures. Evidence gathered through case study visits about the activities being undertaken to support the different priorities is presented in a separate report.

16. At the end of Phase 2 of the RPA trials the evaluation team also produced a number of tools based around the RPA planning framework (see Figure 1.2 above). These tools have been updated on the basis of evidence from the LLDPs and can be found at the link below:

www.education.gov.uk/childrenandyoungpeople/youngpeople/participation/rpa/a0075564/rpa-past-projects


2. Findings from the surveys

This section presents the survey findings. The survey methodology is outlined in paragraphs 11 and 12 above.

2.1 Overall confidence levels

17. The evaluation team asked survey respondents to indicate how prepared they felt their area was in relation to the six RPA priorities8 identified during previous evaluations. At the point of the baseline survey, overall, local areas appeared confident about how prepared they were for achieving RPA, as shown in Figure 2.1 below. At the start of the evaluation, areas indicated that they felt slightly better prepared for ‘understanding the cohort’ and ‘determining priorities’, but were less confident about being prepared for ‘establishing support mechanisms’.

Figure 2.1 How prepared areas felt by priority in baseline and follow-up surveys

NB: N = 25 areas for baseline survey data based on one SRG and 24 LAs. One area did not respond to the baseline survey. N = 26 for follow-up survey data based on 25 LAs and one SRG.

8 These priorities are: Understanding the Cohort; Determining Local Priorities; Managing Transitions and Tracking; Establishing Support Mechanisms; Identifying and Meeting Provision Needs; Communicating the RPA Message.


18. In the follow-up survey, the evaluation team asked local areas to what extent their confidence in achieving RPA had changed since October 2011. At the time of the survey, overall, local areas reported that they were more confident about achieving RPA than they were in October 2011. They reported being particularly confident about ‘communicating the message’ and ‘understanding the cohort’. Those areas that were less confident about ‘managing transitions and tracking’ were concerned that local decisions about the end of Connexions services would mean less resource available to support tracking. Figure 2.1 above presents the data.

2.2 Levels of ambition

19. At the point of the baseline survey, 16 areas provided participation projection targets for 16- and 17-year-olds until 2014/15, while 10 did not. Compared to Phase 1 and 2 areas, a greater proportion of Phase 3 areas had set targets at the point of the baseline survey, because many were setting targets as part of their activity within their RPA projects.

Figure 2.2 Participation projections for 16-year-olds for 2014/15 9

* based on 2010/11 data as no 2011/12 data provided.

20. For the follow-up survey, the evaluation team asked all local areas to provide their latest participation projection targets. Figure 2.2 above shows that, compared to the baseline survey, a further four areas had set targets for participation for 16-year-olds, while five areas still had not set projections. Of the areas that had set targets for 2014/15, all but one set a target of equal to or greater than 98 per cent for 16-year-olds.

9 Local areas were given an option for their named participation projection data to be published. The term ‘Unnamed LA’ refers to the local areas that chose not to present named data.


21. Figure 2.3 below shows the level of ambition for 17-year-old participation rates. Overall, the level of ambition for 17-year-old participation was lower than that for 16-year-olds. In part this reflects lower starting points of participation for 17-year-olds in many areas.

Figure 2.3 Participation projections for 17-year-olds for 2014/15

* based on 2010/11 data as no 2011/12 data provided.

22. At the time of the baseline survey, 10 local areas had set targets for 17-year-olds of equal to or greater than 98 per cent. Two of these had a 100 per cent target. However, at the point of the follow-up survey, eight areas had targets of equal to or greater than 98 per cent and no area set a target of 100 per cent participation.

23. Where local areas amended their data from the baseline survey, the trend was to lower their overall ambition. Reasons given for this change included a greater understanding of why some young people may not participate at any one time, and further analysis or interrogation of areas’ own data. The five areas that had not yet set projections were asked why and gave the following reasons:

• targets not yet signed off locally (two areas);

• data analysis incomplete (one area);

• expected non-participants undefined (one area);

• not enough evidence to set meaningful targets (one area).

24. These local areas indicated that they plan to set targets using similar approaches to the areas that had already set targets; consultation with partners/services and data analysis of client caseload information systems (CCIS) were the most commonly cited methods.


2.3 Priority cohort groups

Figure 2.4 Local areas’ priority cohort groups

25. Local areas were asked to specify their priority groups of young people within their NEET (not in education, employment or training) cohort. In the follow-up survey, 28 areas specified their priority groups, while seven did not. Where areas had set projections for participation for their priority groups, they were asked for their figures. Twenty local areas had set projection data for at least some of their target groups, with 16 areas providing projection data for all their target groups. Figure 2.4 summarises which target groups areas had focused their activities on.

26. A further 10 areas targeted other priority groups, including, for example: young people at risk of homelessness/homeless; young people entitled to free school meals (FSM); young carers; new arrivals; asylum seekers; and young people from traveller communities.

27. For the areas that provided projected participation figures for their target groups, the evaluation team calculated the average increase in participation between 2011/12 and 2014/15 for the different target groups10. The data shows that local areas expected the largest percentage-point increase in participation by looked-after children or care leavers by 2014/15, compared with the other target groups, as shown below:

• looked-after children/care leavers (average of 14 percentage point increase in participation);

• teenage parents/pregnant teenagers (average of 12 percentage point increase);

• young people working with youth offending service/teams (average of 10 percentage point increase);

• learners with learning difficulties/disabilities (average of eight percentage point increase);

• 17-year-olds (average of seven percentage point increase).

10 Average increase in participation rates was calculated for those target groups where more than five local areas had provided projected participation figures.
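The averaging described in footnote 10 can be sketched as follows. This is an illustrative sketch only: the group names and projection figures are invented, not the survey returns.

```python
# Hypothetical sketch of the footnote-10 calculation: for each target group,
# average the projected rise in participation (2014/15 rate minus 2011/12
# rate), but only where more than five areas supplied projections.
# All figures below are invented for illustration.
projections = {
    # target group: [(2011/12 rate, 2014/15 rate) for each responding area]
    "looked-after children": [(70, 85), (68, 80), (75, 90),
                              (72, 85), (66, 80), (71, 84)],
    "teenage parents": [(60, 70), (58, 66), (62, 74)],  # only three areas: excluded
}

averages = {}
for group, rates in projections.items():
    if len(rates) > 5:  # footnote 10: more than five areas required
        rises = [after - before for before, after in rates]
        averages[group] = sum(rises) / len(rises)

print(averages)
```

With these invented figures, only the first group clears the more-than-five-areas threshold, so it is the only one that receives an average.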

2.4 Achieving their targets

28. Local areas were asked whether they considered that achieving their participation projections for their target groups would enable them to reach their overall participation target. Of the 17 local areas that provided an answer, the majority (15) answered ‘yes’, while two answered ‘no’. These two areas explained that their target groups did not account for 100 per cent of the young people NEET in their local area and, as such, achieving RPA needed a wider focus than just the target groups to ensure they met their overall participation projections.

2.5 Scale and range of activity

29. The nature of the LLDPs meant that local areas were required to select the activities that they thought best fitted with local priorities, rather than being required to focus on one particular area (as they had been in the previous RPA trials). One consequence has been a wider variety and scale of activity, as shown in Table 2.1 below.


Table 2.1 Summary of activities by local area and priority

Activity                                        Number of areas11   Number of activities

Priorities one and two: Understanding the Cohort and Determining Local Priorities
  Identifying young people at risk                     15                  25
  Understanding local needs                            12                  16
  Develop/amend RPA plan/local priorities              11                  12
  Project set-up/governance/evaluation                  7                   8
  Priority total                                       22                  61

Priority three: Managing Transitions and Tracking
  Improve tracking                                     14                  17
  Manage transitions                                    9                  10
  Support for vulnerable groups                         3                   3
  Priority total                                       19                  30

Priority four: Establishing Support Mechanisms
  Manage transitions                                   16                  26
  Support for vulnerable groups                         9                  14
  Engage schools/partners                               7                   7
  Improve access                                        3                   3
  Understanding local needs                             1                   1
  Priority total                                       21                  51

Priority five: Identifying and Meeting Provision Needs
  Develop provision to meet local needs                10                  23
  Understanding local needs                             7                   7
  Provision for vulnerable groups                       5                   7
  Priority total                                       14                  37

Priority six: Communicating the RPA Message
  Dissemination and awareness raising                  22                  43
  Develop resources                                     3                   3
  Priority total                                       22                  46

OVERALL TOTAL (ALL PRIORITIES)                         23                 225

11 Analysis based on responses from 24 areas.


2.6 Local areas’ expenditure

30. The median12 overall cost of RPA activities across the local areas was £140,375. This value includes both RPA grant and local area funds. However, the variation by local area was considerable: the lowest amount a local area spent was £50,000, while the highest was £2,595,674. Twenty-one local areas provided resource costs in sufficient detail to allow analysis of their grant spend. The average grant spend per local area was £72,146. The types of resources on which local areas spent the grant are illustrated in Figure 2.5. Local areas spent an average of £37,950 of their grant funding on staff costs; on average, areas spent more of their grant on this than on all other resources combined, accounting for 53 per cent of the total grant spent. The next biggest expenditure was on school and college staffing (£9355). Smaller amounts of the grant were spent on other resources, ranging from an average of £7275 on other commissioned activities to £970 on Education Business Partnerships (EBP).

Figure 2.5 Local areas’ grant expenditure

31. Eighteen of the 21 areas that provided details of how they spent their RPA grant funding also provided details of expenditure on RPA from other funding streams. In addition to the RPA grant funding, the median value of other expenditure by local areas was £66,000. As with grant funding, staff costs accounted for the highest proportion of this other expenditure.

12 A median is a type of average that takes the ‘middle’ value of a sample: it is calculated by taking the value halfway down an ordered list, so that an equal number of responses lie above and below it. It has the advantage over a standard mean that it is not skewed by one or two very high or very low outliers.
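The robustness of the median described in footnote 12 can be shown with a short, purely hypothetical calculation (the figures below are invented for illustration and are not the survey data): when a list contains one very large outlier, the median stays at the middle value while the mean is pulled sharply upwards.

```python
from statistics import median, mean

# Hypothetical per-area spend figures (illustration only, not survey data).
# One area is a large outlier, mirroring the report's £2,595,674 maximum.
area_spend = [50_000, 98_000, 140_375, 310_000, 2_595_674]

# The median is the middle value of the ordered list: two areas lie
# below it and two above, so the outlier cannot skew it.
print(median(area_spend))  # → 140375

# The mean, by contrast, is dragged far above four of the five values.
print(mean(area_spend))
```

This is why the report quotes medians throughout: a single area spending twenty times the typical amount would otherwise dominate any average.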


2.7 Activity costs

32. Twenty-two of the areas provided activity cost data suitable for analysis, although not all of them undertook activities in every priority. Local areas’ cost data included both their own resource input and RPA grant funds. Looking at the total spend on each priority (see Table 2.2), expenditure was greatest for activities relating to ‘understanding the cohort and determining local priorities’ (a median across all areas of £42,692), ‘identifying and meeting provision needs’ (a median of £34,854) and ‘establishing support mechanisms’ (£34,143). Areas spent around half as much on each of the remaining two priorities, with a median spend of £15,228 on ‘managing transitions and tracking’ and £16,478 on ‘communicating the RPA message’. Eight areas reported costs that could not be attributed to any particular activity, with a median of £2008 across those areas. However, the range of expenditure by different areas was large, with some areas investing a large amount of their own resources in RPA activities in addition to the grant. This is illustrated in the table below, which includes spending from both the RPA grant and areas’ own resource investment.

Table 2.2 Total costs by priority

  Total spending by priority                                           Minimum cost   Median cost   Maximum cost

  Priorities one and two: Understanding the Cohort
  and Determining Local Priorities                                     £2652          £42,692       £1,035,000
  Priority three: Managing Transitions and Tracking                    £2000          £15,228       £61,200
  Priority four: Establishing Support Mechanisms                       £680           £34,143       £2,223,430
  Priority five: Identifying and Meeting Provision Needs               £4016          £34,854       £225,000
  Priority six: Communicating the RPA Message                          £810           £16,478       £66,897
  Other costs                                                          £75            £2008         £20,030

NB: The expenditure reported above includes areas’ own input and RPA grant funds.

33. Areas reported their expected expenditure in the baseline survey. As Figure 2.6 below illustrates, actual expenditure was very similar to anticipated costs for priorities one and two, slightly higher for priorities three, four and six, and substantially lower for priority five. Looking at the median costs of all the activities within each priority (see Table 2.3), the highest median activity cost fell within priority five, identifying and meeting provision needs (£24,950). While areas reported spending the greatest amount overall on priorities one and two, the median cost of the individual activities within these priorities was more moderate (£15,008), and similar to the median activity cost in the other priorities. This reflects each area undertaking a larger number of small activities for priorities one, two, three, four and six, compared with a smaller number of larger activities for priority five.


Figure 2.6 Expected versus actual spend by priority

34. Although expenditure varied substantially across areas, the median spend on each individual type of activity was fairly consistent, with median spending for nine of the 18 activity types lying in the range £9000–£19,000. The three types of activity with the highest median cost were as follows:

1. improve access (which falls within priority four): £50,624;

2. develop provision to meet local needs (which falls within priority five): £42,284;

3. develop resources (which falls within priority six): £26,575.

35. Further detailed information on the costs of each activity within each priority is provided in Table 2.3, which shows that:

• Spending on each of the activities in priorities one and two was moderate, with the median for each type of activity ranging from £13,200 to £23,041.

• The median cost of activities in priority three was the lowest across all priorities, at £11,358.

• The majority of activities in priority four were moderately costed, with a median across all activities in this priority of £12,520. However, this priority also included the highest-costed activity across all priorities (£2,223,430 for an overarching approach to tracking and providing appropriate provision for young people with learning difficulties and disabilities). This highlights the broad scale of activities classified within this priority, in which the lowest-costed activity was estimated at just £581.

• The median actual cost of an activity in priority five was high (£24,950), with many of the activities relating to setting up new provision for young people and particular vulnerable groups.

• The average amount spent on each activity in priority six was moderate, with a median of £13,413.


Table 2.3 Total costs by activity (based on local area resource input and RPA grant funding)13

  Activity                                        Minimum cost   Median cost   Maximum cost

Priorities one and two: Understanding the Cohort and Determining Local Priorities
(based on data from 21 areas and 57 activities)
  Identifying young people at risk                £2463          £23,041       £282,264
  Understanding local needs                       £2123          £14,606       £153,670
  Develop/amend RPA plan/local priorities         £536           £15,784       £740,000
  Project set-up/governance/evaluation            £3425          £13,200       £121,452
  All activities                                  £536           £15,008       £740,000

Priority three: Managing Transitions and Tracking
(based on data from 16 areas and 26 activities)
  Improve tracking                                £750           £10,258       £34,020
  Manage transitions                              £3950          £25,315       £61,200
  Support for vulnerable groups                   £2250          £6125         £10,000
  All activities                                  £750           £11,358       £61,200

Priority four: Establishing Support Mechanisms
(based on data from 20 areas and 44 activities)
  Manage transitions                              £621           £10,439       £85,931
  Support for vulnerable groups                   £2900          £19,010       £2,223,430
  Engage schools/partners                         £769           £8940         £52,200
  Improve access                                  £581           £50,624       £100,667
  Understanding local needs                       £2500          £2500         £2500
  All activities                                  £581           £12,520       £2,223,430

Priority five: Identifying and Meeting Provision Needs
(based on data from 14 areas and 22 activities)
  Develop provision to meet local needs           £5300          £42,284       £72,102
  Understanding local needs                       £2167          £4016         £225,000
  Provision for vulnerable groups                 £5550          £16,461       £36,206
  All activities                                  £2167          £24,950       £225,000

Priority six: Communicating the RPA Message
(based on data from 21 areas and 23 activities)
  Dissemination and awareness raising             £810           £13,413       £66,897
  Develop resources                               £12,309        £26,575       £40,840
  All activities                                  £810           £13,413       £66,897

13 Note that the median cost presented in Table 2.3 is based only on areas undertaking each named activity. The figures are therefore not directly comparable with Table 2.2: Total costs by priority.
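The caveat in footnote 13 can be made concrete with a small, purely hypothetical sketch (the areas and figures below are invented, not survey data): an area running several small activities contributes one large total to a priority-level median but several small costs to an activity-level median, so the two medians differ even over identical underlying spending.

```python
from statistics import median

# Hypothetical data (not from the survey): activity costs under one
# priority, grouped by the invented area that ran them.
costs_by_area = {
    "Area A": [5_000, 8_000, 9_000],  # three small activities
    "Area B": [40_000],               # one large activity
}

# Activity-level median (the Table 2.3 style): pool every activity cost.
all_activities = [c for costs in costs_by_area.values() for c in costs]
print(median(all_activities))  # median of [5000, 8000, 9000, 40000] → 8500.0

# Priority-level median (the Table 2.2 style): one total per area.
area_totals = [sum(costs) for costs in costs_by_area.values()]
print(median(area_totals))  # median of [22000, 40000] → 31000.0
```

This is the same pattern the report describes for priorities one and two: many small activities per area push the activity-level median well below the per-area total.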


2.8 Success measures

36. Generally, areas found it difficult to measure the impact of their activity during the evaluation period available (six months). For some areas, especially those new to the LLDPs, this was often because the activity they were pursuing was putting in place some of the underlying systems and processes they thought they would need for RPA, but that would not necessarily have an immediate impact on young people. In other cases, it was because areas had not thought systematically enough about measuring impact at the start of the work and had not set up appropriate systems to collect feedback and evidence on the progress made by the young people targeted.

37. The final survey asked respondents to indicate whether or not they had completed their activities. For each activity, areas specified whether it was ‘complete’, ‘partially complete/still developing’ or ‘not complete’. The survey also asked areas to select a success measure for each activity from a given list of options, and to indicate what evidence they had collected to measure change. The results are discussed below.

2.8.1 Activities areas have completed14

38. Overall, areas were carrying out 225 activities to increase participation rates. Twenty-three areas had completed at least one activity, and completed activities (150) accounted for over two-thirds of all activities across the six priorities. Within:

• priorities one and two, just under three-quarters of activities were complete;

• priority three, half of the activities were complete;

• priority four, just over three-quarters of activities were complete;

• priority five, over half of the activities were complete;

• priority six, almost three-quarters of activities were complete.

39. Within priorities one and two, over two-fifths of activities (19) related to identifying cohorts or target groups. Areas evidenced success by having early identification tools in place and by the number of young people identified by them and supported into further learning. As one example of the impact on young people, one local area noted that it had identified 732 young people, of whom just over half (51 per cent) had been supported into further learning at the end of Year 11. Other local areas felt that identifying the cohort had not yet had a direct impact on young people, and that future impact would depend on how local areas used the information.

40. Overall for priority three, local areas evidenced success by tool(s) being in place. Fewer examples of evidence of impact on young people were provided for managing transitions than for some of the other priorities, generally because areas felt it was too early to report any impact. Where local areas were able to give examples, these related to young people receiving additional transition information in schools. For example, one area stated that 77 young people had received additional one-to-one sessions targeted to their needs to help their transition to post-16 education.

41. For priority four, local areas were better able to provide examples of how their work on establishing support provision had actually had an impact on outcomes for young people. Often this related to a small targeted group of young people to whom they gave support. For example, one local area explained that, as a result of its support activity, 44 young people from a cohort of at-risk young people were now entering further education, a further three were entering employment with training, 18 were planning to stay in sixth form, five were attending a training provider and the remaining five were undecided (information for one was not available). Another local area reported that four of 11 young parents NEET had now re-engaged in learning as a result of the support they received.

42. In relation to priority five, local areas measured success through improved provision, and were able to provide numbers of young people affected by the activities undertaken. The numbers of young people engaged in new or improved provision ranged from four to 2000. One area reported that its number of 16- to 18-year-olds starting Apprenticeships had increased from 359 to 404 over the course of a year. Similarly, another area had seen 60 additional young people become apprentices as a result of its promotional work.

43. Within priority six, local areas’ responses indicated that almost two-thirds of activities (20) had succeeded in increasing awareness of RPA. In some instances, local areas were able to give the number of young people affected. For example, in one local area 700 young people had attended events aimed at raising awareness, while another area reported that 15 young people had become Apprenticeship Ambassadors who, through their new role, had given approximately 250 young people an increased awareness of Apprenticeships.

14 Local areas found it difficult to measure the impact of their activity during the evaluation period. This is discussed in greater detail in paragraph 36.

2.8.2 Activities that areas have not completed

44. At the time of the follow-up survey, 19 local areas had at least one activity that was not complete; uncompleted activities accounted for just under a third (71) of all activities. There was little difference in the proportion of uncompleted activities across priorities. The main reasons for not completing activities were that areas wanted to develop them further, or to carry out further testing or evaluation work, before the activities could be classified as ‘complete’.

45. Local areas were asked to indicate how they planned to measure success in the future. Those with uncompleted activities planned to use similar sources of evidence to the local areas that had already seen change. It is likely, therefore, that once activities are complete and embedded, these local areas will also have measures of success with supporting evidence in line with the others. Some local areas also indicated that they planned to continue tracking the cohorts of young people supported in this phase of the work, to see what happened to them when they made the transition to post-16 education in September. This should provide them with evidence of the impact of their work.


3. Measuring value for money (VfM)

3.1 Approach to VfM

46. The wide range of projects and contexts for the LLDPs, coupled with the evaluation timeframe, meant that assessing impact and value for money (VfM) has been challenging for local areas and the evaluation team. The evaluation sought to explore VfM by looking at (1) the process and (2) the impact of local areas’ approaches to achieving RPA. On the first, the evaluation explored how successfully an activity or strategy was organised, resourced and delivered. On the second, it explored the extent to which an activity or strategy was successful in meeting its short-term objectives and longer-term aims. The team addressed these questions in relation to relevance, economy, efficiency, effectiveness and sustainability.

47. In order to answer these questions, the follow-up survey asked local areas to provide costed details of the resources they had used, including the amount of their RPA grant spent on each resource and any contributions made by partner organisations that were not paid for by the local area. Local areas found it difficult to provide relevant and consistent data, and to estimate both their anticipated RPA activity spend (collected during the baseline survey) and their actual spend (collected during the follow-up survey). The research team therefore provided a VfM tool, guidance and training to local areas during the RPA workshops delivered in spring 2012. These helped local areas provide greater depth and detail in their data, and enabled a more consistent approach between local areas. As a result, cost data from 22 areas could be included in the final analysis.

3.2 VfM conclusions

48. The previous sections of this report provide further detail around local areas’ expenditure by priority and activity. This section summarises the research team’s VfM assessment in relation to relevance, economy, efficiency, effectiveness and sustainability.


Relevance

As would be expected, the majority of overall expenditure, including grant and partner costs, was on staff time to deliver RPA activities and work towards achieving RPA. Furthermore, with the majority of local areas undertaking activities within priorities one and two (understanding the cohort and determining local priorities), this suggests areas should be well placed to develop and build on these strong foundations for achieving RPA.

Economy

The data shows that, in addition to the grant expenditure, LAs recorded a median spend of £66,000 of their own funding. Furthermore, local areas accessed support from partners including schools, colleges, Connexions, local steering groups and other organisations. The estimated median value of partnership costs for activities was £16,156. Given that local areas’ median expenditure was £140,375, this suggests that partnership working was perhaps not being as well utilised as it could be in order to get the best VfM for RPA activities. However, many of the areas found this aspect of the survey difficult, and so it is likely that the lower estimates were often due to areas not being able to estimate the value of partnership working that was in fact taking place.

Efficiency and effectiveness

The extent to which young people were directly affected by the activities was difficult to measure. Local areas’ activities varied considerably: some areas reported individual activities with a direct reach to thousands of young people (for example, entire Year 9 and Year 11 cohorts receiving IAG); other activities reached only small numbers of young people (for example, five young people receiving ‘roll on, roll off’ provision); and others had no direct impact at all. For example, one area undertook a review of the curriculum offer for 16- to 19-year-olds, which in itself will not have had an impact on any young people; however, the review should lead to changes in provision and better communication of what is on offer, which in turn will have an impact on young people. Examples of how activities directly had an impact on young people are given throughout the separate case study report.

Sustainability

Local areas have provided or sourced additional funds to support RPA activities over and above the grant funding supplied by DfE. This suggests both that local areas have planned their activities to meet their local needs so that they can increase participation and that they have a longer-term vision. Furthermore, the greatest proportion of activities were in priorities one and two (understanding the cohort and identifying local needs). In general, these types of activities have greater one-off set-up costs with comparatively lower ongoing costs. In addition, their potential impact continues to be beneficial in the longer term. Similarly, where local areas have invested in building relationships with partners, these are likely to be beneficial in the future.


4. Conclusions and recommendations

49. The conclusions and recommendations presented below are based on the survey data only. Separate conclusions and recommendations, based on the case study data, are presented in the separate case study report.

50. The experience of the LLDPs has confirmed many of the lessons learnt during the Phase 1 and 2 RPA trials. At the end of the current round of projects, most areas are more confident about their ability to deliver RPA activities than they were at the start, and most have set ambitious targets for participation in 2014/15. The survey data also shows that areas have prioritised some of the most vulnerable cohorts of young people as their focus for RPA activities, including learners with learning difficulties and/or disabilities, looked-after children or care leavers, and teenage parents.

51. As in previous phases of RPA trials, areas have focused on putting in place the basic building blocks needed to deliver RPA. Areas were undertaking the greatest number of activities in priorities one and two: understanding the cohort and determining local priorities. This was particularly true for areas that were new to the LLDPs.

52. Areas found it difficult to complete the costing data in the survey and to set clear, quantifiable success measures. Although the cost data had improved by the final survey, the evaluation still faced a number of challenges in collecting relevant and consistent data from all local areas on costs, inputs and outputs. As a result, firm conclusions about the VfM of different activities are difficult to draw, though the evaluation has clearly shown the median levels of spend in different activity areas.

53. Areas have found it difficult to measure the impact of their activity in the time period available. For some, this was because their activity related to underlying systems and processes needed for achieving RPA, and so would not have an immediate impact on young people. In other cases, it resulted from areas not thinking systematically about measuring impact at the start of the projects and not setting clear and quantifiable success measures. This was coupled with many local areas not setting up systems to collect feedback and evidence on the progress made by the young people targeted by a specific activity.

54. At the end of the RPA Phase 2 trials, the evaluation team developed an overarching planning framework for RPA and accompanying tools. The experience of the LLDPs has confirmed that this framework has proved useful in helping areas to think through their RPA plans. There have, though, been a number of new developments and refinements to the approaches taken in the LLDPs. The evaluation team have therefore produced an updated set of tools, showing new evidence and examples from the LLDPs, which will be published alongside this report. The new tools can be found at the link below: www.education.gov.uk/childrenandyoungpeople/youngpeople/participation/rpa/a0075564/rpa-past-projects


Recommendations

56. Based on the survey and case study evidence collected through this latest evaluation of RPA LLDPs, the research team makes the following recommendations for the DfE and local areas to consider:

• So that local areas can demonstrate the impact of their work more clearly, the DfE should consider providing even greater support at the start of projects to help areas set measures of success and establish systems for collecting information about impact. Local areas also need to consider how they can set better measures of success within the timescales available to them.

• Areas that are in the early stages of preparing for RPA will want to ensure they have put in place the basic building blocks needed to deliver RPA. Evidence from the evaluation of the LLDPs suggests this means focusing in particular on activities in priorities one and two (understanding the cohort and determining local priorities), as these provide the foundations for further action to deliver RPA.

• Local areas should consider using the RPA tools to identify any gaps in their implementation and see how they can learn from others. The tools can be found at: www.education.gov.uk/childrenandyoungpeople/youngpeople/participation/rpa/a0075564/rpa-past-projects.


Ref: DFE-RR236b

ISBN: 978-1-78105-142-9

© ISOS Partnership & National Foundation for Educational Research

October 2012