Analysis of Ofsted inspections of apprenticeship provision indicates we still have some way to go to ensure we are getting the basics right – regardless of a new inspection framework, says Louise Doyle, director of Mesma.
An examination of the latest Ofsted full inspection reports (April – August 2019) reveals several interesting themes. In summary, eight out of ten colleges delivering apprenticeships received a grade 3 or below. The same applies to 17 of the 26 independent training providers or employer providers.
Under the reports’ leadership section, an ambitious vision is a common factor among providers achieving good inspections – supported by leaders prepared to take decisive steps to facilitate change where needed. So far, so good.
However, where leadership is struggling, some common threads emerge: weak governance and a lack of external scrutiny have a recurring impact. Where grades are lower, we see leaders who are slow to bring about improvement, alongside weak quality assurance, including inaccurate self-assessment, poor improvement planning and ineffective use of data. Poor subcontractor management is clearly evident in those providers judged to be inadequate.
Poor-quality progress reviews and a lack of engagement between the employer, the trainer and the trainee feature in the majority of colleges receiving grades three and four for apprenticeship delivery.
It would be remiss of me not to mention the red flag of the moment: not using information gathered at the learner’s start point to inform the programme in knowledge, skills and behaviours, and in maths and English – a failing that is mentioned often. What I will continue to warn against is the risk of assuming that a basic check, undertaken before an apprentice joins the programme to satisfy funding requirements, will suffice as ‘initial assessment’.
It reminds me of the days when the completion of learning styles questionnaires was routine yet served no meaningful purpose for students. I recall some heated debates with a previous senior leader I reported to about why we shouldn’t be doing them because they were pointless. I’m glad the research now backs this up. I hope he’s seen it.
When it comes to quality of education, where things are going well, learners receive good teaching, learning and assessment, along with support to improve. Good assessment practice, targets and feedback are features of the higher-grade reports.
However, issues around consistency still prevail. Weaknesses in assessment practice, target setting and feedback feature year after year as issues we need to address to improve the quality of provision. The impact of poorly delivered English and maths features frequently where grades are lower, which won’t be a surprise to any of us.
Bucking the trend
Turning to HE institutions, our universities appear to be bucking the trend that ITPs and FE colleges are experiencing in terms of grade profile. All but one of the nine HEIs inspected were graded as ‘good’. It is to be applauded that many university senior leaders have been able to clearly articulate the importance of apprenticeships to widening participation and to strategic direction more generally.
However, it’s not all sweetness and light, because we are seeing some elements of HE senior leadership also failing to have sufficient oversight of quality management.
I don’t think I’m being unfair in stating that some of this success is due to the high volumes of programmes being delivered in the health service, where supervision of new staff is part of the employer’s fabric. This isn’t a criticism; it’s evidence of the important role employers play in driving quality apprenticeship delivery.
Looking at the Ofsted reports also reminds us why providers should quality assure their quality assurance systems, to ensure they are doing what they are needed to do. Sometimes, it seems there is so much quality assurance activity going on that we pat ourselves on the back simply for doing it, rather than reflecting on whether a particular process has an impact. If it does, how do you use the data to drive improvement? If it doesn’t, why do it? Our clients are often surprised at our quest to strip back their QA rather than pad it out.
So, regardless of the changes being brought in by the new Ofsted framework, it’s clear that there are still some fundamentals that can be addressed to drive improvement across the spectrum. As our colleagues at Ofsted have themselves said, this is not about dancing to the tune of a new framework. Yes, let’s understand the new process of inspection, but it doesn’t really change what a good apprenticeship looks like, does it?