The figures below are the results of Ofsted inspection activity in-year. It’s important to note these grades are for apprenticeship provision as a whole; they are not specific to any one type of provider, a point that is surprisingly often misunderstood. As Dr Chris Jones (Ofsted) rightly reiterated when I shared these figures at an event last week, they only show provision inspected during that 12-month period, so we need to bear in mind that these grades represent inspections carried out on the basis of risk. That same risk profile also applies to all other education provision on Ofsted’s watch. In my next blog I’ll share a comparison with you, as it shows apprenticeships in stark contrast to other inspections reported each year and, I think, raises some interesting questions.
- 2017/18: Ofsted graded 58% of apprenticeship provision inspected at least good
- 2016/17: 49% at least good
- 2015/16: 63% at least good
- 2014/15: 51% at least good
It seems difficult to argue that this demonstrates a trend of improvement in the provision falling into scope during each period. What is more useful to point out is that the reasons nearly half of the apprenticeship provision inspected was not yet good have remained fairly consistent over that time.
I believe there is a high risk that, should we read the tea leaves and look ahead to the Ofsted annual report for 2020/21, the percentage won’t have improved any further.
I’d like to explain my rationale for considering this to be a risk:
- Challenge of moving from apprenticeship frameworks to standards
- End-point assessment familiarity
- The champion and thief of quality apprenticeships: employers
- A perfect inspection storm
The challenge of moving from apprenticeship frameworks to standards has highlighted the need to improve the capability of some curriculum leads and/or trainers to design programmes of learning. Over the last five years, I’ve had the privilege of working with hundreds of managers and trainer-assessors, and this has been a consistent area of support. The reason is this: an apprenticeship framework can be very formulaic. It can be seen as a step-by-step path to move someone along a track of content. It’s the shorthand language of units and bullet points, where the sum of the parts can be lost when focusing on the detail underneath. This was in fact one of the issues raised in the Richard Review, which triggered the reforms.
In contrast, an apprenticeship standard – brought in as a cornerstone of the reforms – does not seek to specify the steps; it focuses on the outcome: the qualified practitioner in their chosen occupation. It asks the deliverer and the employer to determine the best path of learning for apprentices, using their shared knowledge of the occupation and pedagogy. This shift in emphasis has created an understandable struggle for some.
As the scaffolding of a well-understood, step-by-step framework is removed, trainers have looked to others to replace it; for example, to the awarding bodies and the end-point assessment organisations. We often hear ‘how can I know what to teach if I don’t know how it will be tested?’ There’s some truth to this but, as you’ll see later on, it is also a concern. We have to ask why our training professionals look for guidance from elsewhere. Let me be absolutely clear: this is not a judgement of these individuals; it’s a recognition that much of apprenticeship provision (particularly at Levels 2 and 3) has been built largely on assessment of NVQs. We’ve still much more to do to address this challenge.
I was pleased to see ETF seek to gather opinion on the Professional Standards for Teaching and Training. I very much hope those delivering apprenticeships will actively engage with this.
The introduction of end-point assessment is one of the biggest changes we’re experiencing as a result of the reforms. If I were to take a guess, when data is published on EPA outcomes we’ll see a lower level of first-attempt achievement than we may have expected. Lower, because it is still very new. Lower because our trainers are still getting used to the differences between an on-programme portfolio and an end-point showcase. Lower because some of the assessment plans are more challenging than is really necessary. Lower because we’re still getting to grips with what adequate preparation ought to be. I could go on. Whilst this is a concern and we need to act to improve it, I see these as teething problems.
I think there is a much greater risk and that is, as our collective knowledge of EPA grows, our thinking narrows. It narrows to the point where our curriculum becomes no more than a servant of the test. And should we think this can’t happen, then you only need to look as far as the concerns raised over SATs and GCSE preparation. We must safeguard against a reductionist approach as best we can, by learning from elsewhere in the education system. This is an opportunity for policy makers, target setters and inspection regimes to move beyond data as far as is realistic.
The champion and thief of quality apprenticeships is the employer. Bear with me! The amount of time apprentices spend with their employer is far greater than the time they spend with the training provider. I don’t even like using the term ‘employer’. It implies training providers are dealing with a single person. They aren’t. The decision maker who signed the contract is unlikely to be the day-to-day mentor, unless it is a very small business. The mentor/line manager is key to success. They always have been. Any training provider will be able to tell you the difference it makes to an apprenticeship when an employer is actively involved. I’ve noticed a much clearer recognition of this as the reforms have progressed: a little more caution over recruiting an apprentice where the employer may not deliver an appropriate level of off-the-job learning, doesn’t commit to reviewing progress, or doesn’t provide an appropriate level of support. This recognition is good news for quality but perhaps not so good for hitting numbers targets for apprenticeships. Whilst my HE colleagues will struggle to recognise the point made earlier about programme design, I know many who are already coming up against the challenge of engaging line managers in the programme of learning. The business community has much to learn about how to support apprentices successfully from the employers who really are the champions.
We have a perfect storm brewing. I like the proposed Education Inspection Framework very much. I understand there are those who are concerned about the challenges of implementing it successfully but, overall, it strikes me that the direction of travel is good. When it launches in readiness for September, the focus of inspection will shift towards the extent to which education providers successfully design and deliver a well-thought-out curriculum, and the rationale for having it. It moves away from an overemphasis on outcomes. If you look at this in the context of the risks I have outlined, I think we might have a problem which we need to address rapidly. Not, I hasten to add, to deliver what Ofsted wants; rather, what Ofsted will look at are precisely the areas for development in apprenticeship provision which will come into sharper focus.
Apprenticeships present us with an opportunity to revolutionise an education system that has for too long relied on a single track through A-levels to university. Yet they can only do so if we are, at the very least, in line with other parts of the education system in the quality offered. There is much for us to be positive about in the provision that is good.
I very much look forward to the day an annual report tells us that 80% of in-year inspected apprenticeship provision is at least good. Then we’re really motoring.
I look forward to your comments.
Apprenticeship providers who haven’t as yet been inspected by Ofsted can subscribe to insightQ’s self assessment module for free by following this link