In case you have been living under a rock (and I don’t blame you if you have; it’s a grim old world out there), the government have implemented a new headline accountability measure called Progress 8, which ranks schools according to value-added instead of purely by attainment. The first national league tables to use this measure were released last year, but I think it’s fair to say that not many people engaged with it very much given all the questions hanging over the new 9-1 grade system. This is the first year in which the new grades have been used and now the first ‘proper’ results are in. I’ve had a quick browse of the Department for Education’s key figures so that you don’t have to. Here are the central themes:
- The rhetoric and the reality of the EBacc continue to be at odds
Lord save us! Have you ever met someone serious in education who thinks narrowing the curriculum is a good idea? Well, even if you have, the numbers still don’t stack up for the government. The EBacc continues to struggle, with only 21% of pupils achieving it, alongside a drop in the number entering for it. What is perhaps most striking is that entry rates among students with low prior attainment have in fact increased while the percentage achieving it has almost halved, whereas entry rates among students with higher prior attainment have decreased along with a drop in achievement. If this is about offering rigour, then you have to ask why the more capable students are rejecting it in favour of a different balance of subjects (presumably creative ones). If this policy is about fairness, then you have to ask what the value is of forcing lower-attaining pupils to take subjects they may not enjoy, only for it to end in failure. All this represents a huge step backwards for this qualification and raises the question: how does the government hope to increase entry rates to 90% without a dramatic hit to national attainment figures? It would also be interesting to know how free schools and converter academies would fare in this brave new world. Continuing to push this policy therefore looks like a massive political risk for any government that wants to use results to endorse its education offer, all while much of the institutional support for the policy wanes, as I’ve previously written about.
- Don’t listen to any nonsense about particular types of schools doing better than others.
People who have an interest in particular types of school organisation are likely to point to outliers to claim great success for their favoured brand of school. Do not fall for this trick.
The academies that have sprung up since 2010 have overwhelmingly been schools with a record of excellent prior attainment; they make up 70% of the current academy population. The relative increase in attainment between maintained schools and academies is minimal (0.1%), suggesting that a school’s starting point is the more potent factor. The success of academies is therefore unlikely to be explained by the government’s academisation policy itself.
The Department for Education have also been very clear that, given how small a part of the wider system free schools are, very little can be concluded from these results about the policy; as a group they are simply too vulnerable to year-on-year volatility in a zero-sum Progress 8 data set.
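That “zero-sum” point is worth unpacking: each pupil’s Progress 8 score is their Attainment 8 score minus the average Attainment 8 of pupils nationally with the same KS2 prior attainment, divided by ten, so the national average is zero by construction and one school’s gain is necessarily another’s loss. Here is a minimal sketch of that calculation in Python. All the pupil data is made up and the prior-attainment grouping is heavily simplified compared with the DfE’s actual methodology, so treat this as an illustration of the shape of the measure, not the official formula:

```python
# Simplified sketch of the Progress 8 calculation.
# Hypothetical data; the DfE's real method uses fine-grained KS2
# prior-attainment groups and weighted Attainment 8 "buckets".
from statistics import mean

# (KS2 prior-attainment group, Attainment 8 score) for each pupil
pupils = [
    ("low", 32.0), ("low", 41.0), ("mid", 48.0),
    ("mid", 55.0), ("high", 60.0), ("high", 71.0),
]

# "Expected" Attainment 8 per group: here, the cohort average
# (nationally this is estimated from the whole national cohort).
group_avg = {
    g: mean(a8 for grp, a8 in pupils if grp == g)
    for g in {grp for grp, _ in pupils}
}

# Per-pupil Progress 8: actual minus expected, divided by 10
# (Attainment 8 sums grades across ten qualification slots).
p8 = [(a8 - group_avg[grp]) / 10 for grp, a8 in pupils]

# Because "expected" is the cohort average, the cohort-wide mean
# Progress 8 is zero by construction: a zero-sum measure.
print(round(mean(p8), 6))  # 0.0
```

This is why small groups of schools (like free schools) bounce around from year to year: they are being measured against the whole national distribution, and a handful of pupils can swing the average.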
The hypothesis that selective schools would do worse under the new value-added measure has proved false: grammars and the like clocked in at 0.46 vs -0.01 for non-selective schools. What is interesting for those campaigning against grammar schools is that non-selective schools in areas with a lot of grammars have a considerably lower average score of -0.14, providing potential justification for the claim that grammar schools have a negative impact on the schools around them. It will be important to engage with this data if the policy rears its ugly head again: grammars are not the answer to system-wide improvement.
- Regional variation
I’ve blogged previously about the growing importance of this topic in the education world. The notable trend continues: London has done considerably better than coastal areas. In a sense this is a relief: inner-city London success has been resilient to curriculum change that many worried would mean a step backwards. Hackney came in with an impressive 0.33 progress score and Tower Hamlets clocked 0.21. (Shout out to everyone at the school I have the pleasure to work at for coming in the top 100 schools in the country– woop woop!) Richmond, by contrast, came in at 0.07… Interesting for all those recruitment people who complain about the ‘impossible task’ of convincing people to give up the inner London weighting: outer London performed better than inner London, though there is considerable variation within that. I’m not saying recruitment is easy, but it does suggest that the inner-London vs outer-London crisis is more theoretical than real. Also, the South East had a negative Progress 8 score, so when industry folk talk about the success of “London and the South East” you have to question the inclusion of the latter category…
I’ll be using this data all year so expect more detail in future posts!